Streetwise Professor

September 18, 2018

He Blowed Up Real Good. And Inflicted Some Collateral Damage to Boot

I’m on my way back from my annual teaching sojourn in Geneva, plus a day in the Netherlands for a speaking engagement.  While I was taking that European not-quite-vacation, a Norwegian power trader, Einar Aas, suffered a massive loss in cleared spread trades between Nordic and German electricity.  The loss was so large that it blew through Aas’ initial margin and default fund contribution to the clearinghouse (Nasdaq), consumed Nasdaq’s €7 million capital contribution to the default fund, and €107 million of the rest of the default fund–a mere 66 percent of the fund.  The members have been ordered to contribute €100 million to top up the fund.

This was bound to happen. In a way, it was good that it happened in a relatively small market.  But it provides a sobering demonstration of what I’ve said for years: clearing doesn’t eliminate losses, but affects the distribution of losses.  Further, financial institutions that back CCPs–the members–are the ultimate backstops.  Thus, clearing does not eliminate contagion or interconnections in the financial network: it just changes the topology of the network, and the channels by which losses can hit the balance sheets of big players.
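
To make the waterfall mechanics concrete, here is a minimal sketch of the standard loser-pays ordering using the figures reported above. Aas’ initial margin and default fund contribution have not been disclosed, so those two entries (and the implied total loss) are placeholder assumptions, not Nasdaq’s actual numbers:

```python
# Minimal sketch of a CCP default waterfall in the standard loser-pays
# ordering. Figures are in millions of euros. Aas' initial margin and
# default fund contribution are undisclosed, so those two entries (and
# the implied total loss) are placeholder assumptions, not Nasdaq's numbers.

def allocate_loss(total_loss, tranches):
    """Run a loss through ordered (name, size) tranches; return the hits."""
    hits, remaining = [], total_loss
    for name, size in tranches:
        hit = min(remaining, size)
        hits.append((name, hit))
        remaining -= hit
    return hits, remaining

waterfall = [
    ("Aas initial margin (assumed)",            70.0),
    ("Aas default fund contribution (assumed)", 10.0),
    ("Nasdaq skin in the game",                  7.0),
    ("Mutualized default fund",                107.0),
]

hits, uncovered = allocate_loss(194.0, waterfall)  # total loss assumed
for name, hit in hits:
    print(f"{name:42s} absorbs €{hit:6.1f}m")
print(f"Loss left for member assessments: €{uncovered:.1f}m")
```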

Happening in the Nordic/European power markets, this is an interesting curiosity.  If it happens in the interest rate or equity markets, it could be a disaster.

We actually know very little about what happened, beyond the broad details.  We know Aas was long Nordic power and short German power, and that the spread widened due to wet weather in Norway (which depresses the price of hydro and reduces demand) and an increase in European prices due to increases in CO2 prices.  But Nasdaq trades daily, weekly, monthly, quarterly, and annual power products: we don’t know which blew up Aas.  Daily spreads are more volatile, and exhibit more extremes (kurtosis), but since margins are scaled to risk (at least theoretically–more on this below) what matters is the market move relative to the estimated risk.  Reports indicate that the spread moved 17x the typical move, but we don’t know what measure of “typical” is used here.  Standard deviation?  Not a very good measure when there is a lot of kurtosis (or skewness).
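
To see why a sigma-based “Nx the typical move” headline is uninformative, here is a quick sketch with simulated Student-t changes standing in for the fat-tailed spread (the distribution and its parameters are assumptions, not estimates of the actual Nordic/German spread):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
# Student-t daily changes standing in for a fat-tailed power spread;
# df=3 gives pronounced kurtosis. Purely illustrative parameters.
moves = rng.standard_t(df=3, size=250_000)

sigma = moves.std()
p_gauss = 2 * norm.sf(6.0)                  # P(|X| > 6 sigma) if Gaussian
p_emp = (np.abs(moves) > 6 * sigma).mean()  # same event, fat-tailed sample

print(f"'6-sigma' move: Gaussian p = {p_gauss:.1e}, empirical p = {p_emp:.1e}")
# Fat tails produce "many-sigma" moves orders of magnitude more often than
# a Gaussian calibrated to the same standard deviation, so "17x the typical
# move" means little without knowing the tail shape behind "typical."
```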

I also haven’t seen how big Aas’ initial margins were.  The total loss he suffered was bigger than the hit taken by the default fund, because under the loser-pays model, the initial margins would have been in the first loss position.

The big question in my mind relates to Nasdaq’s margin model.  Power price distributions deviate substantially from the Gaussian, and estimating those distributions is challenging in part because they are also conditional on day of the year and hour of the day, and on fundamental supply-demand conditions: one model doesn’t fit every day, every hour, every season, or every weather environment.  Moreover, a spread trade has correlation risk–dependence risk would be a better term, given that correlation is a linear measure of dependence and dependencies in power prices are not linear.  How did Nasdaq model this dependence and how did that impact margins?
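
As an illustration of why this matters, here is a stylized sketch (not a model of actual Nordic or German prices) in which two legs look highly correlated on ordinary days but decouple in a rare spike regime, blowing out the spread’s tail far beyond what a linear correlation measure suggests:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Two legs driven by a common factor on ordinary days, plus rare one-sided
# spikes in one leg only. A stylized illustration of nonlinear dependence,
# not a model of actual Nordic or German prices.
common = rng.normal(size=n)
leg_a = common + 0.3 * rng.normal(size=n)        # "Nordic" stand-in
leg_b = common + 0.3 * rng.normal(size=n)        # "German" stand-in
spike = rng.random(n) < 0.01                     # rare decoupling days
leg_b[spike] += rng.exponential(5.0, size=spike.sum())

spread = leg_a - leg_b
corr = np.corrcoef(leg_a, leg_b)[0, 1]
gauss_q = spread.mean() - 3.09 * spread.std()    # Gaussian 0.1% quantile
emp_q = np.quantile(spread, 0.001)               # actual 0.1% quantile

print(f"Pearson correlation: {corr:.2f}")
print(f"0.1% spread quantile: Gaussian-implied {gauss_q:.1f}, actual {emp_q:.1f}")
# The legs look tightly linked by the linear measure, but the rare
# decoupling regime drives spread tails far beyond what a margin model
# keyed to correlation and variance would anticipate.
```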

One possibility is that Nasdaq’s risk/margin model was good, but this was just one of those things.  Margins are set on the basis of the tails, and tail events occur with some probability.

Given the nature of the tails in power prices (and spreads), reliance on a VaR-type model would be especially dangerous here.  Setting margin based on something like expected shortfall would likely be superior.  Which model does Nasdaq use?
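
For concreteness, a minimal sketch of the two risk measures on simulated heavy-tailed losses (illustrative numbers only; Nasdaq’s actual margin model is not public):

```python
import numpy as np

rng = np.random.default_rng(2)
# Heavy-tailed simulated one-day losses standing in for a power spread
# position. Illustrative only; Nasdaq's actual margin model is not public.
losses = np.abs(rng.standard_t(df=3, size=500_000))

var99 = np.quantile(losses, 0.99)       # 99% VaR: a single quantile
es99 = losses[losses >= var99].mean()   # 99% ES: mean loss beyond the quantile

print(f"99% VaR = {var99:.2f}, 99% ES = {es99:.2f}, ES/VaR = {es99/var99:.2f}")
# VaR is silent about how bad losses are once the quantile is breached;
# with fat tails, ES sits well above VaR. Margin set off VaR alone can be
# badly inadequate in exactly the states that destroy default funds.
```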

I can also see the possibility that Nasdaq’s margin model was faulty, and that Aas had figured this out.  He then put on trades that he knew were undermargined because Nasdaq’s model was defective, which allowed him to take on more risk than Nasdaq intended.

In my early work on clearing I indicated that this adverse selection problem was a concern in clearing, and would lead CCPs–and those who believe that CCPs make the financial system safer–to underestimate risk and be falsely complacent.  Indeed, I argued that one reason clearing could be a bad idea is that it was more vulnerable to adverse selection problems because the need to model the distribution of gains/losses on cleared positions requires detailed knowledge, especially for more exotic products.  Traders who specialize in these products are likely to have MUCH better understanding about risks than a non-specialist CCP.

Aas cleared for himself, and this has caused some to get the vapors and conclude that Nasdaq was negligent in allowing him to do so.  Self-clearing is just an FCM with a house account, but with no client business: in some respects that’s less risky than a traditional FCM with client business as well as its own trading book.

Nasdaq required Aas to have €70 million in capital to self-clear.  Presumably Nasdaq will get some of that capital in an insolvency proceeding, and use it to repay default fund members–meaning that the €114 million loss is likely an overestimate of the ultimate cost borne by Nasdaq and the clearing members.

Further, that’s probably similar to the amount of capital that an FCM would have had to hold to carry a client position as big as Aas’.   That’s not inherently more risky (to the clearinghouse and its default fund) than if Aas had cleared through another firm (or firms).  Again, the issue is whether Nasdaq is assessing risks accurately so as to allow it to set clearing member capital appropriately.

But the point is that Aas had to have skin in the game to self-clear, just as an FCM would have had to clear for him.

Holding Aas’ positions constant, whether he cleared himself or through an FCM really only affected the distribution of losses, but not the magnitude.  If Aas had cleared through someone else, that someone else’s capital would have taken the hit, and the default fund would have been at risk only if that FCM had defaulted.  But the total loss suffered by FCMs would have been exactly the same, just distributed more unevenly.

Indeed, the more even distribution of losses produced by mutualization, which spread the default loss among multiple FCMs, might actually be preferable to having one FCM bear the brunt.

The real issue here is incentives.  My statement was that holding Aas’ positions constant, who he cleared through or whether he cleared at all affected only the distribution of losses.  Perhaps under different structures Aas might not have been able to take on this much risk.  But that’s an open question.

If he had cleared through another FCM, that FCM would have had an incentive to limit its positions because its capital was at risk.  But Aas’ capital was at risk–he had skin in the game too, and this was necessary for him to self-clear.  It’s by no means obvious that an FCM would have arrived at a different conclusion than Aas, and decided that his position represented a reasonable risk to its capital.

Here again a key issue is information asymmetry: would the FCM know more about the risk of Aas’ position, or less?  Given Aas’ allegedly obsessive behavior, and his long-time success as a trader, I’m pretty sure that Aas knew more about the risk than any FCM would have, and that requiring him to clear through another firm would not have necessarily constrained his position.  He would have also had an incentive to put his business at the dumbest FCM.

Another incentive issue is Nasdaq’s skin in the game–an issue that has exercised FCMs generally, not just on Nasdaq.  The exchange’s/CCP’s relatively thin contribution to the default fund arguably reduces its incentive to get its margin model right.  Evaluating whether Nasdaq’s relatively minor exposure to default risk led it to undermargin requires a thorough analysis of its margin model–a very complex exercise, and an impossible one given how little we know about the model.

But this all brings me back to themes I flogged to the collective shrug of many–indeed almost all–of the regulatory and legislative community back in the aftermath of the Crisis, when clearing was the silver bullet for future crises.   Clearing is all about the allocation and pricing of counterparty credit risk.  Evaluation of counterparty credit risk in a derivatives context requires a detailed understanding of the price risks of the cleared products, and dependencies between these price risks and the balance sheet risks of participants in cleared markets.  Classic information problems–adverse selection and moral hazard (too little skin in the game)–make risk sharing costly, and can lead to the mispricing of risk.

The forensics about Aas blowing up real good, and the lessons learned from that experience, should focus on those issues.  Alas, I see little recognition of that in the media coverage of the episode, and betting on form, I would wager that the same is true of regulators as well.

The Aas blow up should be a salutary lesson in how clearing really works, what it can do, and what it can’t.   Cynic that I am, I’m guessing that it won’t be.  And if I’m right, the next time could be far, far worse.

August 28, 2018

Shed a Tear for Central Bankers Facing Obsolescence? Uhm, No. Jump for Joy.

Filed under: Economics,Financial crisis,History,Regulation — cpirrong @ 7:00 pm

Scott Sumner rightly skewers this central bankers’/macroeconomists’ angst:

That’s according to a paper presented Saturday by Harvard Business School economist Alberto Cavallo at the Federal Reserve Bank of Kansas City’s annual symposium in Jackson Hole, Wyoming.

Cavallo’s main finding was that competition from Amazon has led to a greater frequency of price changes at more traditional retailers like Walmart Inc., and also to more uniformity in pricing of the same items across different locations. He found that the shift has led to a greater influence of movements in the U.S. dollar exchange rate and gas prices on retail prices.

. . . .

The Cavallo study also showed that from 2008 to 2017, as online purchases accounted for an ever-growing share of total retail sales, the average duration of prices of goods sold at large U.S. retailers like Walmart fell from about 6.5 months to about 3.7 months.

The implications have subtle significance for monetary policy because so-called “sticky prices” — the notion that sellers aren’t able to change prices right away in response to changes in supply and demand — is precisely what gives interest rates power in mainstream models to have any effect on the economy at all. In those models, if prices adjust instantaneously in response to shocks, then there is no role for central bankers to guide supply and demand back into equilibrium.

“For monetary models and empirical work, my results suggest that the focus needs to move beyond traditional nominal rigidities,” Cavallo wrote. “Labor costs, limited information, and even ’decision costs’ (related to inattention and the limited capacity to process data) will tend to disappear as more retailers use algorithms to make pricing decisions.”

Come on.  The right response to Cavallo’s finding is NOT: “OH NOES! Monetary policy will be less effective when prices aren’t as sticky!”  The right response is: “Thank God we won’t need to rely on monetary policy–which can go horribly wrong because central bankers are humans operating with limited information and flawed theoretical understanding–to counteract shocks!”

Sticky prices create a potentially–and I emphasize potentially–beneficial role for monetary policy.  When prices are sticky, monetary shocks–including shocks to the demand for money–can have real effects.  Monetary authorities can in theory–and again I emphasize in theory–counteract these shocks and keep output closer to the optimum level.

However, the actual results often fall far short of the theoretical potential, because (as Sumner argues happened in 2007-2008, and Friedman and Schwartz argued happened regularly in US monetary history from 1867-1960) monetary authorities may misdiagnose economic conditions, and adopt a suboptimal policy, especially when they operate based on flawed heuristics, such as using the level of interest rates as a measure of whether monetary policy is tight or loose.

Thus, having more flexible pricing that allows nominal prices to adjust to shocks to the demand and supply of money makes us less reliant on central banking wizards–a very good thing, when they are often quite like the Wizard of Oz.

As Scott notes, more flexible/less-sticky prices do not eliminate the impact of monetary policy altogether, though for the most part that role should be less interventionist and more rule-based.

One nominal rigidity that more flexible goods prices won’t eliminate is that most debt will be denominated in nominal terms, and thus its real value will change with the prices of goods and services.  More flexible goods prices may actually exacerbate the economic impact of nominal debt on real activity.  Although it is possible to imagine financial innovations that lead to more effective indexing of debt, whether the innovation is adopted widely remains to be seen, and there is room for doubt given the coordination issues involved.  Moreover, there will still be a stock of existing nominal debt to work off even if new debt is indexed in more clever ways.

But even if nominal rigidities disappear, monetary shocks can still cause real fluctuations.  Remember that the Lucasian Rational Expectations models and their successors do not include rigid prices, yet they exhibit real responses to nominal shocks.  Indeed, that was the entire reason why Lucas and his contemporaries devised these models in the first place, as a way of resolving the Friedman conundrum: “money is a veil, but when the veil flutters, the economy stutters.”

In such a world, however, the role of central banks is much more limited.  In such a world, rule-based, rather than discretionary, policies that reduce the frequency and intensity of nominal shocks, are warranted.  That doesn’t leave much for central bankers to do.  They can’t be masters of the universe!

Central bankers no doubt look with dread on such a world.  That dread is implicit in the fretting over more flexible pricing reducing and perhaps eliminating the role of activist central bankers.  But their dread should be our joy.

May 8, 2018

Libor Was a Crappy Wrench. Here–Use This Beautiful New Hammer Instead!

Filed under: Derivatives,Economics,Exchanges,Financial crisis,Regulation — The Professor @ 8:02 pm

When discussing the 1864 election, Lincoln mused that it was unwise to swap horses in midstream.  (Lincoln used a variant of this phrase many times during the campaign.) The New York Fed and the Board of Governors are proposing to do that nonetheless when it comes to interest rates.  They want to transition from reliance on Libor to a new Secured Overnight Financing Rate (SOFR, because you can never have enough acronyms), despite the fact that there are trillions of dollars of notional in outstanding derivatives and more trillions in loans with payments tied to Libor.

There are at least two issues here.  The first is if Libor fades away, dies, or is murdered, what is to be done with the outstanding contracts that it is written into? Renegotiations of contracts (even if possible) would be intense, costly, and protracted, because any adjustment to contracts to replace Libor could result in the transfer of tens of billions of dollars among the parties to these contracts.  This is particularly likely because of the stark differences between Libor and SOFR.  How would you value the difference between a stream of cash flows based on a flawed mechanism intended to reflect term rates on unsecured borrowings and a stream of cash flows based on overnight secured borrowings?  Apples to oranges doesn’t come close to describing the difference.

Seriously: how would you determine the value so that you could adjust contracts?  A conventional answer is to hold some sort of auction (such as that used to determine CDS payoffs in a default), and then settle all outstanding contracts based on the clearing price in the auction (again like a CDS auction).  But I can’t see how that would work here.

Let’s say you have a contract entitling you to receive a set of payoffs tied to Libor.  You participate in an auction where you bid an amount that you would be willing to pay/receive to give up that set of payoffs for a set of SOFR payoffs.  What would you bid?  Well, in a conventional auction your bid would be based on the value of holding onto the item you would give up (here, the Libor payments).  But if Libor is going to go away, how would you determine that opportunity cost?

Not to mention that there is an immense variety of payoff formulae based on Libor, meaning that there would have to be an immense variety of (impractical) auctions.

So it will come down to bruising negotiations, which given the amounts at stake, would consume large amounts of real resources.

The second issue is whether the SOFR rate will perform the same function as well as Libor did.  Market participants always had the choice to use some other rate to determine floating rates in swaps–T-bill rates, O/N repo rates, what have you.  They settled on Libor pretty quickly because Libor hedged the risks that swap users faced better than the alternatives.  A creditworthy bank that borrowed unsecured for 1, 3, 6, or 12 month terms could hedge its funding costs pretty well by using a Libor-based swap: a swap based on some alternative (like an O/N secured rate) would have been a dirtier hedge.  Similarly, another way that banks hedged interest rate risk was to lend at rates tied to their funding cost–which varied closely with Libor.  Well, the borrowers (e.g., corporates) could swap those floating rate loans into fixed by using Libor-based swaps.

That is, Libor-based swaps and other derivatives came to dominate because they were better hedges for interest rate risks faced by banks and corporates than alternatives would have been.  There was an element of reflexivity here too: the availability of Libor-based hedging instruments made it desirable to enter into borrowing and lending transactions based on Libor, because you could hedge them. This positive feedback mechanism created the vexing situation faced today, where there are immense sums of contracts that embed Libor in one way or another.

SOFR will not have this desirable feature–unless the Fed wants to drive banks to do all their funding secured overnight! That is, there will be a mismatch between the new rate that is intended to replace Libor as a benchmark in derivatives and loan transactions, and the risks that market participants want to hedge.
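
The mechanics of the mismatch are simple to sketch. Below, a made-up overnight rate path is compounded in arrears (the SOFR-style convention, with simplified day counts) and compared with a hypothetical term rate fixed in advance; all numbers are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
days = 63                                # roughly one quarter of business days

# Made-up overnight secured rate path; levels and dynamics are assumptions.
overnight = 0.02 + 0.001 * np.cumsum(rng.standard_normal(days))

# SOFR-style coupon: compound realized overnight rates in arrears
# (simplified ACT/360, ignoring weekends and holidays).
growth = np.prod(1.0 + overnight / 360.0)
sofr_coupon = (growth - 1.0) * 360.0 / days

# Libor-style coupon: a hypothetical 3-month term unsecured rate,
# fixed at the start of the period.
libor_coupon = 0.023

print(f"Compounded O/N coupon (known only at period end): {sofr_coupon:.4%}")
print(f"Term coupon (fixed in advance):                   {libor_coupon:.4%}")
# A bank funding at term unsecured rates hedges cleanly with the second
# number; a swap paying the first leaves a secured-vs-unsecured credit
# basis and a timing mismatch (in arrears vs. in advance).
```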

In essence, the Fed identified the problem with Libor–its vulnerability to manipulation because it was not based on transactions–and says that it has fixed it by creating a benchmark based on a lot of transactions.  The problem is that the benchmark that is “better” in some respects (less vulnerable to a certain kind of manipulation) is worse in others (matching the risk that market participants want to hedge).  In a near obsessive quest to fix one flaw, the Fed totally overlooked the purpose of the thing that they were trying to fix, and have created something of dubious utility because it does a poorer job of achieving that purpose.  In focusing on the details of the construction of the benchmark, they’ve lost sight of the big picture: what the benchmark is supposed to be used for.

It’s like the Fed has said: “Libor was one crappy wrench, so we’ve gone out and created this beautiful hammer. Use that instead!”

Or, to reprise an old standby, the Fed is like the drunk looking for his car keys under the lamppost, not because he lost them there, but because the light is better.  There is more light (transactions) in the O/N secured market, but that’s not where the market’s hedging keys are.

This is an object lesson in how governments and other large bureaucracies go astray.  The details of a particular problem receive outsized attention, and all efforts are focused on fixing that problem without considering the larger context, and the potential unintended consequences of the “fix.” Government is especially vulnerable to this given the tendency to focus on scandal and controversy and the inevitable narrative simplification and decontextualization that scandal creates.

The current ‘bor administrator–ICE–is striving to keep it alive.  These efforts deserve support.  Secured overnight rate-based benchmarks are ill-suited to serve as the basis for interest rate derivatives that are used to hedge the transactions that Libor-based derivatives do.

October 17, 2017

Financial Regulators Are Finally Grasping the Titanic’s Captain’s Mistake. That’s Something, Anyways

Filed under: Clearing,Commodities,Derivatives,Economics,Financial crisis,Regulation — The Professor @ 7:11 pm

A couple of big clearing stories this week.

First, Gary Cohn, Director of the National Economic Council (and ex-Goldmanite–if there is such a thing as “ex”, sorta like the Cheka), proclaimed that CCPs pose a systemic risk, and the move to clearing post-crisis has been overdone: “Like every great modern invention, it has its limits, and I think we have expanded the limits of clearing probably farther beyond their useful existence.” Now, Cohn’s remarks are somewhat Trump-like in their clarity (or lack thereof), but they seem to focus on one type of liquidity issue: “we get less transparency, we get less liquid assets in the clearinghouse, it does start to resonate to me to be a new systemic problem in the system,” and “It’s the things we can’t liquidate that scare me.”

So one interpretation of Cohn’s statement is that he is worried that as CCPs expand, perforce they end up expanding what they accept as collateral. During a crisis in particular, these dodgier assets become very difficult to sell to cover the obligations of a defaulter, putting the CCP at risk of failure.

Another interpretation of “less liquid assets” and “things we can’t liquidate” is that these expressions refer to the instruments being cleared. A default that leaves a CCP with an unmatched book of illiquid derivatives in a stressed market will make it difficult for the CCP to restore a matched book, and puts it at greater risk of failure.

These are both serious issues, and I’m glad to see them being aired (finally!) at the upper echelons of policymakers. Of course, these do not exhaust the sources of systemic risk in CCPs. We are nearing the 30th anniversary of the 1987 Crash, which revealed to me in a very vivid, experiential way the havoc that frequent variation margining can wreak when prices move a lot. This is the most important liquidity risk inherent in central clearing–and in the mandatory variation margining of uncleared derivatives.

So although Cohn did not address all the systemic risk issues raised by mandatory clearing, it’s past time that somebody important raised the subject in a very public and dramatic way.

Commenter Highgamma asked me whether this was from my lips to Cohn’s ear. Well, since I’ve been sounding the alarm for over nine years (with my first post-crisis post on the subject appearing 3 days after Lehman), all I can say is that sound travels very slowly in DC–or common sense does, anyways.

The other big clearing story is that the CFTC gave all three major clearinghouses passing grades on their just-completed liquidity stress tests: “All of the clearing houses demonstrated the ability to generate sufficient liquidity to fulfill settlement obligations on time.” This relates to the first interpretation of Cohn’s remarks, namely, that in the event that a CCP had to liquidate defaulters’ (plural) collateral in order to pay out daily settlements to those with gains, it would be able to do so.

I admit to being something of a stress test skeptic, especially when it comes to liquidity. Liquidity is a non-linear thing. There are a lot of dependencies that are hard to model. In a stress test, you look at some extreme scenarios, but those scenarios represent a small number of draws from a radically uncertain set of possibilities (some of which you probably can’t even imagine). The things that actually happen are usually way different than what you game out. And given the non-linearities and dependencies, I am skeptical that you can be confident in how liquidity will play out in the scenarios you choose.

Further, as I noted above, this problem is only one of the liquidity concerns raised by clearing, and not necessarily the biggest one. But the fact that the CFTC is taking at least some liquidity issues seriously is a good thing.

The Gensler-era CFTC, and most of the US and European post-crisis financial regulators, imagined that the good ship CCP was unsinkable, and accordingly steered a reckless course, heedless of any warning. You know, sort of like the captain of the Titanic did–and that is a recipe for disaster. Fortunately, now there is a growing recognition in policy-making circles that there are indeed financial icebergs out there that could sink clearinghouses–and take much of the financial system down with them. That is definitely an advance. There is still a long way to go, and methinks that policymakers are still too sanguine about CCPs, and still too blasé about the risks that lurk beneath the surface. But it’s something.

October 12, 2017

Trump Treasury Channels SWP

SWP doesn’t work for the Trump Treasury Department, and is in fact neuralgic to the idea of working for any government agency. Yet the Treasury’s recent report on financial regulatory reform is very congenial to my thinking, on derivatives related issues anyways. (I haven’t delved into the other portions.)

A few of the greatest hits.

Position limits. The Report expresses skepticism about the existence of “excessive speculation.” Therefore, it recommends limiting the role of position limits to reducing manipulation during the delivery period. Along those lines, it recommends limits on the spot month only, because that is “where the risk of manipulation is greatest.” It also says that limits should be designed so as not to burden hedgers unduly. I made both of these points in my 2011 comment letter on position limits, and in the paper submitted in conjunction with ISDA’s comment letter in 2014. They are also reflected in the report on the deliberations of the Energy and Environmental Markets Advisory Committee that I penned (to accurately represent the consensus of the Committee) in 2016–much to Lizzie Warren’s chagrin.

The one problematic recommendation is that spot month position limits be based on “holistic” definitions of deliverable supply–e.g., the world gold market. This could have extremely mischievous effects in manipulation litigation: such expansive and economically illogical notions of deliverable supplies in CFTC decisions like Cox & Frey make it difficult to prosecute corners and squeezes.

CFTC-SEC Merger. I have ridiculed this idea for literally decades–starting when I was yet but a babe in arms 😉 It is a hardy perennial in DC, which I have called a solution in search of a problem. (I think I used the same language in regards to position limits–this is apparently a common thing in DC.) The Treasury thinks little of the idea either, and recommends against it.

SEFs. I called the SEF mandate “the worst of Frankendodd” immediately upon the passage of the law in July, 2010. The Treasury Report identifies many of the flaws I did, and recommends a much less restrictive requirement than GiGi imposed in the CFTC SEF rules. I also called the Made Available For Trade rule the dumbest part of the worst of Frankendodd, and Treasury recommends eliminating these flaws as well. Finally, four years ago I blogged about the insanity of the dueling footnotes, and Treasury recommends “clarifying or eliminating” footnote 88, which threatened to greatly expand the scope of the SEF mandate.

CCPs. Although it does not address the main concern I have about the clearing mandate, Treasury does note that many issues regarding systemic risks relating to CCPs remain unresolved. I’ve been on about this since before DFA was passed, warning that the supposed solution to systemic risk originating in derivatives markets created its own risks.

Uncleared swap margin. I’ve written that uncleared swap margin rules were too rigid and posed risks. I have specifically written about the 10-day margining period rule as being too crude and poorly calibrated to risk: Treasury agrees. Similarly, it argues for easing affiliate margin rules, reducing the rigidity of the timing of margin payments (which will ease liquidity burdens), and narrowing the overbroad application of the rule to entities that do not pose systemic risks.

De minimis threshold for swap dealers. I’m on the record as saying that using a notional amount to determine the de minimis threshold for who must register as a swap dealer makes no sense, given the wide variation in riskiness of different swaps of the same notional value. I am also on the record that the $8 billion threshold sweeps in firms that do not pose systemic risks, and that a reduced threshold of $3 billion would be even more ridiculously overinclusive. Treasury largely agrees.

The impact of capital rules on clearing. One concern I’ve raised is that various capital rules, in particular those that include initial margin amounts in determining liquidity ratios for banks, and hence their capital requirements, make no economic sense, and unnecessarily drive up the costs banks/FCMs incur to clear for clients. This is contrary to the purpose of clearing mandates, and moreover, has contributed to increased concentration among FCMs, which is in itself a systemic risk. Treasury recommends “the deduction of initial margin for centrally cleared derivatives from the SLR denominator.” Hear, hear.

I could go into more detail, but these are the biggies. All of these recommendations are very sensible, and with the one exception noted above, in the Title VII-related section I see no non-sensical recommendations. This is actually a very thoughtful piece of work that, if followed, will undo some of the most gratuitously burdensome parts of Frankendodd, and the Gensler CFTC’s embodiment (or attempted embodiment) of those parts in rules.

But, of course, on the Lizzie Warren left and in the chin pulling mainstream media, the report is viewed as a call to gut essential regulations. Gutting stupid is actually a good idea, and that’s what this report proposes. Alas, Lizzie et al are incapable of even conceiving that regulations could possibly be stupid.

Hamstrung by inane Russia investigations and a recalcitrant (and largely gutless and incompetent) Republican House and Senate, the Trump administration has accomplished basically zero on the legislative front. Its only real achievement so far is to start–and just to start–the rationalization and in some cases termination (with extreme prejudice) of Obama-era regulation. If implemented, the recommendations in the Treasury Report (at least insofar as Title VII of DFA is concerned) would represent a real achievement. (As would rollbacks or elimination of the Clean Power Plan, Net Neutrality, and other 2009-2016 inanity.)

But of course this will require painstaking efforts by regulatory agencies, and will have to be accomplished in the face of an unrelentingly hostile media and the lawfare efforts of the regulatory class. But at least the administration has laid out a cogent plan of action, and is getting people in place who are dedicated to putting that plan into action (e.g., Chris Giancarlo at CFTC). So let’s get on with it.

July 6, 2017

SWP Acid Flashback, CCP Edition

Filed under: Clearing,Derivatives,Economics,Financial crisis,Regulation — The Professor @ 6:09 pm

Sometimes reading current news about clearing specifically and post-crisis regulation generally triggers acid flashbacks to old blog posts. Like this one (from 2010!):

[Gensler’s] latest gurgling appears on the oped page of today’s WSJ.  It starts with a non-sequitur, and careens downhill from there.  Gensler tells a story about his role in the LTCM situation, and then claims that to prevent a recurrence, or a repeat of AIG, it is necessary to reduce the “cancerous interconnections” (Jeremiah Recycled Bad Metaphor Alert!) in the financial system by, you guessed it, mandatory clearing.

Look.  This is very basic.  Do I have to repeat it?  CLEARING DOES NOT ELIMINATE INTERCONNECTIONS AMONG FINANCIAL INSTITUTIONS.  At most, it reconfigures the topology of the network of interconnections.  Anyone who argues otherwise is not competent to weigh in on the subject, let alone to have regulatory responsibility over a vastly expanded clearing system.  At most you can argue that the interconnections in a cleared system are better in some ways than the interconnections in the current OTC structure.  But Gensler doesn’t do that.   He just makes unsupported assertion after unsupported assertion.

So what triggered this flashback? This recent FSB (no! not Putin!)/BIS/IOSCO report on . . . wait for it . . . interdependencies in clearing. As summarized by Reuters:

The Financial Stability Board, the Committee on Payments and Market Infrastructures, the International Organization of Securities Commissioners and the Basel Committee on Banking Supervision, also raised new concerns around the interdependency of CCPs, which have become crucial financial infrastructures as a result of post-crisis reforms that forced much of the US$483trn over-the-counter derivatives market into central clearing.

In a study of 26 CCPs across 15 jurisdictions, the committees found that many clearinghouses maintain relationships with the same financial entities.

Concentration is high with 88% of financial resources, including initial margin and default funds, sitting in just 10 CCPs. Of the 307 clearing members included in the analysis, the largest 20 accounted for 75% of financial resources provided to CCPs.

More than 80% of the CCPs surveyed were exposed to at least 10 global systemically important financial institutions, the study showed.

In an analysis of the contagion effect of clearing member defaults, the study found that more than half of surveyed CCPs would suffer a default of at least two clearing members as a result of two clearing member defaults at another CCP.

“This suggests a high degree of interconnectedness among the central clearing system’s largest and most significant clearing members,” the committees said in their analysis.

To reiterate: as I said in 2010 (and the blog post echoed remarks that I made at ISDA’s General Meeting in San Francisco shortly before I wrote the post), clearing just reconfigures the topology of the network. It does not eliminate “cancerous interconnections”. It merely re-jiggers the connections.

Look at some of the network charts in the FSB/BIS/IOSCO report. They are pretty much indistinguishable from the sccaaarrry charts of interdependencies in OTC derivatives that were bruited about to scare the chillin into supporting clearing and collateral mandates.

The concentration of clearing members is particularly concerning. The report does not mention it, but this concentration creates other major headaches, such as the difficulties of porting positions if a big clearing member (or two) defaults. And the difficulties this concentration would produce in trying to auction off or hedge the positions of the big clearing firms.

Further, the report understates the degree of interconnections, and in fact ignores some of the most dangerous ones. It looks only at direct connections, but the indirect connections are probably more . . . what’s the word I’m looking for? . . . cancerous–yeah, that’s it. CCPs are deeply embedded in the liquidity supply and credit network, which connects all major (and most minor) players in the market. Market shocks that cause big price changes in turn cause big variation margin calls that reverberate throughout the entire financial system. Given the tight coupling of the liquidity system generally, and the particularly tight coupling of the margining mechanism specifically, this form of interconnection–not considered in the report–is most laden with systemic ramifications. As I’ve said ad nauseam: the connections that are intended to prevent CCPs from failing are exactly the ones that pose the greatest threat to the entire system.

To flash back to another of my past writings: this recent report, when compared to what Gensler said in 2010 (and others, notably Timmy!, were singing from the same hymnal), shows that clearing and collateral mandates were a bill of goods. These mandates were sold on the basis of lies large and small. And the biggest lie–and I said so at the time–was that clearing would reduce the interconnectivity of the financial system. So the FSB/BIS/IOSCO have called bullshit on Gary Gensler. Unfortunately, seven years too late.

July 1, 2017

All Flaws Great and Small, Frankendodd Edition

On Wednesday I had the privilege to deliver the keynote at the FOW Trading Chicago event. My theme was the fundamental flaws in Frankendodd–you’re shocked, I’m sure.

What I attempted to do was to categorize the errors. I identified four basic types.

Unintended consequences contrary to the objectives of DFA. This could also be called “counter-intended consequences”–not just unintended, but the precise opposite of the stated intent. The biggest example is, well, related to bigness. If you wanted to summarize a primary objective of DFA, it would be “to reduce the too big to fail problem.” Well, the very nature of DFA means that in some ways it exacerbates TBTF. Most notably, the resulting regulatory burdens actually favor scale, because they impose largely fixed costs. I didn’t mention this in my talk, but a related effect is that increasing regulation leads to greater influence activities by the regulated, and for a variety of reasons this tends to favor the big over the medium and small.

Perhaps the most telling example of the perverse effects of DFA is that it has dramatically increased concentration among FCMs. This exacerbates a variety of sources of systemic risk, including concentration risk at CCPs; difficulties in managing defaulted positions and porting the positions of the customers of troubled FCMs; and greater interconnections across CCPs. Concentration also fundamentally undermines the ability of CCPs to mutualize default risk. It can also create wrong-way risks as the big FCMs are in some cases also sources of liquidity support to CCPs.

I could go on.

Creation of new risks due to misdiagnoses of old risks. The most telling example here is the clearing and collateral mandates, which were predicated on the view that too much credit was extended via OTC derivatives transactions. Collateral and netting were expected to reduce this credit risk.

This is a category error. For one thing, it embodies a fallacy of composition: reducing credit in one piece of an interconnected financial system that possesses numerous ways to create credit exposures does not necessarily reduce credit risk in the system as a whole. For another, even to the extent that reducing credit extended via derivatives transactions reduces overall credit exposures in the financial system, it does so by creating another risk–liquidity risk. This risk is in my view more pernicious for many reasons. One reason is that it is inherently wrong-way in nature: the mandates increase demands for liquidity precisely during those periods in which liquidity supply typically contracts. Another is that it increases the tightness of coupling in the financial system. Tight coupling increases the risk of catastrophic failure, and makes the system more vulnerable to a variety of different disruptions (e.g., operational risks such as the temporary failure of a part of the payments system).
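
A stylized sketch of this wrong-way feature: daily variation margin calls scale with the size of price moves, so cash demands spike precisely in high-volatility states (the notional and volatilities below are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)
notional = 1_000.0                       # arbitrary units

# Daily variation margin calls scale with realized price moves, so cash
# demands peak in high-volatility states. Volatilities are arbitrary.
for regime, vol in [("calm", 0.01), ("stressed", 0.05)]:
    moves = vol * rng.standard_normal(10_000)
    vm = notional * np.abs(moves)        # daily gross VM flow on the position
    print(f"{regime:9s}: mean VM call {vm.mean():6.2f}, "
          f"99th pct {np.quantile(vm, 0.99):6.2f}")
# Liquidity demand rises multiplicatively with volatility, while liquidity
# supply typically contracts in the same states: the tight coupling
# described above.
```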

As the Clearing Cassandra I warned about this early and often, to little avail–and indeed, often to derision and scorn. Belatedly regulators are coming to an understanding of the importance of this issue. Fed governor Jerome Powell recently emphasized this issue in a speech, and recommended CCPs engage in liquidity stress testing. In a scathing report, the CFTC Inspector General criticized the agency’s cost-benefit analysis of its margin rules for non-cleared swaps, based largely on its failure to consider liquidity effects. (The IG report generously cited my work several times.)

But these are at best palliatives. The fundamental problem is inherent in the super-sizing of clearing and margining, and that problem is here to stay.

Imposition of “solutions” to non-existent problems. The best examples of this are the SEF mandate and position limits. The mode of execution of OTC swaps was not a source of systemic risk, and was not problematic even for reasons unrelated to systemic risk. Mandating a change to the freely-chosen modes of transaction execution has imposed compliance costs, and has also resulted in a fragmented swaps market: those who can escape the mandate (e.g., European banks trading € swaps) have done so, leading to bifurcation of the market for € swaps, which (a) reduces competition (another counter-intended consequence), and (b) reduces liquidity (also counter-intended).

The non-existence of a problem that position limits could solve is best illustrated by the pathetically flimsy justification for the rule set out in the CFTC’s proposal: the main example the CFTC mentioned is the Hunt silver episode. As I said during my talk, this is ancient history: when do we get to the Trojan War? If anything, the Hunts are the exception that proves the rule. The CFTC also pointed to Amaranth, but (a) failed to show that Amaranth’s activities caused “unreasonable and unwarranted price fluctuations,” and (b) did not demonstrate (unlike the Hunt case) that Amaranth’s financial distress posed any threat to the broader market or any systemic risk.

It is sickly amusing that the CFTC touts that, based on historical data, the proposed limits would constrain few, if any, market participants. In other words, an entire industry must bear the burden of complying with a rule that the CFTC itself says would seldom be binding. Makes total sense, and surely passes a rigorous cost-benefit test! Constraining positions is unlikely to affect materially the likelihood of “unreasonable and unwarranted price fluctuations”. Regardless, positions are not likely to be constrained. Meaning that the probability that the regulation reduces such price fluctuations is close to zero, if not exactly equal to zero. Yet there would be an onerous and ongoing cost of compliance. Not to mention that when the regulation would in fact bind, it would potentially constrain efficient risk transfer.

The “comma and footnote” problem. Such a long and dense piece of legislation, and the long and detailed regulations that it has spawned, inevitably contain problems that can lead to protracted disputes, and/or unpleasant surprises. The comma I refer to is in the position limit language of the DFA itself: as noted in the court decision that stymied the original CFTC position limit rule, the placement of the comma affects whether the language in the statute requires the CFTC to impose limits, or merely gives it the discretionary authority to do so in the event that it makes an explicit finding that the limits are required to reduce unwarranted and unreasonable price fluctuations. The footnotes I am thinking of were in the SEF rule: footnote 88 dramatically increased the scope of the rule, while footnote 513 circumscribed it.

And new issues of this sort crop up regularly, almost 7 years after the passage of Dodd-Frank. Recently Risk highlighted the fact that in its proposal for capital requirements on swap dealers, the CFTC (inadvertently?) potentially made it far more costly for companies like BP and Shell to become swap dealers. Specifically, whereas the Fed defines a financial company as one in which more than 85 percent of its activities are financial in nature, the CFTC proposes that a company can take advantage of more favorable capital requirements if its financial activities are less than 15 percent of its overall activities. Meaning, for example, a company with 80 percent financial activity would not count as a financial company under Fed rules, but would under the proposed CFTC rule. This basically makes it impossible for predominately commodity companies like BP and Shell to take advantage of preferential capital treatment specifically included for them and their ilk in DFA. To the extent that these firms decide to incur costs (higher capital costs, or the cost of reorganizing their businesses to escape the rule’s bite) and become swap dealers nonetheless, that cost will not generate any benefit. To the extent that they decide that it is not worth the cost, the swaps market will be more concentrated and less competitive (more counter-intended effects).

The position limits proposed regs provide a further example of this devil-in-the-details problem. The idea of a hedging carveout is eminently sensible, but the specifics of the CFTC’s hedging exemptions were unduly restrictive.

I could probably add more categories to the list. Different taxonomies are possible. But I think the foregoing is a useful way of thinking about the fundamental flaws in Frankendodd.

I’ll close with something that could make you feel better–or worse! For all the flaws in Frankendodd, MiFID II and EMIR make it look like a model of legislative and regulatory wisdom. The Europeans have managed to make errors in all of these categories–only more of them, and more egregious ones. For instance, as bad as the US position limit proposal is, it pales in comparison to the position limit regulations that the Europeans are poised to inflict on their firms and their markets.

May 6, 2017

Son of Glass-Steagall: A Nostrum, Prescribed by Trump

Filed under: Economics,Financial crisis,History,Politics,Regulation — The Professor @ 7:30 pm

Apologies for the posting hiatus. I was cleaning out my mother’s house in preparation for her forthcoming move, a task that vies with the Labors of Hercules. I intended to post, but I was just too damn tired at the end of each day.

I’ll ease back into things by giving a heads up on my latest piece in The Hill, in which I argue that reviving Glass-Steagall’s separation of commercial and investment banking is a solution in search of a problem. One thing that I find telling is that the problem the original was intended to address in the 1930s was totally different from the one it is intended to address today. Further, the circumstances in the 1930s were wildly different from present conditions.

In the 1930s, the separation was intended to prevent banks from fobbing off bad commercial and sovereign loans to unwitting investors through securities underwriting. This problem in fact did not exist: extensive empirical evidence has shown that debt securities underwritten by universal banks (like J.P. Morgan) were of higher quality and performed better ex post than debt underwritten by standalone investment banks. Further, the most acute problem of the US banking system was not too big to fail, but too small to succeed. The banking crisis of the 1930s was directly attributable to the fragmented nature of the US banking system, and the proliferation of thousands of small, poorly diversified, thinly capitalized banks. The bigger national banks, and in particular the universal ones, were not the problem in 1932-33. Further, as Friedman-Schwartz showed long ago, a blundering Fed implemented policies that were fatal to such a rickety system.

In contrast, today’s issue is TBTF. But, as I note in The Hill piece, and have written here on occasion, Glass-Steagall separation would not have prevented the financial crisis. The institutions that failed were either standalone investment banks, GSE’s, insurance companies involved in non-traditional insurance activities, or S&Ls. Universal banks that were shaky (Citi, Wachovia) were undermined by traditional lending activities. Wachovia, for instance, was heavily exposed to mortgage lending through its acquisition of a big S&L (Golden West Financial). There was no vector of contagion between the investment banking activities and the stability of any large universal bank.

As I say in The Hill, whenever the same prescription is given for wildly different diseases, it’s almost certainly a nostrum, rather than a cure.

Which puts me at odds with Donald Trump, for he is prescribing this nostrum. Perhaps in an effort to bring more clicks to my oped, the Monday after it appeared Trump endorsed a Glass-Steagall revival. This was vintage Trump. You can see his classic MO. He has a vague idea about a problem–TBTF. Not having thought deeply about it, he seizes upon a policy served up by one of his advisors (in this case, Gary Cohn, ex-Goldman–which would benefit from a GS revival), and throws it out there without much consideration.

The main bright spot in the Trump presidency has been his regulatory rollback, in part because this is one area in which he has some unilateral authority. Although I agree generally with this policy, I am under no illusions that it rests on deep intellectual foundations. His support of Son of Glass-Steagall shows this, and illustrates that no one (including Putin!) should expect an intellectually consistent (or even coherent) policy approach. His is, and will be, an instinctual presidency. Sometimes his instincts will be good. Sometimes they will be bad. Sometimes his instincts will be completely contradictory–and the call for a return to a very old school regulation in the midst of a largely deregulatory presidency shows that quite clearly.

February 14, 2017

“First, Kill All the Economists!” Sounds Great to Some, But It Won’t Fix Monetary Policy

Filed under: Economics,Financial crisis,Financial Crisis II,History,Regulation — The Professor @ 9:00 pm

A former advisor to the Dallas Fed has penned a book blasting the Fed for being ruled by a “tribe” of insular egghead economics PhDs:

In her book, Ms. Booth describes a tribe of slow-moving Fed economists who dismiss those without high-level academic credentials. She counts Fed Chairwoman Janet Yellen and former Fed leader Ben Bernanke among them. The Fed, Mr. Bernanke and the Dallas Fed declined to comment.

The Fed’s “modus operandi” is defined by “hubris and myopia,” Ms. Booth writes in an advance copy of the book. “Central bankers have invited politicians to abdicate leadership authority to an inbred society of PhD academics who are infected to their core with groupthink, or as I prefer to think of it: ‘groupstink.’”

“Global systemic risk has been exponentially amplified by the Fed’s actions,” Ms. Booth writes, referring to the central bank’s policies holding interest rates very low since late 2008. “Who will pay when this credit bubble bursts? The poor and middle class, not the elites.”

Ms. Booth is an acolyte of her former boss, Dallas Fed chair Richard Fisher, who said “If you rely entirely on theory, you are not going to conduct the right policy, because policies have consequences.”

I have very mixed feelings about this. There is no doubt that under the guidance of academics, including (but not limited to) Ben Bernanke, that the Fed has made some grievous errors. But it is a false choice to claim that Practical People can do better without a coherent theoretical framework. For what is the alternative to theory? Heuristics? Rules of thumb? Experience?

Two thinkers usually in conflict–Keynes and Hayek–were of one mind on this issue. Keynes famously wrote:

Practical men who believe themselves to be quite exempt from any intellectual influence, are usually the slaves of some defunct economist. Madmen in authority, who hear voices in the air, are distilling their frenzy from some academic scribbler of a few years back.

For his part, Hayek said “without a theory the facts are silent.”

Everybody–academic economist or no–is beholden to some theory or another. It is a conceit of non-academics to believe that they are “exempt from any intellectual influence.” Indeed, the advantage of following an explicit theoretical framework is that its assumptions and implications are transparent and (usually) testable, and therefore can be analyzed, challenged, and improved. An inchoate and largely informal “practical” mindset (which often is a hodgepodge of condensed academic theories) is far more amorphous and difficult to understand or challenge. (Talk to a trader about monetary policy sometime if you doubt me.)

Indeed, Ms. Booth gives evidence of this. Many have been prophesying doom as a result of the Fed’s (and the ECB’s) post-2008 policies: Ms. Booth is among them. I will confess to having harbored such concerns, and indeed, challenged Ben Bernanke on this at a Fed conference on Jekyll Island in May, 2009. It may happen sometime, and I believe that ZIRP has indeed distorted the economy, but my fears (and Ms. Booth’s) have not been realized in eight plus years.

Ms. Booth’s critique of pre-crisis Fed policy is also predicated on a particular theoretical viewpoint, namely, that the Fed fueled a credit bubble prior to the Crash. But as scholars as diverse as Scott Sumner and John Taylor have argued, Fed policy was actually too tight prior to the crisis.

Along these lines, one could argue that the Fed’s most egregious errors are not the consequence of deep DSGE theorizing, but instead result from the use of rules of thumb and a failure to apply basic economics. As Scott Sumner never tires of saying (and sadly, must keep repeating because those who are slaves to the rule of thumb are hard of hearing and learning), the near universal practice of using interest rates as a measure of the state of monetary policy is a category error: befitting a Chicago-trained economist, Scott cautions never to argue from a price change, but to look for the fundamental supply and demand forces that cause a price (e.g., an interest rate) to be high or low. (As a Chicago guy, I have been beating the same drum for more than 30 years.)

And some historical perspective is in order. The Fed’s history is a litany of fumbles, some relatively minor, others egregious. Blame for the Great Depression and the Great Inflation can be laid directly at the Fed’s feet. Its most notorious failings were not driven by the prevailing academic fashion, but occurred under the leadership of practical people, mainly people with a banking background, who did quite good impressions of madmen in authority. Ms. Booth bewails the “hubris of Ph.D. economists who’ve never worked on the Street or in the City,” but people who have worked there have screwed up monetary policy when they’ve been in charge.

As tempting as it may sound, “First, kill all the economists!” is not a prescription for better monetary policy. Economists may succumb to hubris (present company excepted, of course!) but the real hubris is rooted in the belief that central banks can overcome the knowledge problem, and can somehow manage entire economies (and the stability of the financial system). Hayek pointedly noted the “fatal conceit” of central planning. That conceit is inherent in central banking, too, and is not limited to professionally trained economists. Indeed, I would venture that academics are less vulnerable to it.

The problem, therefore, is not who captains the monetary ship. The question is whether anyone is capable of keeping such a huge and unwieldy vessel off the shoals. Experience–and theory!–suggests no.

February 4, 2017

The Regulatory Road to Hell

One of the most encouraging aspects of the new administration is its apparent commitment to roll back a good deal of regulation. Pretty much the entire gamut of regulation is under examination, and even Trump’s nominee for the Supreme Court, Neil Gorsuch, represents a threat to the administrative state due to his criticism of Chevron Deference (under which federal courts are loath to question the substance of regulations issued by US agencies).

The coverage of the impending regulatory rollback is less than informative, however. Virtually every story about a regulation under threat frames the issue around the regulation’s intent. The Fiduciary Rule “requires financial advisers to act in the best interests of their clients.” The Stream Protection Rule prevents companies from “dumping mining waste into streams and waterways.” The SEC rule on reporting of payments to foreign governments by energy and minerals firms “aim[s] to address the ‘resource curse,’ in which oil and mineral wealth in resource-rich countries flows to government officials and the upper classes, rather than to low-income people.” Dodd-Frank is intended to prevent another financial crisis. And on and on.

Who could be against any of these things, right? This sort of framing therefore makes those questioning the regulations out to be ogres, or worse, favoring financial skullduggery, rampant pollution, bribery and corruption, and reckless behavior that threatens the entire economy.

But as the old saying goes, the road to hell is paved with good intentions, and that is definitely true of regulation. Regulations often have unintended consequences–many of which are directly contrary to the stated intent. Furthermore, regulations entail costs as well as benefits, and just focusing on the benefits gives a completely warped understanding of the desirability of a regulation.

Take Frankendodd. It is bursting with unintended consequences. Most notably, quite predictably (and predicted here, early and often) the huge increase in regulatory overhead actually favors consolidation in the financial sector, and reinforces the TBTF problem. It also has been devastating to smaller community banks.

DFA also works at cross purposes. Consider the interaction between the leverage ratio, which is intended to ensure that banks are sufficiently capitalized, and the clearing mandate, which is intended to reduce systemic risk arising from the derivatives markets. The interpretation of the leverage ratio (notably, treating customer margins held by FCMs as an FCM asset, which increases the amount of capital an FCM must hold) makes offering clearing services more expensive. This is exacerbating the marked consolidation among FCMs, which is contrary to the stated purpose of Dodd-Frank. Moreover, it means that some customers will not be able to find clearing firms, or will find using derivatives to manage risk prohibitively expensive. This undermines the ability of the derivatives markets to allocate risk efficiently.

Therefore, to describe regulations by their intentions, rather than their effects, is highly misleading. Many of the effects are unintended, and directly contrary to the explicit intent.

One effect of regulations is that they impose costs, both direct and indirect.  A realistic appraisal of regulation requires a thorough evaluation of both benefits and costs. Such evaluations are almost completely lacking in the media coverage, except to cite some industry source complaining about the cost burden. But in the context of most articles, this comes off as special pleading, and therefore suspect.

Unfortunately, much cost benefit analysis–especially that carried out by the regulatory agencies themselves–is a bad joke. Indeed, since the agencies in question often have an institutional or ideological interest in their regulations, their “analyses” should be treated as a form of special pleading of little more reliability than the complaints of the regulated. The proposed position limits regulation provides one good example of this. Costs are defined extremely narrowly, benefits very broadly. Indirect impacts are almost completely ignored.

As another example, Tyler Cowen takes a look into the risible cost benefit analysis behind the Stream Protection Rule, and finds it seriously wanting. Even though he is sympathetic to the goals of the regulation, and even to the largely tacit but very real meta-intent (reducing the use of coal in order to advance the climate change agenda), he is repelled by the shoddiness of the analysis.

Most agency cost benefit analysis is analogous to asking pupils to grade their own work, and gosh darn it, wouldn’t you know, everybody’s an A student!

This is particularly problematic under Chevron Deference, because courts seldom evaluate the substance of the regulations or the regulators’ analyses. There is no real judicial check and balance on regulators.

The metastasizing regulatory and administrative state is a very real threat to economic prosperity and growth, and to individual freedom. The lazy habit of describing regulations and regulators by their intent, rather than their effects, shields them from the skeptical scrutiny that they deserve, and facilitates this dangerous growth. If the Trump administration and Congress proceed with their stated plans to pare back the Obama administration’s myriad and massive regulatory expansion, this intent-focused coverage will be one of the biggest obstacles that they will face.  The media is the regulators’ most reliable paving contractor for the highway to hell.
