Streetwise Professor

April 4, 2017

The Unintended Consequences of Blockchain Are Not Unpredictable: Respond Now Rather Than Repent Later*

Filed under: Clearing,Commodities,Derivatives,Economics,Regulation — The Professor @ 3:39 pm

In the past week the WSJ and the FT have run articles about a new bank-led initiative to move commodity trading onto a blockchain. In many ways, this makes great sense. By its nature, the process of recording commodity trades and shipments (a) collectively involves large numbers of spatially dispersed counterparties, (b) entails myriad terms, and (c) can give rise to costly disputes. As a result of these factors, the process is currently very labor intensive, fraught with operational risk (e.g., inadvertent errors), and vulnerable to fraud (cf. the Qingdao metals warehouse scandal of 2014). In theory, blockchain has the ability to reduce costs, errors, and fraud. Thus, it is understandable that traders and banks are quite keen on its potential to reduce costs and perhaps even revolutionize the trading business.

But before you get too excited, a remark by my friend Christophe Salmon at Trafigura carries deep implications that should give you pause and lead you to consider the likely consequences of widespread adoption of blockchain:

Christophe Salmon, Trafigura’s chief financial officer, said there would need to be widespread adoption by major oil traders and refiners to make blockchain in commodity trading viable in the long term.

This seemingly commonsense and innocuous remark is actually laden with implications of unintended consequences that should be recognized and considered now, before the blockchain train gets too far down the track.

In essence, Christophe’s remark means that to be viable blockchain has to scale. If it doesn’t scale, it won’t reduce cost. But if it does scale, a blockchain for a particular application is likely to be a natural monopoly, or at most a natural duopoly. (Issues of scope economies are also potentially relevant, but I’ll defer discussion of that for now.)

Indeed, if there are no technical impediments to scaling (which in itself is an open question–note the block size debate in Bitcoin), the “widespread adoption” feature that Christophe identifies as essential means that network effects create scale economies that are likely to result in the dominance of a single platform. Traders will want to record their business on the blockchain that their counterparties use. Since many trade with many, this creates a centripetal force that will tend to draw everyone to a single blockchain.

I can hear you say: “Well, if there is a public blockchain, that happens automatically because everyone has access to it.” But the nature of public blockchain means that it faces extreme obstacles that make it wildly impractical for commercial adoption on the scale being considered, not just in commodity markets, but in virtually every aspect of the financial markets. Commercial blockchains will be centrally governed, limited-access, private systems rather than radically decentralized, open-access commons.

The “forking problem” alone is a serious difficulty. As demonstrated by Bitcoin in 2013 and Ethereum in 2016, public blockchains based on open source are vulnerable to “forking,” whereby uncoordinated changes in the software (inevitable in an open source system that lacks central governance and coordination) result in the simultaneous existence of multiple, parallel blockchains. Such forking would destroy the network economy/scale effects that make the idea of a single database attractive to commercial participants.

Prevention of forking requires central governance to coordinate changes in the code–something that offends the anarcho-libertarian spirits who view blockchain as a totally decentralized mechanism.

Other aspects of the pure version of an open, public blockchain make it inappropriate for most financial and commercial applications. For instance, public blockchain is touted because it does not require trust in the reputation of large entities such as clearing networks or exchanges. But the ability to operate without trust does not come for free.

Trust and reputation are indeed costly: as Becker and Stigler first noted decades ago, and others have formalized since, reputation is a bonding mechanism that requires the trusted entity to incur sunk costs that would be lost if it violates trust. (Alternatively, the trusted entity has to have market power–which is costly–that generates a stream of rents that is lost when trust is violated. That is, to secure trust prices have to be higher and output lower than would be necessary in a zero transactions cost world.)

But public blockchains have not been able to eliminate trust without cost. In Bitcoin, trust is replaced with “proof of work.” Well, work means cost. The blockchain mining industry consumes vast amounts of electricity and computing power in order to prove work. It is highly likely that the cost of creating trusted entities is lower than the cost of proof of work or alternative ways of eliminating the need for trust. Thus, a (natural monopoly) commercial blockchain is likely to have to be a trusted centralized institution, rather than a decentralized anarchist’s wet-dream.
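The cost of proof of work is easy to see in a toy example. Below is a minimal sketch (Python; the string, difficulty, and use of SHA-256 are illustrative stand-ins, not Bitcoin’s actual protocol parameters) of the asymmetry that makes mining expensive: finding a valid nonce takes tens of thousands of hash evaluations on average, while checking a proposed nonce takes a single hash.

```python
import hashlib

def proof_of_work(data: str, difficulty: int = 4) -> int:
    """Search for a nonce such that SHA-256(data + nonce) begins with
    `difficulty` hex zeros. The search itself is the 'work'."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1  # every failed guess is burned computation

nonce = proof_of_work("block 42")
# Verification is one hash; finding the nonce took ~16**4 (about 65,000) tries on average.
assert hashlib.sha256(f"block 42{nonce}".encode()).hexdigest().startswith("0000")
```

The Bitcoin network’s electricity bill is essentially this loop run at industrial scale; that expenditure is the price of dispensing with a trusted party.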

Blockchain is also touted as permitting “smart contracts,” which automatically execute certain actions when certain pre-defined (and coded) contingencies are met. But “smart contracts” is not a synonym for “complete contracts,” i.e., contracts in which every possible contingency is anticipated, and each party’s actions under each contingency are specified. Thus, even with smart (but incomplete) contracts, there will inevitably arise unanticipated contingencies.

Parties will have to negotiate what to do under these contingencies. Given that this will usually be a bilateral bargaining situation under asymmetric information, the bargaining will be costly and sometimes negotiations will break down. Moreover, under some contingencies the smart contracts will automatically execute actions that the parties do not expect and would like to change: here, self-execution prevents such contractual revisions, or at least makes them very difficult.
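The self-execution problem can be made concrete with a deliberately crude sketch (Python standing in for an actual smart-contract language; the threshold and margin figures are invented for illustration). The contract executes its coded contingency faithfully, but it cannot tell an anticipated scenario from an unanticipated one, and there is no step at which the parties can intervene before it fires:

```python
# Toy "smart contract": an action fires automatically when a coded contingency is met.
def make_margin_contract(threshold: float, margin_call: float):
    def on_price_update(price: float) -> str:
        if price < threshold:
            # Self-executing: no renegotiation step exists between trigger and transfer.
            return f"transfer {margin_call} margin"
        return "no action"
    return on_price_update

contract = make_margin_contract(threshold=50.0, margin_call=1_000_000.0)
assert contract(60.0) == "no action"
# A momentary bad print of 49.99 -- a contingency the parties never contemplated --
# triggers the transfer all the same, and immutability rules out unwinding it.
assert contract(49.99) == "transfer 1000000.0 margin"
```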

Indeed, it may be the execution of a contractual term that first makes the parties aware that something has gone horribly wrong. Here another touted feature of pure blockchain–immutability–can become a problem. The revelation of information ex post may lead market participants to desire to change the terms of their contract. Can’t do that if the contracts are immutable.

Paper and ink contracts are inherently incomplete too, and this is why there are centralized mechanisms to address incompleteness. These include courts, but also, historically, bodies like stock or commodity exchanges, or merchants’ associations (in diamonds, for instance), which have helped adjudicate disputes and re-do deals that turn out to be inefficient ex post. The existence of institutions to facilitate the efficient adaptation of parties to contractual incompleteness demonstrates that in the real world, man does not live (or transact) by contract alone.

Thus, the benefits of a mechanism for adjudicating and responding to contractual incompleteness create another reason for a centralized authority for blockchain, even–or especially–blockchains with smart contracts.

Further, the blockchain (especially with smart contracts) will be a complex interconnected system, in the technical sense of the term. There will be myriad possible interactions between individual transactions recorded on the system, and these interactions can lead to highly undesirable, and entirely unpredictable, outcomes. A centralized authority can greatly facilitate the response to such crises. (Indeed, years ago I posited this as one of the reasons for integration of exchanges and clearinghouses.)

And the connections are not only within a particular blockchain. There will be connections between blockchains, and between a blockchain and other parts of the financial system. Consider for example smart contracts that in a particular contingency dictate large cash flows (e.g., margin calls) from one group of participants to another. This will lead to a liquidity shock that will affect banks, funding markets, and liquidity supply mechanisms more broadly. Since the shock can be destabilizing and lead to actions that are individually rational but systemically destructive if uncoordinated, central coordination can improve efficiency and reduce the likelihood of a systemic crisis. That’s not possible with a radically decentralized blockchain.

I could go on, but you get the point: there are several compelling reasons for centralized governance of a commercial blockchain like that envisioned for commodity trading. Indeed, many of the features that attract blockchain devotees are bugs–and extremely nasty ones–in commercial applications, especially if adopted at large scale as is being contemplated. As one individual who works on commercializing blockchain told me: “Commercial applications of blockchain will strip out all of the features that the anarchists love about it.”

So step back for a minute. First, Christophe’s point about “widespread adoption” and an understanding of the network economies inherent in the financial and commercial applications of blockchain mean that blockchain is likely to be a natural monopoly in a particular application (e.g., physical oil trading), and likely across applications due to economies of scope (which plausibly exist because major market participants will transact in multiple segments, and because of the ability to use common coding across different applications, to name just two factors). Second, a totally decentralized, open access, public blockchain has numerous disadvantages in large-scale commercial applications: central governance creates value.

Therefore, commercial blockchains will be “permissioned” in the lingo of the business. That is, unlike public blockchain, entry will be limited to privileged members and their customers. Moreover, the privileged members will govern and control the centralized entity. It will be a private club, not a public commons. (And note that even the Bitcoin blockchain is not ungoverned. Everyone is equal, but the big miners–and there are now a relatively small number of big miners–are more equal than others. The Iron Law of Oligarchy applies in blockchain too.)

Now add another factor: the natural monopoly blockchain will likely not be contestable, for reasons very similar to the ones I have written about for years to demonstrate why futures and equity exchanges are typically natural monopolies that earn large rents because they are largely immune from competitive entry. Once a particular blockchain gets critical mass, there will be the lock-in problem from hell: a coordinated movement of a large set of users from the incumbent to a competitor will be necessary for the entrant to achieve the scale necessary to compete. This is difficult, if not impossible, to arrange. Three Finger Brown could count the number of times that has happened in futures trading on his bad hand.

Now do you understand why banks are so keen on the blockchain? Yes, they couch it in terms of improving transactional efficiency, and it does that. But it also presents the opportunity to create monopoly financial market infrastructures that are immune from competitive entry. The past 50 years have seen an erosion of bank dominance–“disintermediation”–that has also eroded their rents. Blockchain gives the empire a chance to strike back. A coalition of banks (and note that most blockchain initiatives are driven by a bank-led cooperative, sometimes in partnership with a technology provider or providers) can form a blockchain for a particular application or applications, exploit the centripetal force arising from network effects, and gain a natural monopoly largely immune from competitive entry. Great work if you can get it. And believe me, the banks are trying. Very hard.

Left to develop on its own, therefore, the blockchain ecosystem will evolve to look like the exchange ecosystem of the 19th or early-20th centuries. Monopoly coalitions of intermediaries–“clubs” or “cartels”–offering transactional services, with member governance, and with the members reaping economic rents.

Right now regulators are focused on the technology, and (like many others) seem to be smitten with the potential of the technology to reduce certain costs and risks. They really need to look ahead and consider the market structure implications of that technology. Just as the natural monopoly nature of exchanges eventually led to intense disputes over the distribution of the benefits that they created, which in turn led to regulation (after bitter political battles), the fundamental economics of blockchain are likely to result in similar conflicts.

The law and regulation of blockchain is likely to be complicated and controversial precisely because natural monopoly regulation is inherently complicated and controversial. The yin and yang of financial infrastructure in particular is that the technology likely makes monopoly efficient, but also creates the potential for the exercise of market power (and, I might add, the exercise of political power to support and sustain market power, and to influence the distribution of rents that result from that market power). Better to think about those things now when things are still developing, than when the monopolies are developed, operating, and entrenched–and can influence the political and regulatory process, as monopolies are wont to do.

The digital economy is driven by network effects: think Google, Facebook, Amazon, and even Twitter. In addition to creating new efficiencies, these dominant platforms create serious challenges for competition, as scholars like Ariel Ezrachi and Maurice Stucke have shown:

Peter Thiel, the successful venture capitalist, famously noted that ‘Competition Is for Losers.’ That useful phrase captures the essence of many technology markets. Markets in which the winner of the competitive process is able to cement its position and protect it. Using data-driven network effects, it can undermine new entry attempts. Using deep pockets and the nowcasting radar, the dominant firm can purchase disruptive innovators.

Our new economy enables the winners to capture much more of the welfare. They are able to affect downstream competition as well as upstream providers. Often, they can do so with limited resistance from governmental agencies, as power in the online economy is not always easily captured using traditional competition analysis. Digital personal assistants, as we explore, have the potential to strengthen the winner’s gatekeeper power.

Blockchain will do the exact same thing.

You’ve been warned.

*My understanding of these issues has benefited greatly from many conversations over the past year with Izabella Kaminska, who saw through the hype well before pretty much anyone. Any errors herein are of course mine.


March 27, 2017

Seeing the OTC Derivatives Markets (and the Financial Markets) Like a State

Filed under: Clearing,Derivatives,Economics,Regulation — The Professor @ 12:07 pm

In the years since the financial crisis, and in particular the period preceding and immediately following the passage of Frankendodd, I can’t tell you how many times I saw diagrams that looked like this:

[Figure 1: a tangle of bilateral connections among OTC derivatives counterparties]

[Figure 2: a hub-and-spoke network with a CCP at the hub]

The top diagram is a schematic representation of an OTC derivatives market, with a tangle of bilateral connections between counterparties. The second is a picture of a hub-and-spoke trading network with a CCP serving as the hub. (These particular versions of this comparison are from a 2013 Janet Yellen speech.)

These diagrams came to mind when re-reading James Scott’s Seeing Like a State and his Two Cheers for Anarchism. Scott argues that states have an obsession with making the societies they rule over “legible” in order to make them easier to tax, regulate, and control. States are confounded by evolved complexity and emergent orders: such systems are difficult to comprehend, and what cannot be comprehended cannot be easily ruled. So states attempt to impose schemes to simplify such complex orders. Examples that Scott gives include standardization of language and suppression of dialects; standardization of land tenure, measurements, and property rights; cadastral censuses; population censuses; the imposition of familial names; and urban renewal (e.g., Haussmann’s/Napoleon III’s massive reconstruction of Paris). These things make a populace easier to tax, conscript, and control.

Complex realities of emergent orders are too difficult to map. So states conceive of a mental map that is legible to them, and then impose rules on society to force it to conform with this mental map.

Looking back at the debate over OTC markets generally, and clearing, centralized execution, and trade reporting in particular, it is clear that legislators and regulators (including central banks) found these markets to be illegible. Figures like the first one–which are themselves a greatly simplified representation of OTC reality–were bewildering and disturbing to them. The second figure was much more comprehensible, and much more comforting: not just because they could comprehend it better, but because it gave them the sense that they could impose an order that would be easier to monitor and control. The emergent order was frightening in its wildness: the sense of imposing order and control was deeply comforting.

But as Scott notes, attempts to impose control on emergent orders (which in Scott’s books include both social and natural orders, e.g., forests) themselves carry great risks because although hard to comprehend, these orders evolved the way they did for a reason, and the parts interact in poorly understood–and sometimes completely not understood–ways. Attempts to make reality fit a simple mental map can cause the system to react in unpredicted and unpredictable ways, many of which are perverse.

My criticism of the attempts to “reform” OTC markets was largely predicated on my view that the regulators’ simple mental maps did great violence to complex reality. Even though these “reform” efforts were framed as ways of reducing systemic risk, they were fatally flawed because they were profoundly unsystemic in their understanding of the financial system. My critique focused specifically on the confident assertions based on the diagrams presented above. By focusing only on the OTC derivatives market, and ignoring the myriad connections of this market to other parts of the financial market, regulators could not have possibly comprehended the systemic implications of what they were doing. Indeed, even the portrayal of the OTC market alone was comically simplistic. The fallacy of composition played a role here too: the regulators thought they could reform the system piece-by-piece, without thinking seriously about how these pieces interacted in non-linear ways.

The regulators were guilty of the hubris illustrated beautifully by the parable of Chesterton’s Fence:

In the matter of reforming things, as distinct from deforming them, there is one plain and simple principle; a principle which will probably be called a paradox. There exists in such a case a certain institution or law; let us say, for the sake of simplicity, a fence or gate erected across a road. The more modern type of reformer goes gaily up to it and says, “I don’t see the use of this; let us clear it away.” To which the more intelligent type of reformer will do well to answer: “If you don’t see the use of it, I certainly won’t let you clear it away. Go away and think. Then, when you can come back and tell me that you do see the use of it, I may allow you to destroy it.”

In other words, the regulators should have understood the system and why it evolved the way that it did before leaping in to “reform” it. As Chesterton says, such attempts at reformation quite frequently result in deformation.

Somewhat belatedly, there are efforts underway to map the financial system more accurately. The work of Richard Bookstaber and various colleagues under the auspices of the Office of Financial Research to create multilayer maps of the financial system is certainly a vast improvement on the childish stick figure depictions of Janet Yellen, Gary Gensler, Timmy Geithner, Chris Dodd, Barney Frank et al. But even these more sophisticated maps are extreme abstractions, not least because they cannot capture incentives, the distribution of information among myriad market participants, and the motivations and behaviors of these participants. Think of embedding these maps in the most complicated extensive form large-N player game you can imagine, and you might have some inkling of how inadequate any schematic representation of the financial system is likely to be. When you combine this with the fact that in complex systems, even slight changes in initial conditions can result in completely different outcomes, the futility of “seeing like a state” in this context becomes apparent. The map of initial conditions is inevitably crude, making it an unreliable guide to understanding the system’s future behavior.

In my view, Scott goes too far. There is no doubt that some state-driven standardization has dramatically reduced transactions costs and opened up new possibilities for wealth-enhancing exchanges (at some cost, yes, but these costs are almost certainly less than the benefit), but Scott looks askance at virtually all such interventions. Thus, I do not exclude the possibility of true reform. But Scott’s warning about the dangers of forcing complex emergent orders to conform to simplified, “legible”, mental constructs must be taken seriously, and should inform any attempt to intervene in something like the financial system. Alas, this did not happen when legislators and regulators embarked on their crusade to reorganize wholesale the world financial system. It is frightening indeed to contemplate that this crusade was guided by crude mental maps like those supposedly illustrating the virtues of moving from an emergent bilateral OTC market to a tamed hub-and-spoke cleared one.

PS. I was very disappointed by this presentation by James Scott. He comes off as a doctrinaire leftist anthropologist (but I repeat myself), which is definitely not the case in his books. Indeed, the juxtaposition of Chesterton and Scott shows how deeply conservative Scott is (in the literal sense of the word).


February 11, 2017

Risk Gosplan Works Its Magic in Swaps Clearing

Filed under: Clearing,Commodities,Derivatives,Economics,Politics,Regulation — The Professor @ 4:18 pm

Deutsche Bank quite considerately provided a real time example of an unintended consequence of Frankendodd, specifically, capital requirements causing firms to exit from clearing. The bank announced it is continuing to provide futures clearing, but is exiting US swaps clearing, due to capital cost concerns.

While Deutsche was not specific in citing the treatment of margins under the leverage ratio as the reason for its exit, this is the most likely culprit. Recall that even segregated margins (which a bank has no access to) are treated as bank assets under the leverage rule, so a swaps clearer must hold capital against assets over which it has no control (because all swap margins are segregated), which it cannot use to fund its own activities, and which are not funded by a liability issued by the clearer.
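A stylized calculation shows how the charge decouples from risk. Suppose (purely hypothetical numbers, not actual regulatory parameters) a 5 percent leverage ratio and a clearer with $100bn of balance-sheet assets plus $40bn of segregated client margin it cannot touch:

```python
# Illustrative only: the ratio and balance-sheet figures are hypothetical.
def leverage_capital(own_assets: float, client_margin: float,
                     leverage_ratio: float = 0.05) -> float:
    """Required capital when segregated client margin is counted in the
    leverage-ratio exposure measure alongside the bank's own assets."""
    return leverage_ratio * (own_assets + client_margin)

with_margin = leverage_capital(own_assets=100e9, client_margin=40e9)
without_margin = leverage_capital(own_assets=100e9, client_margin=0.0)
extra = with_margin - without_margin
# ~$2bn of capital held against assets the clearer neither controls nor funds.
assert abs(extra - 2e9) < 1.0
```

Because the margin is segregated, that extra $2bn of capital absorbs no incremental risk; it simply makes the clearing business more expensive to run.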

It’s perverse, and is emblematic of the mixed signals in Frankendodd: CLEAR SWAPS! CLEARING SWAPS IS EXTREMELY CAPITAL INTENSIVE SO YOU WON’T MAKE ANY MONEY DOING IT! Yeah. That will work out swell.

Of course Deutsche Bank has its own issues, and because of those issues it faces more acute capital concerns than other institutions (especially American ones). But here is a case where the capital cost does not at all match up with risk (and remember that capital is intended to be a risk absorber). So, looking for ways to economize on capital, Deutsche exited a business where the capital charge did not generate any commensurate return, and furthermore was unrelated to the actual risk of the business. If the pricing of risk had been more sensible, Deutsche might have scaled back other businesses where capital charges reflected risk more accurately. Here, the effect of the leverage ratio is all pain, no gain.

When interviewed by Risk Magazine about the Fundamental Review of the Trading Book, I said: “The FRTB’s standardised approach is basically central planning of risk pricing, and it will produce Gosplan-like results.” The leverage ratio, especially as applied to swaps margins, is another example of central planning of risk pricing, and here indeed it has produced Gosplan-like results.

And in the case of clearing, these results are exactly contrary to a crucial ostensible purpose of DFA: reducing size and concentration in banking generally, and in derivatives markets in particular. For as the FT notes:

The bank’s exit will reignite concerns that the swaps clearing business is too concentrated among a handful of large players. The top three swaps clearers account for more than half the market by client collateral required, while the top five account for over 75 per cent.

So swaps clearing is now hyper-concentrated, and dominated by a handful of systemically important banks (e.g., Citi, Goldman). It is more concentrated than the bilateral swaps dealer market was. Trouble at one of these dominant swaps clearers would create serious risks for the CCPs that they clear for (which, by the way, are all interconnected because the same clearing members dominate all the major CCPs). Moreover, concentration dramatically reduces the benefits of mutualizing risk: because of the small number of clearers, the risk of a big CM failure will be borne by a small number of firms. This isn’t insurance in any meaningful way, and does not achieve the benefits of risk pooling even if, in the first instance, only a single big clearing member runs into trouble due to a shock idiosyncratic to it.

At present, there is much gnashing of teeth and rending of garments at the prospect of even tweaks in Dodd-Frank. Evidently, the clearing mandate is not even on the table. But this one vignette demonstrates that Frankendodd and banking regulation generally is shot through with provisions intended to reduce systemic risk which do not have that effect, and indeed, likely have the perverse effect of creating some systemic risks. Viewing Dodd-Frank as a sacred cow and any proposed change to it as a threat to the financial system is utterly wrongheaded, and will lead to bad outcomes.

Barney and Chris did not come down Mount Sinai with tablets containing commandments written by the finger of God. They sat on Capitol Hill and churned out hundreds of pages of laws based on a cartoonish understanding of the financial system, information provided by highly interested parties, and a frequently false narrative of the financial crisis. These laws, in turn, have spawned thousands of pages of regulation, good, bad, and very ugly. What is happening in swaps clearing is very ugly indeed, and provides a great example of how major portions of Dodd-Frank and the regulations emanating from it need a thorough review and in some cases a major overhaul.

And if Elizabeth Warren loses her water over this: (a) so what else is new? and (b) good! Her Manichean view of financial regulation is a major impediment to getting the regulation right. What is happening in swaps clearing is a perfect illustration of why a major midcourse correction in the trajectory of financial regulation is imperative.


February 4, 2017

The Regulatory Road to Hell

One of the most encouraging aspects of the new administration is its apparent commitment to roll back a good deal of regulation. Pretty much the entire gamut of regulation is under examination, and even Trump’s nominee for the Supreme Court, Neil Gorsuch, represents a threat to the administrative state due to his criticism of Chevron Deference (under which federal courts are loath to question the substance of regulations issued by US agencies).

The coverage of the impending regulatory rollback is less than informative, however. Virtually every story about a regulation under threat frames the issue around the regulation’s intent. The Fiduciary Rule “requires financial advisers to act in the best interests of their clients.” The Stream Protection Rule prevents companies from “dumping mining waste into streams and waterways.” The SEC rule on reporting of payments to foreign governments by energy and minerals firms “aim[s] to address the ‘resource curse,’ in which oil and mineral wealth in resource-rich countries flows to government officials and the upper classes, rather than to low-income people.” Dodd-Frank is intended to prevent another financial crisis. And on and on.

Who could be against any of these things, right? This sort of framing therefore makes those questioning the regulations out to be ogres, or worse, favoring financial skullduggery, rampant pollution, bribery and corruption, and reckless behavior that threatens the entire economy.

But as the old saying goes, the road to hell is paved with good intentions, and that is definitely true of regulation. Regulations often have unintended consequences–many of which are directly contrary to the stated intent. Furthermore, regulations entail costs as well as benefits, and just focusing on the benefits gives a completely warped understanding of the desirability of a regulation.

Take Frankendodd. It is bursting with unintended consequences. Most notably, quite predictably (and predicted here, early and often) the huge increase in regulatory overhead actually favors consolidation in the financial sector, and reinforces the TBTF problem. It also has been devastating to smaller community banks.

DFA also works at cross purposes. Consider the interaction between the leverage ratio, which is intended to ensure that banks are sufficiently capitalized, and the clearing mandate, which is intended to reduce systemic risk arising from the derivatives markets. The interpretation of the leverage ratio (notably, treating customer margins held by FCMs as an FCM asset, which increases the amount of capital the FCM must hold due to the leverage ratio) makes offering clearing services more expensive. This is exacerbating the marked consolidation among FCMs, which is contrary to the stated purpose of Dodd-Frank. Moreover, it means that some customers will not be able to find clearing firms, or will find using derivatives to manage risk prohibitively expensive. This undermines the ability of the derivatives markets to allocate risk efficiently.

Therefore, to describe regulations by their intentions, rather than their effects, is highly misleading. Many of the effects are unintended, and directly contrary to the explicit intent.

One effect of regulation is that it imposes costs, both direct and indirect. A realistic appraisal of regulation requires a thorough evaluation of both benefits and costs. Such evaluations are almost completely lacking in the media coverage, except to cite some industry source complaining about the cost burden. But in the context of most articles, this comes off as special pleading, and therefore suspect.

Unfortunately, much cost benefit analysis–especially that carried out by the regulatory agencies themselves–is a bad joke. Indeed, since the agencies in question often have an institutional or ideological interest in their regulations, their “analyses” should be treated as a form of special pleading of little more reliability than the complaints of the regulated. The proposed position limits regulation provides one good example of this. Costs are defined extremely narrowly, benefits very broadly. Indirect impacts are almost completely ignored.

As another example, Tyler Cowen takes a look into the risible cost benefit analysis behind the Stream Protection Rule, and finds it seriously wanting. Even though he is sympathetic to the goals of the regulation, and even to the largely tacit but very real meta-intent (reducing the use of coal in order to advance the climate change agenda), he is repelled by the shoddiness of the analysis.

Most agency cost benefit analysis is analogous to asking pupils to grade their own work, and gosh darn it, wouldn’t you know, everybody’s an A student!

This is particularly problematic under Chevron Deference, because courts seldom evaluate the substance of the regulations or the regulators’ analyses. There is no real judicial check and balance on regulators.

The metastasizing regulatory and administrative state is a very real threat to economic prosperity and growth, and to individual freedom. The lazy habit of describing regulations and regulators by their intent, rather than their effects, shields them from the skeptical scrutiny that they deserve, and facilitates this dangerous growth. If the Trump administration and Congress proceed with their stated plans to pare back the Obama administration’s myriad and massive regulatory expansion, this intent-focused coverage will be one of the biggest obstacles that they will face. The media is the regulators’ most reliable paving contractor for the highway to hell.


December 30, 2016

For Whom the (Trading) Bell Tolls

Filed under: Clearing,Commodities,Derivatives,Economics,Energy,Exchanges,History — The Professor @ 7:40 pm

It tolls for the NYMEX floor, which went dark for the final time with the close of trading today. It follows all the other New York futures exchange floors which ICE closed in 2012. This leaves the CME and CBOE floors in Chicago, and the NYSE floor, all of which are shadows of shadows of their former selves.

Next week I will participate in a conference in Chicago. I’ll be talking about clearing, but one of the other speakers will discuss regulating latency arbitrage in the electronic markets that displaced the floors. In some ways, all the hyperventilating over latency arbitrages due to speed advantages measured in microseconds and milliseconds in computerized markets is amusing, because the floors were all about latency arbitrage. Latency arbitrage basically means that some traders have a time and space advantage, and that’s what the floors provided to those who traded there. Why else would traders pay hundreds of thousands of dollars to buy a membership? Because that price capitalized the rent that the marginal trader obtained by being on the floor, and seeing prices and order flow before anybody off the floor did. That was the price of the time and space advantage of being on the floor.  It’s no different than co-location. Not in the least. It’s just meatware co-lo, rather than hardware co-lo.

In a paper written around 2001 or 2002, “Upstairs, Downstairs”, I presented a model predicting that electronic trading would largely annihilate time and space advantages, and that liquidity would improve as a result because it would reduce the cost to off-floor traders of offering liquidity. The latter implication has certainly been borne out. And although time and space differences still exist, I would argue that they pale in comparison to those that existed in the floor era. Ironically, however, complaints about fairness seem more heated and pronounced now than they did during the heyday of the floors. Perhaps that’s because machines and quant geeks are less sympathetic figures than colorful floor traders. Perhaps it’s because being beaten by a sliver of a second is more infuriating than being pipped by many seconds by some guy screaming and waving on the CBT or NYMEX. Dunno for sure, but I do find the obsessing over HFT time and space advantages today to be somewhat amusing, given the differences that existed in the “good old days” of floor trading.

This is not to say that no one complained about the advantages of floor traders, and how they exploited them. I vividly recall a very famous trader (one of the most famous, actually) telling me that he welcomed electronic trading because he was “tired of being fucked by the floor.” (He had made his reputation, and his first many millions on the floor, by the way.) A few years later he bemoaned how unfair the electronic markets were, because HFT firms could react faster than he could.

It will always be so, regardless of the technology.

All that said, the passing of the floors does deserve a moment of silence–another irony, given their cacophony.

I first saw the NYMEX floor in 1992, when it was still at the World Trade Center, along with the floors of the other NY exchanges (COMEX; Coffee, Sugar & Cocoa; Cotton). That space was the location for the climax of the plot of the iconic futures market movie, Trading Places. Serendipitously, that was the movie that Izabella Kaminska of FT Alphaville featured in the most recent Alphachat movie review episode. I was a guest on the show, and discussed the economic, sociological, and anthropological aspects of the floor, as well as some of the broader social issues lurking behind the film’s comedy. You can listen here.

 


December 16, 2016

Clearinghouse Resilience and Liquidity Black Holes

Filed under: Clearing,Commodities,Derivatives,Economics,Politics,Regulation — The Professor @ 5:11 pm

About six weeks ago I wrote a post on the strains put on clearing by Brexit. This informative post by Clarus’ Tod Skarecky provides some very interesting detail about the mechanics of the LCH’s margining mechanism.

One way to summarize it is to say that the LCH was a liquidity black hole. Not only did it collect intra-day and end-of-day variation margin from losers that was paid out to winners only with a delay, it also collected Market Data Runs, which were effectively intra-day initial margin top-ups. A couple of perverse features stand out. First, a position that initially had a loss that triggered an MDR outflow had to pay out, but if the market turned in its favor intra-day, it didn’t get that money back until the following day. Second, a firm that had a loss that triggered an MDR outflow had to pay out, and if the position incurred a loss on the day, it still had to pay variation margin, and didn’t receive the MDR back until the next day: that is, there was “double dipping.”
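
The mechanics are easier to see in a stylized sketch. The code below uses made-up amounts and a made-up trigger level, not actual LCH parameters; it only encodes the two perverse features just described:

```python
# Stylized one-day cash flow for a firm facing the "double dipping" pattern
# described above. All amounts and the trigger level are hypothetical, not
# actual LCH parameters.

def day_cash_flows(intraday_loss, eod_loss, mdr_trigger):
    """Return the firm's gross cash outflows for the day as (label, amount)."""
    flows = []
    # Intraday: a mark-to-market loss past the trigger forces an immediate
    # IM top-up, which the CCP keeps overnight regardless of how the market
    # moves later in the day.
    if intraday_loss >= mdr_trigger:
        flows.append(("MDR top-up", intraday_loss))
    # End of day: variation margin on the full daily loss is owed as well,
    # with no credit for the MDR already posted until the next day.
    if eod_loss > 0:
        flows.append(("EOD variation margin", eod_loss))
    return flows

# A position loses 10 intraday (trigger is 5) and is still down 10 at the close:
calls = day_cash_flows(intraday_loss=10, eod_loss=10, mdr_trigger=5)
total = sum(amount for _, amount in calls)
print(calls)  # two separate calls land on the same day
print(total)  # 20 of cash out the door against an economic loss of 10
```

The point of the toy is only the ratio: the firm’s same-day liquidity need is roughly double its economic loss until the top-up comes back.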

Tod puts his finger on the logic (crucially, the logic from LCH’s perspective): “Heck if I managed credit risk at a firm, I’d always choose to be paid now rather than later.” Definitely. That minimizes credit risk. But look at how much liquidity was sucked up in order to do this.

Variation margin is bad enough: despite the (laughable) claim of the BIS some years back, the fact that variation margin is recycled does not mean that it does not create liquidity strains. After all, (a) liquidity demand arises due in large part to differences in timing between the receipt of cash and the payment thereof, and the clearing mechanism (in which the CCP pays out VM some hours after it receives VM) creates such timing differences, and (b) even absent payment timing differences, the VM receivers would have to lend to the VM payers, which is problematic especially during stressed market conditions. But the LCH IM top up exacerbates the problem because the cash is stuck in the clearinghouse overnight, and therefore cannot possibly be recirculated. More liquidity becomes less accessible.
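
The timing point reduces to a trivial but telling calculation (the amounts below are hypothetical):

```python
# Stylized variation-margin recycling with a payment lag. Amounts are
# hypothetical; the point is the intraday peak, not the numbers.

# Losers pay the CCP at time T; the CCP pays winners only at T + lag.
vm_collected_at_T = 25_000_000_000
vm_paid_out_later = vm_collected_at_T  # matched book: in equals out

# Over the whole day the CCP's net cash flow is zero...
net_daily_flow = vm_collected_at_T - vm_paid_out_later

# ...but during the lag the full gross amount sits at the CCP, and the
# losers must fund it before any winner has been paid a cent.
peak_liquidity_need = vm_collected_at_T

print(net_daily_flow)       # 0: "recycling" is true only on a daily view
print(peak_liquidity_need)  # 25 billion must nonetheless be found in the gap
```

An IM top-up is worse still: the overnight holding period stretches the lag to a full day, so even this peak figure understates the strain.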

Again, this is understandable from LCH’s microprudential perspective: it reduces the likelihood that it will become insolvent or illiquid. But just because this is sensible from a microprudential perspective does not mean it is macroprudentially sensible. In fact, it is anything but sensible: it greatly adds to liquidity demand, particularly during periods of time when liquidity is likely to be scarce, and when liquidity freezes are a serious risk.

This is a perfect example of the “levee effect” I’ve written about for years: raising the levee around the LCH increases the chances of its survival, but just redirects the stresses to elsewhere in the system.

Note the irony here. Clearing mandates were sold on the idea that there were pervasive externalities in uncleared derivatives markets, due primarily to the potential for default cascades in these markets. But clearing (supersized by mandates, in particular) creates externalities too. Here LCH does things that are in its interest, but which impose costs on others. It has a contractual relationship with some of these (FCMs), so there is some potential that externalities involving these parties can be mitigated through negotiation and changing contracts. But there are myriad parties not in privity of contract with LCH, and which LCH may not even know of, who are impacted, perhaps severely, by a liquidity shock exacerbated by LCH’s self-preserving actions.

In other words, clearing mandates don’t internalize all externalities. They create them too. And given the severe dangers of liquidity crises, the liquidity externality that clearing creates is particularly troubling.

Outgoing CFTC Chairman Timothy Massad says, don’t worry, be happy!:

Brexit’s Impact on Clearing Activity

Let’s first look at the impact on clearing activity. It’s important to remember first that clearinghouses mark all products to market every day, and require that participants with market losses post margin every day, sometimes more than once a day. Margin payments must be paid promptly because for every payment made to the clearinghouse, the clearinghouse must make a payment to another participant who has gains. The clearinghouse always has a balanced or “matched” book.

Even though margins were increased in advance of the vote, the volatility resulted in very large margin calls on June 24.

Clearing members paid $27 billion dollars in variation margin across the five largest clearinghouses registered with the CFTC. This was $22 billion dollars greater than the previous 12-month average—over five times larger. The good news is no one missed a payment, no one defaulted.

Supervisory Stress Tests

The results after Brexit confirmed what we recently found in our own internal testing: resilience in the face of stressful conditions. Last month, CFTC staff released a report detailing the results of a series of stress tests we performed on the five largest clearinghouses under our jurisdiction, which are located in the U.S. and the UK. Our tests assessed the impact of stressful market scenarios across these clearinghouses as well as their clearing members, many of whom are affiliates of the world’s largest banks.

We developed a set of 11 extreme but plausible scenarios based on a number of factors, including historical price changes on dates when there was extreme volatility. By comparison, our assumed price shocks were several times larger than what happened after Brexit. We applied these scenarios to actual positions as of a specific date. And we looked at whether the pre-funded resources held by the clearinghouse—in particular, the initial margin and guaranty fund amounts paid by clearing members as well as the clearinghouse’s capital—were sufficient to cover any losses.

Still not getting it. The discussion of stress tests essentially repeats the same mantra as LCH: it is a decidedly microprudential treatment that focuses on credit risk, not liquidity risk. The discussion of margins is perfunctory, despite the fact that this is what gave market participants serious worries on Brexit Day. No discussion of what extraordinary efforts were required to ensure that all payments were made. No discussion of whether this would have been possible during a bigger–and unanticipated–price shock. No discussion of the liquidity externalities. No discussion of what would happen if operational difficulties (e.g., a technology problem in the payments system like the failure of FedWire on 10/19/87) interfered with the completion of payments. (More payments increase the likelihood that such an operational failure will jeopardize the ability of FCMs to complete them. And a failure to meet a call triggers a default.)

This “What? Me worry?” approach sounds so . . . 2006. And it is exactly this kind of complacency that makes me worry. The nature of the liquidity issue still has not penetrated many regulatory skulls.

This is most likely due to a severe case of target fixation. Clearing mandates were motivated by a desire to reduce credit risk, and all efforts have been focused on that. That is the target that regulators are fixated on, and in the pursuit of that target their field of vision has narrowed, with liquidity risk being largely outside it. It is obviously the target that CCPs are focused on. This is why I take little comfort in the belated efforts to make CCPs more resilient. The recipe for resilience is to demand MOAR LIQUIDITY. Which is also the recipe for a broader market crisis.

Analogous to the dangers of high powered incentives with multi-tasking when some activities can be measured more accurately than others, the mandate to reduce derivatives credit risk has led regulators and market participants–particularly market utilities like CCPs–to devote excessive effort to mitigating credit risk, even though it exacerbates liquidity risk.

I doubt the clearing portions of Title VII of Frankendodd will be eliminated altogether, but the incoming administration should seriously consider a major re-evaluation to determine how to address the serious liquidity issues that clearing mandates create.


October 31, 2016

A Brexit Horror Story That Demonstrates the Dangers of Clearing Mandates

Filed under: Clearing,Derivatives,Economics,Regulation — The Professor @ 12:43 pm

When I give my class on the systemic risks of clearing, I usually joke that I should give the lecture by a campfire, with a flashlight held under my chin. It is therefore appropriate that on this Halloween Risk published Peter Madigan’s take on the effects of Brexit on derivatives clearing: it is a horror story.

Since the clearing mandate was a gleam in Barney Frank’s eye (yes, a scary mental image–so it fits in the theme of the post!) I have warned that the most frightening thing about clearing and clearing mandates is that they transform credit risk into liquidity risk, and that liquidity risk is more systemically threatening than credit risk. This view was born of experience, slightly before Halloween in 1987, when I witnessed the near death experience that the CME clearinghouse, BOTCC, and OCC faced on Black Monday and the following Tuesday. The huge variation margin calls put a tremendous strain on liquidity, and operational issues (notably the shutdown of the FedWire) and the reluctance of banks to extend credit to FCMs and customers needing to meet margin calls came perilously close to causing the CCPs to fail.

The exchange CCPs were pipsqueaks by comparison to what we have today. The clearing mandates have supersized the clearing system, and commensurately increased the amount of liquidity needed to meet margin calls. The experience in the aftermath of the surprise Brexit vote illustrates just how dangerous this is.

As a result of Brexit, US Treasuries rallied by 32bp. The accompanying move in swap yields resulted in huge intra-day margin calls by multiple CCPs (LCH, CME, and Eurex). Madigan estimates that these calls totaled $25-$40 billion, and that some individual banks were asked to pony up multiple billions to meet margin calls from multiple CCPs. And to illustrate another thing I’ve been on about for years, they had to come up with the money in 60 minutes: failure to do so would have resulted in default. This provides a harrowing example of how tightly coupled the system is.

Some other crucial details. Much of the additional margin was to top up initial margin, meaning that the cash was sucked into the CCPs and kept there, rather than paid out to the net gainers, where it could have been recirculated. (Not that recirculating it would have been a panacea. Timing differences between flows of VM into and out of CCPs creates a need for liquidity. Moreover, recirculation by extension of credit is often problematic during periods of market stress, as that’s exactly when those who have liquidity are most likely to hoard it.)

Second, each CCP acted independently and called margin to protect its own interests. With multiple CCPs, there is a non-cooperative game between them. Each has an incentive to demand margin to protect itself, and to demand it before other CCPs do. The equilibrium in this game is inefficient because there is an externality between CCPs, and between CCPs and those who must meet the calls. This is ironic, because one of the alleged justifications for clearing mandates was the externalities present in the OTC derivatives markets. This is another example of how problems have been transformed, rather than truly banished.
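
The game between CCPs has the familiar structure of a prisoner’s dilemma. The payoffs in the sketch below are invented solely to encode the qualitative story; nothing is calibrated to any real CCP:

```python
# Toy 2x2 game between two CCPs deciding whether to call margin early.
# Payoffs are hypothetical: calling first grabs scarce liquidity, and
# simultaneous early calls strain the whole system.

payoffs = {  # (ccp1_action, ccp2_action) -> (ccp1_payoff, ccp2_payoff)
    ("wait",  "wait"):  (3, 3),  # orderly, cooperative drawdown
    ("early", "wait"):  (4, 1),  # first mover is fully protected
    ("wait",  "early"): (1, 4),
    ("early", "early"): (2, 2),  # both call: systemic liquidity squeeze
}

def best_response(other_action, player):
    """Best action for `player` (0 or 1) given the other CCP's action."""
    def payoff(me):
        key = (me, other_action) if player == 0 else (other_action, me)
        return payoffs[key][player]
    return max(["wait", "early"], key=payoff)

# Calling early is a dominant strategy for each CCP...
for other in ("wait", "early"):
    assert best_response(other, 0) == "early"
    assert best_response(other, 1) == "early"

# ...so the equilibrium is (early, early), which both CCPs like less than
# the cooperative outcome (wait, wait). That gap is the externality.
print(payoffs[("early", "early")], "vs", payoffs[("wait", "wait")])
```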

This also illustrates another danger that I’ve pointed out for some time: building the levees high around CCPs just forces the floodwaters somewhere else.

Although there were some fraught moments for the banks who needed to stump up the cash on June 24, there were no defaults. But consider this. As I point out in the Risk article, Brexit was a known event and a known risk, and the banks had planned for it. Events like the October ’87 Crash or the September ’98 LTCM crisis are bolts from the blue. How will the system endure a surprise shock–especially one that could well be far larger than the Brexit move?

Horror stories are sometimes harmless ways to communicate real risks. Perhaps the Brexit event will be educational. Churchill once said that “Nothing in life is so exhilarating as to be shot at without result.” The market dodged a bullet on June 24. Will market participants, and crucially regulators, take heed of the lessons of Brexit and take measures to ensure that the next time it isn’t a head shot?

I have my doubts. The clearing mandate is a reality, and is almost certain to remain one. The fundamental transformation of clearing (from credit risk to liquidity risk) is an inherent part of the mechanism. Its effects can be at most ameliorated, and perhaps the Brexit tremor will provide some guidance on how to do that. But I doubt that whatever is done will make the system able to survive The Big One.


October 12, 2016

A Pitch Perfect Illustration of Blockchain Hype

Filed under: Clearing,Commodities,Derivatives,Economics,Regulation — The Professor @ 7:31 pm

If you’ve been paying the slightest attention to financial markets lately, you’ll know that blockchain is The New Big Thing. Entrepreneurs and incumbent financial behemoths alike are claiming it will transform every aspect of financial markets.

The techno-utopianism makes me extremely skeptical. I will lay out the broader case for my skepticism in a forthcoming post. For now, I will discuss a specific example that illustrates the odd combination of cluelessness and hype that characterizes many blockchain initiatives.

Titled “Blockchain startup aims to replace clearinghouses,” the article breathlessly states:

Founded by two former traders at Societe Generale, SynSwap is a post-trade start-up based on hyperledger technology designed to disintermediate central counterparties (CCPs) from the clearing process, effectively removing their role in key areas.

“For now we are focusing on interest rate swaps and credit default swaps, and will further develop the platform for other asset classes,” says Sophia Grami, co-founder of SynSwap.

Grami explains that once a trade is captured, SynSwap automatically processes the whole post-trade workflow on its blockchain platform. Through smart contracts, it can perform key post-trade functions such as matching and affirmation, generation of the confirmation, netting, collateral management, compression, default management and settlement.

“CCPs have been created to reduce systemic risk and remove counterparty risk through central clearing. While clearing is key to mitigate risks, the blockchain technology allows us to disintermediate CCPs while providing the same risk mitigation techniques,” Grami adds.

“Central clearing is turned into distributed clearing. There is no central counterparty anymore and no entity is in the middle of a trade anymore.”

The potential disruptive force blockchain technology could have for derivatives clearing could bring back banks that have pulled away from the business due to heightened regulatory costs.

I have often noted that CCPs offer a bundle of many services, and it is possible to consider unbundling some of them. But there are certain core functions of CCP clearing that this blockchain proposal does not offer. Most importantly, CCPs mutualize default risk: this is truly one of the core features of a CCP. This proposal does not, meaning that it provides a fundamentally different service than a CCP. Further, CCPs hedge and manage defaulted positions and port customer positions from a defaulted intermediary to a solvent one: this proposal does not. CCPs also manage liquidity risk. For instance, a defaulter’s collateral may not be immediately convertible into cash to pay winning counterparties, but the CCP maintains liquidity reserves and lines that it can use to intermediate liquidity in these circumstances. The proposal does not. The proposal mentions netting, but I seriously doubt that the blockchain–hyperledger, excuse me–can perform multilateral netting like a CCP.
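
To see what multilateral netting accomplishes, and why a purely bilateral “distributed” arrangement cannot replicate it without central coordination, here is a minimal sketch with made-up exposures:

```python
from collections import defaultdict

# Bilateral obligations among three parties: (payer, payee) -> amount owed.
# The exposures are invented for illustration.
obligations = {
    ("A", "B"): 100,
    ("B", "C"): 100,
    ("C", "A"): 90,
}

# Gross (bilateral) settlement moves the full face amount of every obligation.
gross_settlement = sum(obligations.values())

# Multilateral netting: each party pays or receives only its net position
# against the system as a whole, as a CCP achieves through novation.
net = defaultdict(float)
for (payer, payee), amount in obligations.items():
    net[payer] -= amount
    net[payee] += amount

net_settlement = sum(amount for amount in net.values() if amount > 0)

print(gross_settlement)  # 290 changes hands bilaterally
print(net_settlement)    # 10 suffices after multilateral netting
```

The computation itself is easy; the hard part, and the part that requires a central counterparty, is that someone must stand behind the netted positions when a party defaults.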

There are other issues. Who sets the margin levels? Who sets the daily (or intraday) marks which determine variation margin flows and margin calls to top up IM? CCPs do that. Who does it for the hyperledger?

So the proposal does some of the same things as a CCP, but not all of them, and in fact omits the most important bits that make central clearing central clearing. To the extent that these other CCP services add value–or regulation compels market participants to utilize a CCP that offers these services–market participants will choose to use a CCP, rather than this service. It is not a perfect substitute for central clearing, and will not disintermediate central clearing in cases where the services it does not offer and the functions it does not perform are demanded by market participants, or by regulators.

The co-founder says “[c]entral clearing is turned into distributed clearing.” Er, “distributed clearing”–AKA “bilateral OTC market.” What is being proposed here is not something really new: it is an application of a new technology to a very old, and very common, way of transacting. And by its nature, such a distributed, bilateral system cannot perform some functions that inherently require multilateral cooperation and centralization.

This illustrates one of my general gripes about blockchain hype: blockchain evangelists often claim to offer something new and revolutionary but what they actually describe often involves re-inventing the wheel. Maybe this wheel has advantages over existing wheels, but it’s still a wheel.

Furthermore, I would point out that this wheel may have some serious disadvantages as compared to existing wheels, namely, the bilateral OTC market as we know it. In some respects, it introduces one of the most dangerous features of central clearing into the bilateral market. (H/T Izabella Kaminska for pointing this out.) Specifically, as I’ve been going on about for about 8 years now, the rigid variation margining mechanism inherent in central clearing creates a tight coupling that can lead to catastrophic failure. Operational or financial delays that prevent timely payment of variation margin can force the CCP into default, or force it or its members to take extraordinary measures to access liquidity during times when liquidity is tight. Everything in a cleared system has to perform like clockwork, or an entire CCP can fail. Even slight delays in receiving payments during periods of market stress (when large variation margin flows occur) can bring down a CCP.

In contrast, there is more play in traditional bilateral contracting. It is not nearly so tightly coupled. One party not making a margin call at the precise time does not threaten to bring down the entire system. Furthermore, in the bilateral world, the “FU Option” is often quite systemically stabilizing. During the lead up to the crisis, arguments over marks could stretch on for days and sometimes weeks, giving some breathing room to stump up the cash to meet margin calls, and to negotiate down the size of the calls.

The “smart contracts” aspect of the blockchain proposal jettisons that. Everything is written in the code, the code is the last word, and will be self-executing. This will almost certainly create tight coupling: The Market has moved by X; contract says that means party A has to pay Party B Y by 0800 tomorrow or A is in default. (One could imagine writing really, really smart contracts that embed various conditions that mimic the flexibility and play in face-to-face bilateral markets, but color me skeptical–and this conditionality will create other issues, as I’ll discuss in the future post.)
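
As a toy illustration of that tight coupling (this is plain Python standing in for a smart contract, not any actual smart-contract language):

```python
from dataclasses import dataclass

@dataclass
class MarginClause:
    """Toy self-executing margin clause: pay the full call by the deadline
    or be declared in default. No negotiation, no grace period, no judgment."""
    call_amount: float
    deadline_hour: float  # 8.0 means 0800

    def settle(self, paid: float, hour_paid: float) -> str:
        # The code is the last word: a dollar short or a minute late is
        # treated identically to a total failure to pay.
        if paid >= self.call_amount and hour_paid <= self.deadline_hour:
            return "performed"
        return "default"

clause = MarginClause(call_amount=5_000_000, deadline_hour=8.0)
print(clause.settle(paid=5_000_000, hour_paid=7.5))   # performed
print(clause.settle(paid=4_999_999, hour_paid=7.5))   # default: one dollar short
print(clause.settle(paid=5_000_000, hour_paid=8.01))  # default: minutes late
```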

When I think of these “smart contracts” one image that comes to mind is the magic broomsticks in The Sorcerer’s Apprentice. They do EXACTLY what they are commanded to do by the apprentice (coder?): they tote water, and end up toting so much water that a flood ensues. There is no feedback mechanism to get them to stop when the water gets too high. Again, perhaps it is possible to create really, really smart contracts that embed such feedback mechanisms.

But then one has to consider the potential interactions among a dense network of such really, really smart contracts. How do the feedbacks feed back on one another? Simple agent models show that agents operating subject to pre-programmed rules can generate complex, emergent orders when they interact. Sometimes these orders can be quite efficient. Sometimes they can crash and collapse.

In sum, the proposal for “distributed clearing to disintermediate CCPs” illustrates some of the defects of the blockchain movement. It overhypes what it does. It claims to be something new, when really it is a somewhat new way of doing something quite common. It does not necessarily perform these familiar functions better. It does not consider the systemic implications of what it does.

So why is there so much hype? Well, why was Pets.com a thing? More seriously, I think that there is an interesting sociological dynamic here. All the cool kids are talking about blockchain, and nobody wants to admit to not being cool. Further, when a critical mass of supposed thought leaders are doing something, others imitate for fear of being left behind: if you join and it turns out to be flop, well, you don’t stand out–everybody, including the smartest people, screwed up. You’re in good company! But if you don’t join and it becomes a hit, you look like a Luddite idiot and get left behind. So there is a bias towards joining the fad/jumping on the bandwagon.

I think there will be a role for blockchain. But I also believe that it will not be nearly as revolutionary as its most ardent proponents claim. And I am damn certain that it is not going to disintermediate central clearing, both because central clearing does some things “decentralized clearing” doesn’t (duh!), and because regulators like those things and are forcing their use.


October 6, 2016

War Communism Meets Central Clearing

Filed under: Clearing,Derivatives,Economics,Politics,Regulation — The Professor @ 1:58 pm

I believe that I am on firm ground saying that I was one of the first to warn of the systemic risks created by the mandating of central clearing on a vast scale, and that CCPs could become the next Too Big to Fail entities. At ISDA events in 2011, moreover, I stated publicly that it was disturbing that the move to mandates was occurring before plans to recover or resolve insolvent clearinghouses were in place. At one of these events, in London, then-CEO of LCH Michael Davie said that it was important to have plans in place to deal with CCPs in wartime (meaning during crises) as well as in peace.

Well, we are five years on, mandates have long been in effect, and those resolution and recovery regimes are still moving glacially towards implementation. Several outlets report that the European Commission is finalizing legislation on CCP recovery. As Phil Stafford at the FT writes:

The burden of losses could fall on the clearing house or its parent company, its member banks; the banks’ customers, such as pension funds, or the taxpayer.

Brussels is proposing that clearing house members, such as banks, be required to participate in a cash call if the clearing house has exhausted its so-called “waterfall” of default procedures.

The participants would take a share in the clearing house in return, according to drafts seen by the Financial Times.

Authorities would also have the power to reduce the value of payments to the clearing house members, the draft says. In the event of a systemic crisis, regulators could use government money as long as doing so complies with EU rules on state aid.

Powers available to regulators would include tearing up derivatives contracts and applying a “haircut” to the margin or collateral that has been pledged by the clearing house’s end users.

Asset managers have long feared that haircutting margin would be tantamount to expropriating assets that belong to customers.

The draft is circulating in samizdat form, and I have seen a copy. It is rather breathtaking in its assertions of authority. Apropos Michael Davie’s remarks on operating CCPs during wartime, my first thought upon reading Chapters IV and V was “War Communism Comes to Derivatives.” One statement buried in the Executive Summary Sheet, phrased in bland bureaucratic language, is rather stunning in its import: “A recovery and resolution framework for CCPs is likely to involve a public authority taking extraordinary measures in the public interest, possibly overriding normal property rights and allocating losses to specific stakeholders.”

In a nutshell, the proposal says that the resolution authority can do pretty much whatever it damn well pleases, including nullifying normal protections of bankruptcy/insolvency law, transferring assets to whomever it chooses, terminating contracts (not just of those who default, but any contract cleared by a CCP in resolution), bailing in any CCP creditor up to 100 percent, suspending the right to terminate contracts, and haircutting variation margin. The authority also has the power to force CCP members to make additional default fund contributions up to the amount of their original contribution, over and above any additional contribution specified in the CCP member agreement. In brief, the resolution authority has pretty much unlimited discretion to rob Peter to pay Paul, subject to only a few procedural safeguards.

About the only thing that the law doesn’t authorize is initial margin haircutting. Given the audacity of other powers that it confers, this is sort of surprising. It’s also not evident to me that variation margin haircutting is a better alternative. One often overlooked aspect of VM haircuts is that they hit hedgers hardest. Those who are using derivatives to manage risk look to variation margin payments to offset losses on other exposures that they are hedging. VM haircutting deprives them of some of these gains precisely when they are likely to need them most. Put differently, VM haircutting imposes losses on those that are least likely to be able to bear them when it is most costly to bear them. Hedgers are risk averse. One reason they are risk averse is that losses on their underlying exposures could force them into financial distress. Blowing up their hedges could do just that.
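
The arithmetic behind this point is simple enough to sketch (the numbers are illustrative only):

```python
def hedger_net_loss(underlying_loss, hedge_gain, vm_haircut):
    """Net loss to a hedger whose derivative gain arrives as variation
    margin subject to a haircut. Illustrative numbers only."""
    vm_received = hedge_gain * (1.0 - vm_haircut)
    return underlying_loss - vm_received

# A fully hedged firm: 100 loss on the physical exposure, 100 gain on the swap.
print(hedger_net_loss(100.0, 100.0, vm_haircut=0.0))   # 0.0: the hedge works
print(hedger_net_loss(100.0, 100.0, vm_haircut=0.25))  # 25.0: the haircut
                                                       # recreates the very
                                                       # loss being hedged
```

And that 25 lands precisely in the state of the world in which the hedger is least able to absorb it.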

Perhaps one could argue that CCPs are so systemically important and the implications of their insolvency are so ominous that extraordinary measures are necessary–in its Executive Summary, and in the proposal itself, the EC does just that. But this just calls into question the prudence of creating and supersizing entities with such latent destructive potential.

There is also a fundamental tension here. The potential that the resolution authority will impose large costs on members of CCPs, and even their customers, raises the burden of being a member, or of trading cleared products. This is a disincentive to membership, and with the economics of supplying clearing services already looking rather grim, may lead to further exits from the business. Similarly, bail-ins of creditors and the potential seizure of ownership interests without due process will make it more difficult for CCPs to obtain funding. Thus, mandating the expansion of clearing makes exceptional resolution measures necessary, and those measures in turn reduce the supply of clearing services, and of the credit, liquidity, and capital that CCPs need to function.

It must also be recognized that with discretionary power come inefficient selective intervention and influence costs. The resolution body will have extraordinary power to transfer vast sums from some agents to others. This makes it inevitable that the body will be subjected to intense rent seeking, meaning that its decisions will be driven as much by political factors as by efficiency considerations, and perhaps more so: this is particularly true in Europe, where multiple states will push the interests of their own firms and citizens. Rent seeking is costly. Furthermore, it will inevitably inject a degree of arbitrariness into the outcome of resolution. This arbitrariness creates additional uncertainty and risk, precisely at a time when these are already at heightened, and likely extreme, levels. It is also likely to create dangerous feedback loops. The prospect of dealing with an arbitrary resolution mechanism will affect the behavior of participants in the clearing process even before a CCP fails, and one result could be to accelerate a crisis, as market participants look to cut their exposure to a teetering CCP, and do so in ways that push it over the edge.

To put it simply, if the option to resort to War Communism is necessary to deal with the fallout from a CCP failure in a post-mandate world, maybe you shouldn’t start the war in the first place.


September 16, 2016

De Minimis Logic

CFTC Chair Timothy Massad has come out in support of a one year delay of the lowering of the de minimis swap dealer exemption notional amount from $8 billion to $3 billion. I recall Coase (or maybe it was Stigler) writing somewhere that an economist could pay for his lifetime compensation by delaying implementation of an inefficient law by even a day. By that reckoning, by delaying the step down of the threshold for a year, Mr. Massad has paid for the lifetime compensation of his progeny for generations to come, for the de minimis threshold is a classic example of an inefficient law. Mr. Massad (and his successors) could create huge amounts of wealth by delaying its implementation until the day after forever.

There are at least two major flaws with the threshold. The first is that there is a large fixed cost to become a swap dealer. Small to medium-sized swap traders who avoid the obligation of becoming swap dealers under the $8 billion threshold will not avoid it under the lower threshold. Rather than incur the fixed cost, many of those who would be caught with the lower threshold will decide to exit the business. This will reduce competition and increase concentration in the swap market. This is perversely ironic, given that one ostensible purpose of Frankendodd (which was trumpeted repeatedly by its backers) was to increase competition and reduce concentration.
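The exit logic is a textbook fixed-cost participation decision. The figures below are entirely hypothetical (nothing in the rule pins down compliance costs this precisely); they just make the mechanism concrete: a mid-sized trader newly caught by the lower threshold compares dealing profits to the fixed cost of registering, and exits if the latter dominates.

```python
# Hypothetical numbers: the participation decision of a swap trader once the
# de minimis threshold drops and registration as a swap dealer becomes mandatory.
# Registration imposes a roughly fixed annual compliance cost, so it bites
# hardest on small and mid-sized dealers -- exactly the marginal competitors.

def stays_in_business(annual_dealing_profit, annual_compliance_cost):
    """Stay in the dealing business only if profits cover the fixed cost."""
    return annual_dealing_profit - annual_compliance_cost > 0.0

print(stays_in_business(15e6, 5e6))  # large dealer: the fixed cost is easily absorbed
print(stays_in_business(3e6, 5e6))   # mid-sized trader: exits rather than register
```

The survivors are the large incumbents, which is how a rule sold as pro-competition ends up increasing concentration.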

The second major flaw is that the rationale for the swap dealer designation, and the associated obligations, is to reduce risk. Big swap dealers mean big risk, and to reduce that risk, they are obligated to clear, to margin non-cleared swaps, and hold more capital. But notional amount is a truly awful measure of risk. $X billion of vanilla interest rate swaps differ in risk from $X billion of CDS index swaps which differ in risk from $X billion of single name CDS which differ in risk from $X billion of oil swaps. Hell, $X billion of 10 year interest rate swaps differ in risk from $X billion of 2 year interest rate swaps. And let’s not even talk about the variation across diversified portfolios of swaps with the same notional values. So notional does not match up with risk in a discriminating way.  Further, turnover doesn’t measure risk very well either.
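The tenor point can be made with a back-of-the-envelope calculation. This is my own stylized sketch, not anything from the rule: assume a flat 3% curve and approximate a par swap’s rate sensitivity (DV01) by the PV01 of its fixed-leg annuity, which is crude but more than enough to show that equal notionals carry very unequal risk.

```python
# Stylized sketch (assumed flat 3% curve, annual payments): same notional,
# very different rate risk. DV01 is approximated by the fixed-leg annuity's
# sensitivity to a 1 basis point move -- a rough proxy, fine for illustration.

def swap_dv01(notional, years, flat_rate=0.03):
    """Approximate dollar impact of a 1bp rate move on a par swap."""
    annuity = sum(1.0 / (1.0 + flat_rate) ** t for t in range(1, years + 1))
    return notional * annuity * 1e-4  # 1bp = 0.0001

ten_y = swap_dv01(1e9, 10)  # $1bn notional, 10 year swap
two_y = swap_dv01(1e9, 2)   # $1bn notional, 2 year swap
print(ten_y / two_y)        # roughly 4.5x the rate risk for the identical notional
```

A threshold written in notional terms treats those two positions as equally risky, which is precisely the sense in which notional is a lamppost rather than a risk measure.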

But hey! We can measure notional! So notional it is! Yet another example of the regulatory drunk looking for his keys under the lamppost because that’s where the light is.

So bully for Chairman Massad. He has delayed implementation of a regulation that will do the opposite of some of the things it is intended to do, and merely fails to do other things it is supposed to do. Other than that, it’s great!

