Streetwise Professor

May 30, 2017

Clearing Fragmentation Follies: We’re From the European Commission, and We’re Here to Help You

Filed under: Clearing,Derivatives,Economics,Financial Crisis II,Politics,Regulation — The Professor @ 6:33 am

Earlier this month came news that the European Commission was preparing legislation that would require clearing of Euro derivatives to take place in the Eurozone, rather than in the UK, which presently dominates. This has been an obsession with the Euros since before Brexit: Brexit has only intensified the efforts, and provided a convenient rationalization for doing so.

The stated rationale is that the EU (and the ECB) need regulatory control over clearing of Euro-denominated derivatives because a problem at the CCP that clears them could have destabilizing effects on the Eurozone, and could necessitate the ECB providing liquidity support to the CCP in the event of trouble. If they are going to support it in extremis, they are going to need to have oversight, they claim.

Several things to note here. First, it is possible to have a regulatory line of sight without having jurisdiction. Note that the USD clearing business at LCH is substantially larger than the € clearing business there, yet the Fed, the Treasury, and Congress are fine with that, and are not insisting that all USD clearing be done stateside. They realize that there are other considerations (which I discuss more below): to simplify, they realize that London has become a dominant clearing center for good economic reasons, and that the economies of scale and scope in clearing mean that concentration of clearing produces some efficiencies. Further, they realize that it is possible to have sufficient information to ensure that the foreign-domiciled CCP is acting prudently and not taking undue risks.

Canada is another example. A few years ago I wrote a white paper (under the aegis of the Canadian Market Infrastructure Committee) that argued that it would be efficient for Canada to permit clearing of C$ derivatives in London, rather than to require the establishment and use of a Canadian CCP. The Bank of Canada and the Canadian government agreed, and did not mandate the creation of a maple leaf CCP.

Second, if the Europeans think that by moving € clearing away from LCH that they will be immune from any problems there, they are sadly mistaken. The clearing firms that dominate in LCH will also be dominant in any Europe-domiciled € CCP, and a problem at LCH will be shared with the Euro CCP, either because the problem arises because of a problem at a firm that is a clearing member of both, or because an issue at LCH not originally arising from a CM problem will adversely affect all its CMs, and hence be communicated to other CCPs.  Consider, for example, the self-preserving way that LCH acted in the immediate aftermath of Brexit: this put liquidity demands on all its clearing members. With fragmented clearing, these strains would have been communicated to a Eurozone CCP.

When risks are independent, diversification and redundancy tend to reduce the risk of catastrophic failure: when risks are not independent, they can either fail to reduce the risk substantially, or actually increase it. For instance, if the failure of CCP 1 is likely to cause the failure of CCP 2, having two CCPs actually increases the probability of a catastrophe (holding fixed the probability that any individual CCP fails). CCP risks are not independent, but highly dependent. This means that fragmentation could well increase the probability of a clearing crisis, and is unlikely to reduce it.
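
A stylized numerical illustration makes the point. The probabilities here are assumed purely for exposition:

```latex
% Assumed numbers, for illustration only. Each CCP fails with marginal
% probability p = 0.01, and a failure at either CCP is catastrophic.
\[
P(\text{catastrophe} \mid \text{one CCP}) = p = 0.01
\]
\[
P(\text{catastrophe} \mid \text{two CCPs}) = P(F_1 \cup F_2) = 2p - P(F_1 \cap F_2)
\]
% Independent risks: P(F_1 \cap F_2) = p^2, so the probability of a
% catastrophe nearly doubles, to 0.0199.
% Dependent risks, say P(F_2 | F_1) = 0.9: P(F_1 \cap F_2) = 0.009, so
% the probability is 0.011 -- still strictly greater than with one CCP,
% and the "redundancy" buys almost nothing.
```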

This raises another issue: dealing with a crisis will be more complicated, the more fragmented is clearing. Two self-preserving CCPs have an incentive to take actions that may well hurt the other. Relatedly, managing the positions of a defaulted CM will be more complicated, because this requires coordination across self-interested CCPs. Due to the breaking of netting sets, liquidity strains are likely to be greater in a crisis with multiple CCPs (and here is where the self-preservation instincts of the two CCPs are likely to present the biggest problems).

Thus, (a) it is quite likely that fragmentation of clearing does not reduce, and may increase, the probability of a systemic shock involving CCPs, and (b) conditional on some systemic event, fragmented CCPs will respond less effectively than a single one.

The foregoing relates to how CCP fragmentation will affect markets during a systemic event. Fragmentation also affects the day-to-day economics of clearing. The breaking of netting sets resulting from the splitting off of € clearing will increase collateral requirements. Perverse regulations, such as Basel III’s insistence on treating customer collateral as a CM asset against which capital must be held per the leverage requirement, will cause this collateral increase to raise substantially the cost of providing clearing services.
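
A back-of-the-envelope sketch shows the netting mechanics. The positions and the margin model (margin proportional to net exposure) are invented for illustration; real CCP margin methodologies are far richer:

```python
# Illustration only: invented positions, margin crudely modeled as
# proportional to the absolute net exposure of each netting set.

MARGIN_RATE = 0.05  # assumed margin per unit of net exposure

def margin(netting_set):
    """Margin charged on the net exposure of one netting set."""
    return MARGIN_RATE * abs(sum(netting_set.values()))

# A member's book, cleared at a single CCP: offsetting exposures net down.
book = {"EUR_swaps": +100.0, "USD_swaps": -80.0}
unified = margin(book)  # 0.05 * |100 - 80| = 1.0

# EUR clearing forced into a separate CCP: two netting sets, no offsets.
fragmented = margin({"EUR_swaps": +100.0}) + margin({"USD_swaps": -80.0})
# 0.05 * 100 + 0.05 * 80 = 9.0

print(unified, fragmented)  # 1.0 vs. 9.0: collateral rises ninefold
```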

Fragmentation will also result in costly duplication of activities, both across CCPs, and across CMs. For instance, it will entail duplicative oversight of CMs that clear both at LCH and the Eurozone CCP, and CMs that are members of both will have to staff separate interfaces with each. There will also be duplicative investments in IT (and the greater the number of potential IT points of failure, the greater the likelihood of at least one failure, which is almost certain to have deleterious consequences for CMs, and for the other CCP). Fragmentation will also interfere with information flows, and make it likely that each CCP has less information than an integrated CCP would have.
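
The point about points of failure is just standard reliability arithmetic (assuming independent failures, for simplicity):

```latex
% With n duplicated systems, each failing in a period with probability q:
\[
P(\text{at least one failure}) = 1 - (1 - q)^n ,
\]
% which is strictly increasing in n. Duplicating IT across two CCPs
% mechanically raises the chance that some critical system fails.
```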

This article raises another real concern: a Eurozone clearer is more likely to be subject to political pressure than LCH. It notes that the Continentals were upset about LCH raising haircuts on Eurozone sovereigns during the PIIGS crisis. In some future crisis (and there is likely to be one) the political pressure to avoid such moves will be intense, even in the face of a real deterioration of the creditworthiness of one or more EU states. Further to a point made above, political pressures in the EU and the UK could exacerbate the self-preserving actions that could lead to a failure to achieve efficient cooperation in a crisis, and indeed, could lead to a catastrophic coordination failure.

In sum, it’s hard to find an upside to the forced repatriation of € clearing from LCH to some Eurozone entity. Both in wartime (i.e., a crisis) and in peacetime, there are strong economies of scale and scope in clearing. A forced breakup will sacrifice these economies. Indeed, since breaking up CCPs is unlikely to reduce the probability of a clearing-related crisis, but will make the crisis worse when it does occur, it is particularly perverse to dress this up as a way of protecting the stability of the financial system.

I also consider it sickly ironic that the Euros say, well, if we are expected to provide a liquidity backstop to a big financial entity, we need to have regulatory control. Um, just who was supplying all that dollar liquidity via swap lines to desperate European banks during the 2008-2009 crisis? Without the Fed, European banks would have failed to obtain the dollar funding they needed to survive. By the logic of the EC in demanding control of € clearing, the Fed should require that the US have regulatory authority over all banks borrowing and lending USD.

Can you imagine the squealing in Brussels and every European capital in response to any such demand?

Speaking of European capitals, there is another irony. One thing that may derail the EC’s clearing grab is a disagreement over who should have primary regulatory responsibility over a Eurozone CCP. The ECB and ESMA think the job should be theirs: Germany, France, and Italy say nope, this should be the job of national central banks (e.g., the Bundesbank) or national financial regulators (e.g., BaFin).

So, hilariously, what may prevent (or at least delay) the fragmentation of clearing is a lack of political unity in the EU.  This is as good an illustration as any of the fundamental tensions within the EU. Everybody wants a superstate. As long as they are in control.

Ronald Reagan famously said that the nine scariest words in the English language are: “I’m from the government and I’m here to help.” I can top that: “I’m from the EC, and I’m here to help.” When it comes to demanding control of clearing, the EC’s “help” will be about as welcome as a hole in the head.

 


April 15, 2017

Is the Order Handling Rule Necessary to Ensure Intense Competition in Securities Markets?

Filed under: Commodities,Derivatives,Economics,Exchanges,Regulation — The Professor @ 2:01 pm

A couple of weeks back Acting SEC Chairman Mike Piwowar announced a new Special Study of the Securities Markets, a reprise of the 1963 Special Study. This is an excellent idea, given that RegNMS (adopted in 2005) has (as was inevitable) spawned many unintended and unexpected consequences. Revision of this regulation in light of experience is almost certainly warranted, and any such revision should be predicated on sound scholarship, lest it be merely a Trojan Horse for vested interests arguing their books.

I wrote about RegNMS in Regulation at the time of its adoption in a piece titled “The Thirty Years War” (an allusion to the fact that the establishment of the National Market System in 1975 had sparked a continuing clash over securities market structure). Overall, I think that piece stands up well, particularly my concluding paragraph:

Therefore, the proposed rules are not the final battle in a Thirty Years War. I fully expect that in 2075, some professor will write an article about the latest clash in an ongoing Hundred Years War over securities market structure regulation.

It is certainly the case that the controversies and conflicts over market structure have continued unabated since 2005, and show no signs of letting up. (Cf. Flash Boys.) Chairman Piwowar’s call for a new Special Study is testament to that.

More specifically, the major prediction of my article has been fully borne out. I predicted that the Order Protection Rule in particular would break the network effect that resulted in the dominance of the NYSE in the securities it listed. Since RegNMS was passed, the highly concentrated listed stock market (where virtually all price discovering transactions in NYSE stocks occurred on the NYSE) has been utterly transformed, with four exchanges now splitting most of the business, and no exchange doing more than a quarter of the volume.

I further predicted that this would result in the disintermediation of traditional intermediaries–like specialists–and the substantial erosion of economic rents. This too has happened. This is best illustrated by the trajectory of Goldman’s investment in specialist firm Spear, Leeds & Kellogg. Goldman paid $5.4 billion for it in 2000 (before RegNMS) and sold it for a pittance–$30 million–in 2014. I didn’t foresee exactly the nature or identity of the new intermediaries–HFT–but I was broadly aware that there would be entry into market making, and that this would reduce trading costs and undermine incumbents with market power. Further, as I’ve written about recently, the new intermediaries don’t appear to be making rents in the new equilibrium.

The years since RegNMS have seen a dramatic decline in trading costs for investors, and it is likely the case that this decline is largely attributable to the increase in competition. Much of the controversy that has raged since 2005 relates to disputes over trading practices that were an inevitable consequence of the breaking of the NYSE near-monopoly–a process pejoratively referred to as “fragmentation.” In particular, multiple markets necessitate arbitrageurs, who effectively enforce the law of one price. The strategies and tactics arbitrageurs use often appear unsavory, and strike many as unfair: arbitrageurs get something even though they appear to do nothing substantive. Moreover, arbitrage uses up real resources. That’s costly, and it would be nice if this could be avoided, but that’s unlikely ever to be so. The trade-off between much greater competition (and reduced welfare losses due to the exercise of market power) and the expenditure of real resources to enforce the law of one price seems to be a great bargain.

Much of the criticism of RegNMS relates to the Order Protection Rule, which requires that no order can be executed on market X if a better price is displayed at market Y. The critics (e.g., the Principal Traders Association, which ironically represents some of the biggest beneficiaries of RegNMS) argue that this rule (a) has led to a proliferation of order types intended to ensure compliance with the rule, which make the market far more complex, and (b) requires traders to maintain connections with and monitor all trading venues displaying quotes, no matter how small.

These complaints have some merit. The crucial question is whether the equity trading marketplace will be as competitive without the Order Handling Rule as it is with it. This is an open question, and one which should be the focus of the SEC’s inquiry. For if the Order Handling Rule is a necessary condition for robust competition, the costs that the PTA and others identify are likely well worth paying in order to realize the benefits of competition.

My prediction that competition would intensify post-RegNMS was based on my analysis of the effects of the Order Handling Rule, which was in turn based on my work on liquidity network effects done in the late-90s and early-00s. Specifically, in the formal models I derived (e.g., here), the self-reinforcing liquidity effect obtains when investors decide which trading venue to submit an order to on the basis of expected execution cost (i.e., bid-ask spread, price impact). The market with the bigger fraction of trading activity typically offers the lowest execution cost. Therefore, traders submit their orders to the bigger market. This creates a self-reinforcing feedback loop (and a self-fulfilling prophecy) in which trading activity “tips” to a single exchange. (There are some complexities here, relating to cream skimming of uninformed order flow. See the linked paper for a discussion of that issue.)
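
A minimal simulation sketch of that feedback loop illustrates tipping. The cost function and parameters are invented, not taken from the formal models:

```python
import random

# Toy model of the liquidity network effect: two venues, and each
# arriving trader routes to the venue with the lower *expected*
# execution cost, proxied by recent market share. Parameters invented.

share = [0.5, 0.5]   # market shares of venues A and B
NOISE = 0.05         # idiosyncratic routing noise

def expected_cost(s: float) -> float:
    """Assumption: execution cost falls as a venue's share of flow rises."""
    return 1.0 / (0.1 + s)

random.seed(1)
for _ in range(10_000):
    costs = [expected_cost(s) + random.gauss(0.0, NOISE) for s in share]
    winner = costs.index(min(costs))
    # Shares evolve as an exponentially weighted average of routing choices.
    for v in (0, 1):
        share[v] = 0.999 * share[v] + 0.001 * (1.0 if v == winner else 0.0)

print(share)  # one venue's share drifts toward 1.0: the market "tips"
```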

Mandating something akin to the Order Handling Rule forces order flow to the market offering the best price at a particular moment, not the one that offers the best price in expectation. As I phrased it in my Regulation paper, such a rule “socializes order flow”: even if an order is directed to a particular exchange, that exchange does not control that order flow, and must route it to any other exchange offering a better price.

I think that both theory and the post-RegNMS experience show that the Order Handling Rule is sufficient to break the liquidity network effect because it socializes order flow. But is it necessary? Maybe not, but it is important to try to find out before jettisoning it.

Here’s a story which suggests that the rule is not necessary in the modern electronic trading environment. One reason why traders may choose to submit orders to where they expect to get the best execution is search costs. In a floor-based environment in particular, it is costly to verify which market is offering the best price at any time. Moreover, since it takes time to get quotes from two floor-based markets, by the time that you actually submit your order to the one giving the best quote, the market will have moved and you won’t get the price you thought you were going to get. So traders economize on search costs and the risks associated with delay by submitting their orders to the market that usually offers the best price. Ironically, the inevitable result of this process is that there is only one market left standing.

Search is cheaper and faster–and arguably far cheaper and far faster–in the modern electronic environment. Based on feeds from multiple markets, an electronic trader (and in particular an automated trader) can rapidly compare quotes and send an order to the market offering the best quote, or by viewing depth (something pretty much impossible in the floor days, where much of the liquidity was in the hands of floor brokers) split an order among multiple venues to tap the liquidity in all of them.

In other words, the natural monopoly problem was far more likely in a floor-based environment where pre-trade transparency was so limited that search costs were very high: it was nigh on impossible to know precisely what the trading opportunities were or to move fast enough to exploit the one that appeared best at any point in time, so traders submitted their orders to where they expected the opportunities to be the best. In contrast, electronification and automation have created such great pre-trade transparency and the ability to act on it that it is plausibly true that in this environment traders can and will submit their orders to whatever venue is offering the best trading opportunity at a point in time, regardless of whether it usually does so. In this story, technology eliminates the uncertainty and guesswork that created the liquidity network effect.

Maybe. Perhaps even likely. But I can’t be certain. Note that one complaint about the existing market structure is that even though everything has vastly speeded up, some traders are still faster than others. As a result, those who submit a market order in response to seeing a particular displayed price are often dismayed to learn that the market has moved before their order actually reaches the trading venue, and that their order is executed at a worse price than they had anticipated. Freed of the obligations of the Order Handling Rule, these traders may choose to submit their order to where they usually get the best price: if enough do this, the liquidity network effect will reemerge.

Further, the PTA and others have complained that it is costly to monitor and maintain connections with all trading venues as is necessary under the Order Handling Rule. If the Rule is relaxed or eliminated, one would expect that they will disconnect from some venues. If enough do this, the smaller venues will become unviable. After this happens, there will be fewer venues–and some traders may choose to disconnect from the smallest remaining one. This dynamic could result in another feedback loop that results in the survival of a single dominant exchange that exercises market power.

It is therefore not clear to me that elimination of the Order Handling Rule will result in traders having their cake (intense inter-exchange competition) and eating it too (less complexity, lower connection cost). Given the substantial benefits of greater competition that have been realized in the past dozen years, changes to the cornerstone of RegNMS should not be taken lightly. The Special Study, and the SEC, should pay close attention to how competition will evolve if the Order Handling Rule is eliminated. This analysis should take into account the existing technology, but also try to think of how technology will change in the aftermath of an elimination and how this technological change will affect competition.

Most importantly, any analysis must be predicated on an understanding that there are strong centripetal forces in securities trading. Any time traders have an incentive to direct order flow to the venue that is expected to offer the best price, the likely outcome is that only one venue will survive. The incentives of traders in a high speed, largely automated, and electronic market in the absence of an Order Handling Rule need to be considered carefully. It should not be assumed that technology alone will eliminate the incentive to direct orders to the market that is usually best, not the one that is best at any particular instant. This hypothesis should be probed vigorously and skeptically.

Experience in futures markets suggests that liquidity network effects can persist even in high speed, automated, electronic markets: futures contracts in a particular instrument exhibit a strong natural monopoly tendency, and strong tendencies towards tipping. It is arguable that the vertical integration of clearing, and the resulting non-fungibility of otherwise identical contracts traded on different venues, could contribute to this (though I am skeptical about that). But it could also mean that something like the Order Handling Rule (which is not present in futures markets) is necessary to create strong competition between multiple venues even in a highly computerized and automated trading environment.

This is the big issue in any revamping of RegNMS. It should be front and center of any analysis, including in the impending Special Study. The intense competition in the post-RegNMS world is a remarkable achievement, particularly in comparison with the near monopolistic market structure that existed before 2005. It would be a great shame if this were thrown away due to an incomplete analysis of what competition in a modern computerized market would be like in the absence of something like the Order Handling Rule.


April 14, 2017

SWP Climbs The Hill

Filed under: Derivatives,Economics,Exchanges,Regulation — The Professor @ 10:40 am

I have become a regular contributor to The Hill. My inaugural column on the regulation of spoofing is here. The argument in a nutshell is that: (a) spoofing involves large numbers of cancellations, but so do legitimate market making strategies, so there is a risk that aggressive policing of spoofing will wrongly penalize market makers, thereby raising the costs of supplying liquidity; (b) the price impacts of spoofing are very, very small, and transitory; (c) enforcement authorities sometimes fail to pursue manipulations that have far larger price impacts; therefore (d) a focus on spoofing is a misdirection of scarce enforcement resources.

My contributions will focus on finance and regulatory issues. So those looking for my trenchant political commentary will have to keep coming here 😉

Click early! Click often!


April 4, 2017

The Unintended Consequences of Blockchain Are Not Unpredictable: Respond Now Rather Than Repent Later*

Filed under: Clearing,Commodities,Derivatives,Economics,Regulation — The Professor @ 3:39 pm

In the past week the WSJ and the FT have run articles about a new bank-led initiative to move commodity trading onto a blockchain. In many ways, this makes great sense. By its nature, the process of recording and trading commodity trades and shipments (a) collectively involves large numbers of spatially dispersed counterparties, (b) entails myriad terms, and (c) can give rise to costly disputes. As a result of these factors, the process is currently very labor intensive, fraught with operational risk (e.g., inadvertent errors) and vulnerable to fraud (cf. the Qingdao metals warehouse scandal of 2014). In theory, blockchain has the ability to reduce costs, errors, and fraud. Thus, it is understandable that traders and banks are quite keen on the potential of blockchain to reduce costs and perhaps even revolutionize the trading business.

But before you get too excited, a remark by my friend Christophe Salmon at Trafigura is laden with deep implications that should give you pause, and lead you to consider the likely consequences of widespread adoption of blockchain:

Christophe Salmon, Trafigura’s chief financial officer, said there would need to be widespread adoption by major oil traders and refiners to make blockchain in commodity trading viable in the long term.

This seemingly commonsense and innocuous remark is actually laden with implications of unintended consequences that should be recognized and considered now, before the blockchain train gets too far down the track.

In essence, Christophe’s remark means that to be viable blockchain has to scale. If it doesn’t scale, it won’t reduce cost. But if it does scale, a blockchain for a particular application is likely to be a natural monopoly, or at most a natural duopoly. (Issues of scope economies are also potentially relevant, but I’ll defer discussion of that for now.)

Indeed, if there are no technical impediments to scaling (which in itself is an open question–note the block size debate in Bitcoin), the “widespread adoption” feature that Christophe identifies as essential means that network effects create scale economies that are likely to result in the dominance of a single platform. Traders will want to record their business on the blockchain that their counterparties use. Since many trade with many, this creates a centripetal force that will tend to draw everyone to a single blockchain.

I can hear you say: “Well, if there is a public blockchain, that happens automatically because everyone has access to it.” But the nature of public blockchain means that it faces extreme obstacles that make it wildly impractical for commercial adoption on the scale being considered not just in commodity markets, but in virtually every aspect of the financial markets. Commercial blockchains will be centrally governed, limited access, private systems rather than a radically decentralized, open access, commons.

The “forking problem” alone is a difficulty. As demonstrated by Bitcoin in 2013 and Ethereum in 2016, public blockchains based on open source are vulnerable to “forking,” whereby uncoordinated changes in the software (inevitable in an open source system that lacks central governance and coordination) result in the simultaneous existence of multiple, parallel blockchains. Such forking would destroy the network economy/scale effects that make the idea of a single database attractive to commercial participants.

Prevention of forking requires central governance to coordinate changes in the code–something that offends the anarcho-libertarian spirits who view blockchain as a totally decentralized mechanism.

Other aspects of the pure version of an open, public blockchain make it inappropriate for most financial and commercial applications. For instance, public blockchain is touted because it does not require trust in the reputation of large entities such as clearing networks or exchanges. But the ability to operate without trust does not come for free.

Trust and reputation are indeed costly: as Becker and Stigler first noted decades ago, and others have formalized since, reputation is a bonding mechanism that requires the trusted entity to incur sunk costs that would be lost if it violates trust. (Alternatively, the trusted entity has to have market power–which is costly–that generates a stream of rents that is lost when trust is violated. That is, to secure trust prices have to be higher and output lower than would be necessary in a zero transactions cost world.)
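
In the simplest textbook version of that argument (my notation, a sketch rather than Becker and Stigler’s own formulation), honesty must be bribed with rents:

```latex
% A trusted entity earns a per-period rent R if honest, and can grab a
% one-shot gain G by violating trust (forfeiting all future rents).
% With discount rate r, trust is incentive-compatible only if
\[
\frac{R}{r} \;\geq\; G .
\]
% Rents (higher prices, lower output) or forfeitable sunk investments
% are thus the price of trust: it is never free.
```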

But public blockchains have not been able to eliminate trust without cost. In Bitcoin, trust is replaced with “proof of work.” Well, work means cost. The blockchain mining industry consumes vast amounts of electricity and computing power in order to prove work. It is highly likely that the cost of creating trusted entities is lower than the cost of proof of work or alternative ways of eliminating the need for trust. Thus, a (natural monopoly) commercial blockchain is likely to have to be a trusted centralized institution, rather than a decentralized anarchist’s wet-dream.
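
To see why “work means cost,” here is a minimal hash-based proof-of-work sketch in the spirit of Bitcoin’s (heavily simplified: real block headers and difficulty retargeting are more involved):

```python
import hashlib

def proof_of_work(block_data: str, difficulty_bits: int) -> int:
    """Find a nonce such that SHA-256(block_data:nonce) falls below a
    target with `difficulty_bits` leading zero bits. Pure brute force:
    expected cost is about 2**difficulty_bits hash evaluations."""
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# Each added bit of difficulty doubles the expected number of hashes --
# i.e., the electricity and computing burned to "prove" the work.
print(proof_of_work("demo block", 16))  # ~65,536 attempts on average
```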

Blockchain is also touted as permitting “smart contracts,” which automatically execute certain actions when certain pre-defined (and coded) contingencies are met. But “smart contracts” is not a synonym for “complete contracts,” i.e., contracts where every possible contingency is anticipated, and each party’s actions under each contingency are specified. Thus, even with smart (but incomplete) contracts, there will inevitably arise unanticipated contingencies.

Parties will have to negotiate what to do under these contingencies. Given that this will usually be a bilateral bargaining situation under asymmetric information, the bargaining will be costly and sometimes negotiations will break down. Moreover, under some contingencies the smart contracts will automatically execute actions that the parties do not expect and would like to change: here, self-execution prevents such contractual revisions, or at least makes them very difficult.

Indeed, it may be the execution of the contractual feature that first makes the parties aware that something has gone horribly wrong. Here another touted feature of pure blockchain–immutability–can become a problem. The revelation of information ex post may lead market participants to desire to change the terms of their contract. Can’t do that if the contracts are immutable.
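
A toy sketch of the problem (invented code, not any real smart-contract platform): a contract that self-executes on the one contingency its drafters coded, and that by design offers no path to renegotiation:

```python
# Toy illustration -- not code for any real smart-contract platform.
# The contract self-executes on the one contingency its drafters coded,
# and deliberately offers no amendment path (immutability).

class ToySmartContract:
    def __init__(self, margin_trigger: float, margin_call: float):
        self.margin_trigger = margin_trigger  # price level that fires a call
        self.margin_call = margin_call        # cash demanded when it fires
        self.calls_made = []

    def on_price(self, price: float):
        # The only contingency anticipated and coded ex ante.
        if price < self.margin_trigger:
            self.calls_made.append(self.margin_call)  # executes automatically

    def amend(self):
        # Immutability: no mechanism to renegotiate terms, even when both
        # parties discover ex post that they want to.
        raise RuntimeError("contract terms are immutable")

c = ToySmartContract(margin_trigger=50.0, margin_call=10.0)
for price in [55.0, 48.0, 47.0]:  # a price path the parties did not expect
    c.on_price(price)
print(sum(c.calls_made))  # 20.0 in automatic margin calls, like it or not

try:
    c.amend()
except RuntimeError as err:
    print(err)  # the parties cannot re-do the deal
```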

Paper and ink contracts are inherently incomplete too, and this is why there are centralized mechanisms to address incompleteness. These include courts, but also, historically, bodies like stock or commodity exchanges, or merchants’ associations (in diamonds, for instance) have helped adjudicate disputes and to re-do deals that turn out to be inefficient ex post. The existence of institutions to facilitate the efficient adaption of parties to contractual incompleteness demonstrates that in the real world, man does not live (or transact) by contract alone.

Thus, the benefits of a mechanism for adjudicating and responding to contractual incompleteness create another reason for a centralized authority for blockchain, even–or especially–blockchains with smart contracts.

Further, the blockchain (especially with smart contracts) will be a complex interconnected system, in the technical sense of the term. There will be myriad possible interactions between individual transactions recorded on the system, and these interactions can lead to highly undesirable, and entirely unpredictable, outcomes. A centralized authority can greatly facilitate the response to such crises. (Indeed, years ago I posited this as one of the reasons for integration of exchanges and clearinghouses.)

And the connections are not only within a particular blockchain. There will be connections between blockchains, and between a blockchain and other parts of the financial system. Consider for example smart contracts that in a particular contingency dictate large cash flows (e.g., margin calls) from one group of participants to another. This will lead to a liquidity shock that will affect banks, funding markets, and liquidity supply mechanisms more broadly. Since the shock can be destabilizing and lead to actions that are individually rational but systemically destructive if uncoordinated, central coordination can improve efficiency and reduce the likelihood of a systemic crisis. That’s not possible with a radically decentralized blockchain.

I could go on, but you get the point: there are several compelling reasons for centralized governance of a commercial blockchain like that envisioned for commodity trading. Indeed, many of the features that attract blockchain devotees are bugs–and extremely nasty ones–in commercial applications, especially if adopted at large scale as is being contemplated. As one individual who works on commercializing blockchain told me: “Commercial applications of blockchain will strip out all of the features that the anarchists love about it.”

So step back for a minute. Christophe’s point about “widespread adoption” and an understanding of the network economies inherent in the financial and commercial applications of blockchain means that it is likely to be a natural monopoly in a particular application (e.g., physical oil trading) and likely across applications due to economies of scope (which plausibly exist because major market participants will transact in multiple segments, and because of the ability to use common coding across different applications, to name just two factors). Second, a totally decentralized, open access, public blockchain has numerous disadvantages in large-scale commercial applications: central governance creates value.

Therefore, commercial blockchains will be “permissioned” in the lingo of the business. That is, unlike public blockchain, entry will be limited to privileged members and their customers. Moreover, the privileged members will govern and control the centralized entity. It will be a private club, not a public commons. (And note that even the Bitcoin blockchain is not ungoverned. Everyone is equal, but the big miners–and there are now a relatively small number of big miners–are more equal than others. The Iron Law of Oligarchy applies in blockchain too.)

Now add another factor: the natural monopoly blockchain will likely not be contestable, for reasons very similar to the ones I have written about for years to demonstrate why futures and equity exchanges are typically natural monopolies that earn large rents because they are largely immune from competitive entry. Once a particular blockchain gets critical mass, there will be the lock-in problem from hell: a coordinated movement of a large set of users from the incumbent to a competitor will be necessary for the entrant to achieve the scale necessary to compete. This is difficult, if not impossible, to arrange. Three Finger Brown could count the number of times that has happened in futures trading on his bad hand.

Now do you understand why banks are so keen on the blockchain? Yes, they couch it in terms of improving transactional efficiency, and it does that. But it also presents the opportunity to create monopoly financial market infrastructures that are immune from competitive entry. The past 50 years have seen an erosion of bank dominance–“disintermediation”–that has also eroded their rents. Blockchain gives the empire a chance to strike back. A coalition of banks (and note that most blockchain initiatives are driven by a bank-led cooperative, sometimes in partnership with a technology provider or providers) can form a blockchain for a particular application or applications, exploit the centripetal force arising from network effects, and gain a natural monopoly largely immune from competitive entry. Great work if you can get it. And believe me, the banks are trying. Very hard.

Left to develop on its own, therefore, the blockchain ecosystem will evolve to look like the exchange ecosystem of the 19th or early-20th centuries. Monopoly coalitions of intermediaries–“clubs” or “cartels”–offering transactional services, with member governance, and with the members reaping economic rents.

Right now regulators are focused on the technology, and (like many others) seem to be smitten with the potential of the technology to reduce certain costs and risks. They really need to look ahead and consider the market structure implications of that technology. Just as the natural monopoly nature of exchanges eventually led to intense disputes over the distribution of the benefits that they created, which in turn led to regulation (after bitter political battles), the fundamental economics of blockchain are likely to result in similar conflicts.

The law and regulation of blockchain is likely to be complicated and controversial precisely because natural monopoly regulation is inherently complicated and controversial. The yin and yang of financial infrastructure in particular is that the technology likely makes monopoly efficient, but also creates the potential for the exercise of market power (and, I might add, the exercise of political power to support and sustain market power, and to influence the distribution of rents that result from that market power). Better to think about those things now when things are still developing, than when the monopolies are developed, operating, and entrenched–and can influence the political and regulatory process, as monopolies are wont to do.

The digital economy is driven by network effects: think Google, Facebook, Amazon, and even Twitter. In addition to creating new efficiencies, these dominant platforms create serious challenges for competition, as scholars like Ariel Ezrachi and Maurice Stucke have shown:

Peter Thiel, the successful venture capitalist, famously noted that ‘Competition Is for Losers.’ That useful phrase captures the essence of many technology markets. Markets in which the winner of the competitive process is able to cement its position and protect it. Using data-driven network effects, it can undermine new entry attempts. Using deep pockets and the nowcasting radar, the dominant firm can purchase disruptive innovators.

Our new economy enables the winners to capture much more of the welfare. They are able to affect downstream competition as well as upstream providers. Often, they can do so with limited resistance from governmental agencies, as power in the online economy is not always easily captured using traditional competition analysis. Digital personal assistants, as we explore, have the potential to strengthen the winner’s gatekeeper power.

Blockchain will do the exact same thing.

You’ve been warned.

*My understanding of these issues has benefited greatly from many conversations over the past year with Izabella Kaminska, who saw through the hype well before pretty much anyone. Any errors herein are of course mine.


March 27, 2017

Seeing the OTC Derivatives Markets (and the Financial Markets) Like a State

Filed under: Clearing,Derivatives,Economics,Regulation — The Professor @ 12:07 pm

In the years since the financial crisis, and in particular the period preceding and immediately following the passage of Frankendodd, I can’t tell you how many times I saw diagrams that looked like this:

[Diagram 1: a dense web of bilateral links connecting OTC derivatives counterparties]

[Diagram 2: a hub-and-spoke network with a CCP at the hub]

The top diagram is a schematic representation of an OTC derivatives market, with a tangle of bilateral connections between counterparties. The second is a picture of a hub-and-spoke trading network with a CCP serving as the hub. (These particular versions of this comparison are from a 2013 Janet Yellen speech.)

These diagrams came to mind when re-reading James Scott’s Seeing Like a State and his Two Cheers for Anarchism. Scott argues that states have an obsession with making the societies they rule over “legible” in order to make them easier to tax, regulate, and control. States are confounded by evolved complexity and emergent orders: such systems are difficult to comprehend, and what cannot be comprehended cannot be easily ruled. So states attempt to impose schemes to simplify such complex orders. Examples that Scott gives include standardization of language and suppression of dialects; standardization of land tenure, measurements, and property rights; cadastral censuses; population censuses; the imposition of familial names; and urban renewal (e.g., Haussmann’s/Napoleon III’s massive reconstruction of Paris). These things make a populace easier to tax, conscript, and control.

Complex realities of emergent orders are too difficult to map. So states conceive of a mental map that is legible to them, and then impose rules on society to force it to conform with this mental map.

Looking back at the debate over OTC markets generally, and clearing, centralized execution, and trade reporting in particular, it is clear that legislators and regulators (including central banks) found these markets to be illegible. Figures like the first one–which are themselves a greatly simplified representation of OTC reality–were bewildering and disturbing to them. The second figure was much more comprehensible, and much more comforting: not just because they could comprehend it better, but because it gave them the sense that they could impose an order that would be easier to monitor and control. The emergent order was frightening in its wildness: the sense of imposing order and control was deeply comforting.

But as Scott notes, attempts to impose control on emergent orders (which in Scott’s books include both social and natural orders, e.g., forests) themselves carry great risks because although hard to comprehend, these orders evolved the way they did for a reason, and the parts interact in poorly understood–and sometimes completely not understood–ways. Attempts to make reality fit a simple mental map can cause the system to react in unpredicted and unpredictable ways, many of which are perverse.

My criticism of the attempts to “reform” OTC markets was largely predicated on my view that the regulators’ simple mental maps did great violence to complex reality. Even though these “reform” efforts were framed as ways of reducing systemic risk, they were fatally flawed because they were profoundly unsystemic in their understanding of the financial system. My critique focused specifically on the confident assertions based on the diagrams presented above. By focusing only on the OTC derivatives market, and ignoring the myriad connections of this market to other parts of the financial market, regulators could not have possibly comprehended the systemic implications of what they were doing. Indeed, even the portrayal of the OTC market alone was comically simplistic. The fallacy of composition played a role here too: the regulators thought they could reform the system piece-by-piece, without thinking seriously about how these pieces interacted in non-linear ways.

The regulators were guilty of the hubris illustrated beautifully by the parable of Chesterton’s Fence:

In the matter of reforming things, as distinct from deforming them, there is one plain and simple principle; a principle which will probably be called a paradox. There exists in such a case a certain institution or law; let us say, for the sake of simplicity, a fence or gate erected across a road. The more modern type of reformer goes gaily up to it and says, “I don’t see the use of this; let us clear it away.” To which the more intelligent type of reformer will do well to answer: “If you don’t see the use of it, I certainly won’t let you clear it away. Go away and think. Then, when you can come back and tell me that you do see the use of it, I may allow you to destroy it.”

In other words, the regulators should have understood the system and why it evolved the way that it did before leaping in to “reform” it. As Chesterton says, such attempts at reformation quite frequently result in deformation.

Somewhat belatedly, there are efforts underway to map the financial system more accurately. The work of Richard Bookstaber and various colleagues under the auspices of the Office of Financial Research to create multilayer maps of the financial system is certainly a vast improvement on the childish stick figure depictions of Janet Yellen, Gary Gensler, Timmy Geithner, Chris Dodd, Barney Frank et al. But even these more sophisticated maps are extreme abstractions, not least because they cannot capture incentives, the distribution of information among myriad market participants, and the motivations and behaviors of these participants. Think of embedding these maps in the most complicated extensive form large-N player game you can imagine, and you might have some inkling of how inadequate any schematic representation of the financial system is likely to be. When you combine this with the fact that in complex systems, even slight changes in initial conditions can result in completely different outcomes, the futility of “seeing like a state” in this context becomes apparent. The map of initial conditions is inevitably crude, making it an unreliable guide to understanding the system’s future behavior.

In my view, Scott goes too far. There is no doubt that some state-driven standardization has dramatically reduced transactions costs and opened up new possibilities for wealth-enhancing exchanges (at some cost, yes, but these costs are almost certainly less than the benefit), but Scott looks askance at virtually all such interventions. Thus, I do not exclude the possibility of true reform. But Scott’s warning about the dangers of forcing complex emergent orders to conform to simplified, “legible”, mental constructs must be taken seriously, and should inform any attempt to intervene in something like the financial system. Alas, this did not happen when legislators and regulators embarked on their crusade to reorganize wholesale the world financial system. It is frightening indeed to contemplate that this crusade was guided by such crude mental maps such as those supposedly illustrating the virtues of moving from an emergent bilateral OTC market to a tamed hub-and-spoke cleared one.

PS. I was very disappointed by this presentation by James Scott. He comes off as a doctrinaire leftist anthropologist (but I repeat myself), which is definitely not the case in his books. Indeed, the juxtaposition of Chesterton and Scott shows how deeply conservative Scott is (in the literal sense of the word).


March 24, 2017

Creative Destruction and Industry Life Cycles, HFT Edition

Filed under: Derivatives,Economics,Exchanges,Regulation — The Professor @ 11:56 am

No worries, folks: I’m not dead! Just a little hiatus while in Geneva for my annual teaching gig at Université de Genève, followed by a side trip for a seminar (to be released as a webinar) at ESSEC. The world didn’t collapse without my close attention, but at times it looked like a close run thing. But then again, I was restricted to watching CNN so my perception may be a little bit warped. Well, not a little bit: I have to say that I knew CNN was bad, but I didn’t know how bad until I watched a bit while on the road. Appalling doesn’t even come close to describing it. Strident, tendentious, unrelentingly biased, snide. I switched over to RT to get more reasonable coverage. Yes. It was that bad.

There are so many allegations regarding surveillance swirling about that only fools would rush in to comment on that now. I’ll be an angel for once in the hope that some actual verifiable facts come out.

So for my return, I’ll just comment on a set of HFT-related stories that came out during my trip. One is Alex Osipovich’s story on HFT traders falling on hard times. Another is that Virtu is bidding for KCG. A third one is that Quantlabs (a Houston outfit) is buying one-time HFT high flyer Teza. And finally, one that pre-dates my trip, but fits the theme: Thomas Peterffy’s Interactive Brokers Group is exiting options market making.

Alex’s story repeats Tabb Group data documenting a roughly 85 percent drop in HFT revenues in US equity trading. The Virtu-KCG proposed tie-up and the Quantlabs-Teza consummated one are indications of consolidation that is typical of maturing industries, and of a shift in the business models of these firms. The Quantlabs-Teza story is particularly interesting. It suggests that it is no longer possible (or at least remunerative) to get a competitive edge via speed alone. Instead, the focus is shifting to extracting information from the vast flow of data generated in modern markets. Speed will matter here–he who analyzes faster, all else equal, will have an edge. But the margin for innovation will shift from hardware to data analytics software (presumably paired with specialized hardware optimized to use it).

None of these developments is surprising. They are part of the natural life cycle of a new industry. Indeed, I discussed this over two years ago:

In fact, HFT has followed the trajectory of any technological innovation in a highly competitive environment. At its inception, it was a dramatically innovative way of performing longstanding functions undertaken by intermediaries in financial markets: market making and arbitrage. It did so much more efficiently than incumbents did, and so rapidly it displaced the old-style intermediaries. During this transitional period, the first-movers earned supernormal profits because of cost and speed advantages over the old school intermediaries. HFT market share expanded dramatically, and the profits attracted expansion in the capital and capacity of the first-movers, and the entry of new firms. And as day follows night, this entry of new HFT capacity and the intensification of competition dissipated these profits. This is basic economics in action.

. . . .

Whether it is by the entry of a new destructively creative technology, or the inexorable forces of entry and expansion in a technologically static setting, one expects profits earned by firms in one wave of creative destruction to decline.  That’s what we’re seeing in HFT.  It was definitely a disruptive technology that reaped substantial profits at the time of its introduction, but those profits are eroding.

That shouldn’t be a surprise.  But it no doubt is to many of those who have made apocalyptic predictions about the machines taking over the earth.  Or the markets, anyways.

Or, as Herb Stein famously said as a caution against extrapolating from current trends, “If something cannot go on forever, it will stop.” Those making dire predictions about HFT were largely extrapolating from the events of 2008-2010, and ignored the natural economic forces that constrain growth and dissipate profits. HFT is now a normal, competitive business earning normal, competitive profits.  And hopefully this reality will eventually sink in, and the hysteria surrounding HFT will fade away just as its profits did.

The rise and fall of Peterffy/Interactive illustrates Schumpeterian creative destruction in action. Interactive was part of a wave of innovation that displaced the floor. Now it can’t compete against HFT. And as the other articles show, HFT is in the maturation stage during which profits are competed away (ironically, a phenomenon that was central to Marx’s analysis, and which Schumpeter’s theory was specifically intended to address).

This reminds me of a set of conversations I had with a very prominent trader. In the 1990s he said he was glad to see that the markets were becoming computerized because he was “tired of being fucked by the floor.” About 10 years later, he lamented to me how he was being “fucked by HFT.” Now HFT is an industry earning “normal” profits (in the economics lexicon) due to intensifying competition and technological maturation: the fuckers are fucking each other now, I guess.

One interesting public policy issue in the Peterffy story is the role played by internalization of order flow in undermining the economics of Interactive: there is also an internalization angle to the Virtu-KCG story, because one reason for Virtu to buy KCG is to obtain the latter’s juicy retail order flow. I’ve been writing about these (and related) subjects for going on 20 years, and it’s complicated.

Internalization (and other trading in non-lit/exchange venues) reduces liquidity on exchanges, which raises trading costs there and reduces the informativeness of prices. Those factors are usually cited as criticisms of off-exchange execution, but there are other considerations. Retail order flow (likely uninformed) gets executed more cheaply, as it should because it is less costly to serve (due to the fact that it poses less of an adverse selection risk). (Who benefits from this cheaper execution is a matter of controversy.) Furthermore, as I pointed out in a 2002 Journal of Law, Economics and Organization paper, off-exchange venues provide competition for exchanges that often have market power (though this is less likely to be the case in the post-RegNMS world, which made inter-exchange competition much more intense). Finally, some (and arguably a lot of) informed trading is rent seeking: by reducing the ability of informed traders to extract rents from uninformed traders, internalization (and dark markets) reduce the incentives to invest excessively in information collection (an incentive Hirshleifer the Elder noted in the 1970s).

Securities and derivatives market structure is fascinating, and it presents many interesting analytical challenges. But these markets, and the firms that operate in them, are not immune to the basic forces of innovation, imitation, and entry that economists have understood for a long time (but which too many have forgotten, alas). We are seeing those forces at work in real time, and the fates of firms like Interactive and Teza, and the HFT sector overall, are living illustrations.

 


March 10, 2017

US Shale Puts the Saudis and OPEC in Zugzwang

Filed under: Commodities,Derivatives,Economics,Energy,Politics — The Professor @ 2:55 pm

This was CERA Week in Houston, and the Saudis and OPEC provided the comedic entertainment for the assembled oil industry luminaries.

It is quite evident that the speed and intensity of the U-turn in US oil production has unsettled the Saudis, and they don’t know quite what to do about it. So they were left with making empty threats.

My favorite was when Saudi Energy Minister Khalid al-Falih said there would be no “free rides” for US shale producers (and non-OPEC producers generally). Further, he said OPEC “will not bear the burden of free riders,” and “[w]e can’t do what we did in the ’80s and ’90s by swinging millions of barrels in response to market condition.”

Um, what is OPEC going to do about US free riders? Bomb the Permian? If it cuts output, and prices rise as a result, US E&P activity will pick up, and damn quick. The resulting replacement of a good deal of the OPEC output cut will limit the price impact thereof. The best place to be is outside a cartel that cuts output: you can get the benefit of the higher prices, and produce to the max. That’s what is happening in the US right now. OPEC has no credible way of punishing, or threatening to punish, free riders.

As for not doing what they did in the ’80s, well that’s exactly OPEC’s problem. It’s not the ’80s anymore. Now if it tries to “swing millions of barrels” to raise price, there is a fairly elastic and rapidly responding source of supply that can replace a large fraction of those barrels, thereby limiting the price impact of the OPEC swingers, baby.

Falih’s advisers were also trying to scare the US producers. Or something:

“One of the advisors said that OPEC would not take the hit for the rise in U.S. shale production,” a U.S. executive who was at the meeting told Reuters. “He said we and other shale producers should not automatically assume OPEC will extend the cuts.”

Presumably they are threatening a return to their predatory pricing strategy (euphemistically referred to as “defending market share”) that worked out so well for them the last time. Or perhaps it is just a concession that US supply is so elastic that it makes the demand for OPEC oil so elastic that output cuts are a losing proposition and will not endure. Either way, it means that OPEC is coming to the realization that continuing output cuts are unlikely to work. Meaning they won’t happen.

OPEC also floated cooperation with US producers on output. Mr. al-Falih, meet Senator Sherman! And if the antitrust laws didn’t make US participation in an agreement a non-starter, it would be almost impossible to cartelize the US industry given the largely free entry into E&P and the fungibility of technology, human capital, land, services, and labor. Maybe OPEC should hold talks with the Texas Railroad Commission instead.

Finally, in another laugh riot, OPEC canoodled with hedge funds. Apparently under the delusion that financial players play a material role in setting the price of physical barrels, rather than the price of risk. Disabling speculation could materially help OPEC only by raising the cost of hedging, which would tend to raise the costs of E&P firms, especially the more financially stretched ones. (Along these lines, I would argue that the big increase in net long speculative positions in recent months is not due to speculators pushing themselves into the market, but instead they have been pulled into the market by increased hedging activity that has occurred due to the increase in drilling activity in the US.)

Oil prices were down hard this week, from a $53 handle to a (at the time of this writing) $49.50 price. The first down-leg was due to the surprise spike in US inventories, but the continued weakness could well reflect the OPEC and Saudi messaging at CERA Week. The pathetic performance signaled deep strategic weakness, and suggests that the Saudis et al realize they are in zugzwang: regardless of what they do with regards to output, they are going to regret doing it.

My heart bleeds. Bleeds, I tells ya!

 


February 20, 2017

Trolling Brent

Filed under: Commodities,Derivatives,Economics,Energy,Regulation — The Professor @ 10:14 am

Platts has announced the first major change in the Brent crude assessment process in a decade, adding Troll crude to the “Brent” stream:

A decline in supply from North Sea fields has led to concerns that physical volumes could become too thin and hence at times could be accumulated in the hands of just a few players, making the benchmark vulnerable to manipulation.

Platts said on Monday it would add Norway’s Troll crude to the four British and Norwegian crudes it already uses to assess dated Brent from Jan. 1, 2018. This will join Brent, Forties, Oseberg and Ekofisk, or BFOE as they are known.

This is likely a stopgap measure, and Platts is considering more radical moves in the future:

It is also investigating a more radical plan to account for a possible larger drop-off in North Sea output over the next decade that would allow oil delivered from as far afield as west Africa and Central Asia to contribute to setting North Sea prices.

But the move is controversial, as this from the FT article shows:

If this is not addressed first, one source at a big North Sea trader said, the introduction of another grade to BFOE could make “an assessment that is unhedgeable, hence not fit for purpose”. “We don’t see any urgency to add grades today,” he added. Changes to Brent shifts the balance of power in North Sea trading. The addition of Troll makes Statoil the biggest contributor of supplies to the grades supporting Brent, overtaking Shell. Some big North Sea traders had expressed concern Statoil would have an advantage in understanding the balance of supply and demand in the region as it sends a large amount of Troll crude to its Mongstad refinery, Norway’s largest.

The statement about “an assessment that is unhedgeable, hence not fit for purpose” is BS, and exactly the kind of thing one always hears when contracts are redesigned. The fact is that contract redesigns have distributive effects, even if they improve a contract’s functioning, and the losers always whinge. Part of the distributive effect relates to issues like giving a company like Statoil an edge . . . that previously Shell and the other big North Sea producers had. But part of the distributive effect is that a contract with inadequate deliverable supply is a playground for big traders, who can more easily corner, squeeze, and hug such a contract.

Insofar as hedging is concerned, the main issue is how well the Brent contract performs as a hedge (and a pricing benchmark) for out-of-position (i.e., non-North Sea) crude, which represents the main use of Brent paper trades. Reducing the deliverable supply constraints that contribute to pricing anomalies (notably, anomalous moves in the basis) unambiguously improves the functioning of the contract for out-of-position players. Yeah, those hedging BFOE get slightly worse hedging performance, but that is a trivial consideration given that the very reason for changing the benchmark is the decline in BFOE production–which now represents less than 1 percent of world output. Why should the hair on the end of the tail wag the dog?

Insofar as the competition with WTI is concerned, the combination of larger US supplies, the construction of pipelines to move supplies from the Midcon (PADD II) to the Gulf (PADD III), and the lifting of the export ban have restored and in fact strengthened the connection of WTI prices to seaborne crude prices. US barrels are now going to both Europe and Asia, and US crude has effectively become the marginal barrel in most major markets, meaning that it is determining price and that WTI is an effective hedge (especially for the lighter grades). And by the way, the WTI delivery mechanism is much more robust and transparent than the baroque (and at times broken) Brent pricing mechanism.

As if to add an exclamation point to the story, Bloomberg reports that in recent months Shell has been bigfooting–or would that be trolling?–the market with big trades that have arguably distorted spreads. It got to the point that even firms like Vitol (which are notoriously loath to call foul, lest someone point fingers at them) raised the issue with Shell:

While none of those interviewed said Shell did anything illegal, they said the company violated the unspoken rules governing the market, which is lightly regulated. Executives of several trading rivals, including Vitol Group BV, the world’s top independent oil merchant, raised objections with counterparts at Shell last year, according to market participants.

What are the odds that Mr. Fit for Purpose is a Shell trader?

All of this is as I predicted, almost six years ago, when everyone was shoveling dirt on WTI and declaring Brent the Benchmark of the Forever Future:

Which means that those who are crowing about Brent today, and heaping scorn on WTI, will be begging for WTI’s problems in a few years.  For by then, WTI’s issues will be fixed, and it will be sitting astride a robust flow of oil tightly interconnected with the nexus of world oil trading.  But the Brent contract will be an inverted paper pyramid, resting on a thinner and thinner point of crude production.  There will be gains from trade–large ones–from redesigning the contract, but the difficulties of negotiating an agreement among numerous big players will prove nigh on to impossible to surmount.  Moreover, there will be no single regulator in a single jurisdiction that can bang heads together (for yes, that is needed sometimes) and cajole the parties toward agreement.

So Brent boosters, enjoy your laugh while it lasts.  It won’t last long, and remember, he who laughs last laughs best.

That’s exactly how things have worked out, even down to the point about the difficulties of getting the big boys to play together (a lesson gained through extensive personal experience, some of which is detailed in the post). Just call me Craignac the Magnificent. At least when it comes to commodity contract design 😉

February 15, 2017

Never Argue From a Price Change, Oil Market Edition

Filed under: Commodities,Derivatives,Economics,Energy — The Professor @ 9:19 pm

In the FT, Greg Meyer ponders a puzzle: “A mystery is confounding the US oil market: when inventories rise, prices rise, too.”

Yes, it is normally the case that inventories and prices, and inventories and the spot-deferred spread, move in opposite directions. But this does not have to be the case.

The typical case is based on the following economic logic. Inventories respond mainly to current, and temporary, supply and demand shocks. If current demand falls, and this demand shock is anticipated to be temporary, then current availability rises relative to expected future availability. The efficient response is to store more today, because the commodity is relatively abundant now, and efficient allocations move resources from where they are relatively abundant to where they are relatively scarce. Storage increases expected future availability, which depresses expected future prices. Moreover, the nearby price must fall relative to the expected future price in order to encourage storage. The combination of a lower expected future price and a lower nearby price relative to it means the nearby price falls outright.

A similar story holds with respect to a temporary increase in current supply.

Parenthetically, the temporary nature of the shock is important in driving the change in storage because this causes a change in relative availability that is necessary to make it optimal to store more. A shock that is anticipated to persist does not change current availability relative to expected future availability, so there is no benefit to shifting resources through time via storage. A persistent shock causes a parallel shift (roughly) in the forward curve, and no change in storage. In my academic research, I show that in a dynamic storage model supply/demand shocks with a very short half-life (on the order of 30 days) drive storage behavior, and that very persistent shocks drive the overall level of prices.
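
To make this concrete, here is a minimal two-period rational-storage sketch (a toy, not the model from the academic work: linear inverse demand, risk-neutral storers, a fixed carrying cost, and all numbers assumed for illustration):

```python
# A minimal two-period rational-storage sketch. Assumptions: linear inverse
# demand p = a - b*q in each period, risk-neutral storers, carrying cost k
# per unit stored; all numbers are illustrative.

a, b, k = 100.0, 1.0, 2.0

def solve(s1, s2):
    """Choose storage x so the calendar spread equals full carry (when x > 0)."""
    x = max(0.0, (s1 - s2) / 2.0 - k / (2.0 * b))
    p1 = a - b * (s1 - x)   # spot: period-1 supply less what is carried out
    p2 = a - b * (s2 + x)   # deferred: period-2 supply plus the carry-in
    return x, p1, p2

cases = {
    "base":       solve(s1=50.0, s2=50.0),
    "temporary":  solve(s1=54.0, s2=50.0),  # glut hits current supply only
    "persistent": solve(s1=54.0, s2=54.0),  # glut hits both periods
}
for name, (x, p1, p2) in cases.items():
    print(f"{name:10s} storage={x:4.1f} spot={p1:5.1f} deferred={p2:5.1f} spread={p2 - p1:4.1f}")
```

The “temporary” row shows the typical pattern: storage rises, the spot price falls, and the spread is pinned at full carry. The “persistent” row shifts both prices down in parallel and leaves storage untouched.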

But there are other kinds of shocks. One kind is a shock to anticipated future demand or supply. Say supply is expected to decline in the future. This increase in expected future scarcity can be mitigated by storing more today (i.e., reducing current consumption). This spreads the effect of the anticipated future supply loss over time, and thereby smooths consumption efficiently. The only way to reduce current consumption in order to increase inventories is for the spot price to rise. So in this scenario, (a) inventories and prices move in the same direction, and (b) the calendar spread (deferred minus nearby) widens along with inventories in order to reward the higher amount of storage.

Here’s a real world example. The Energy Policy Act of 2005 mandated increased use of renewable fuels–notably ethanol–in future years. This caused an increase in anticipated future demand for corn used to produce ethanol. When the act was passed, the supply of corn was basically fixed. One way of responding to the expected increase in future corn demand was to store more immediately (thereby carrying current supplies into the future when demand was going to be higher). Given the fixed supply, the only way to achieve this higher storage (and hence reduced current consumption) was for prices to rise.

Therefore, one explanation for the positive co-movement between prices and inventories is a shock to the expected future supply/demand balance. For example, an increased likelihood that OPEC will extend its supply cuts beyond April could produce this result.

Another kind of shock that can lead to a positive co-movement between spot prices and inventories is a shock to supply/demand volatility: I discussed this in an early blog post, and later analyzed this formally in my 2011 book. (A good example of the synergy between blogging and rigorous research, BTW.)

The intuition is this. Inventories are a way of insuring against uncertainty: putting something aside for a rainy day, as it were. If fundamental economic uncertainty goes up, it is efficient to hold more inventory. Since supply is fixed in the short run, the only way to increase inventory is to reduce current consumption, and the only way to reduce current consumption is for spot prices to rise. Moreover, to compensate the increased inventory holding, futures prices must rise relative to spot prices. Therefore, for this kind of shock (like a shock to future demand), the forward curve rises and becomes steeper (i.e., contango increases).

So although the positive co-movement between spot prices and inventory may be unusual, it can occur in a rational, efficient market. It depends on the underlying driving shock. The typical case occurs when shocks to current supply/demand dominate. The more unusual case occurs when the shocks are to expected future supply and demand, or to fundamental volatility.
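
Running the same toy model with a shock to expected future supply (current supply unchanged, period-2 supply assumed to fall) generates the unusual case; the snippet below is self-contained:

```python
# Same sketch as above, now with an anticipated *future* supply decline:
# spot price and inventories rise together, with the curve in contango.
a, b, k = 100.0, 1.0, 2.0                      # inverse demand p = a - b*q; carry cost k
s1, s2 = 50.0, 44.0                            # current supply unchanged, future supply cut
x = max(0.0, (s1 - s2) / 2.0 - k / (2.0 * b))  # store up to the full-carry point
p1 = a - b * (s1 - x)                          # spot
p2 = a - b * (s2 + x)                          # deferred
print(f"storage={x}, spot={p1}, deferred={p2}, spread={p2 - p1}")
```

Storage rises from zero to 2 and the spot price rises from 50 to 52: prices and inventories move together, exactly the pattern Meyer flags as a mystery.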

This relates directly to something I mentioned in the “kill the economists” post yesterday. Specifically: never argue from a price change. It is necessary to understand what is causing the price change. When there are multiple shocks that can affect prices (e.g., supply and demand shocks; current or future shocks; shocks to supply/demand volatility as well as to the level of supply/demand), just looking at the price movement is not sufficient to draw conclusions about either its effect or its cause. Indeed, it is even misleading to talk about the “effect” of the price change, because the price change is itself the endogenous effect of underlying causes/shocks.

The usual way to sort out what is going on is to look at quantities as well as prices. For instance, in a simple supply-demand model, if you see prices go down, that could be because supply rose or demand fell. You can figure out which only by observing quantity: if you see quantity fall as well, you know that a demand decline caused the movements.
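
As a toy illustration of that identification logic (valid only when a single curve shifts at a time):

```python
# Toy identification rule: with a single shift of one curve, the signs of
# the price and quantity changes reveal which curve moved.
def classify(dp: float, dq: float) -> str:
    if dp < 0 and dq < 0: return "demand fell"
    if dp < 0 and dq > 0: return "supply rose"
    if dp > 0 and dq > 0: return "demand rose"
    if dp > 0 and dq < 0: return "supply fell"
    return "indeterminate (offsetting or multiple shocks)"

print(classify(dp=-3.0, dq=-1.2))   # price and quantity both down -> "demand fell"
```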

This means that the recent co-movements of oil inventories and prices reflect market participants’ assessment that the supply/demand balance will tighten in the future, that fundamental uncertainty is going up, or both.

February 11, 2017

Risk Gosplan Works Its Magic in Swaps Clearing

Filed under: Clearing,Commodities,Derivatives,Economics,Politics,Regulation — The Professor @ 4:18 pm

Deutsche Bank quite considerately provided a real-time example of an unintended consequence of Frankendodd: specifically, capital requirements causing firms to exit from clearing. The bank announced it is continuing to provide futures clearing, but is exiting US swaps clearing due to capital cost concerns.

Although Deutsche was not specific in citing the treatment of margins under the leverage ratio as the reason for its exit, this is the most likely culprit. Recall that even segregated margins (which a bank has no access to) are treated as bank assets under the leverage rule. So a swaps clearer must hold capital against assets over which it has no control (because all swap margins are segregated), which it cannot use to fund its own activities, and which are not funded by any liability issued by the clearer.
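
To see the order of magnitude, here is a stylized calculation. The balance sheet numbers and the 3 percent Basel-style minimum leverage ratio are assumptions for illustration, not Deutsche’s actual figures:

```python
# Stylized arithmetic (assumed numbers; a 3% Basel-style minimum leverage
# ratio) showing how counting segregated client margin in the exposure
# measure inflates a swaps clearer's capital requirement.

LEVERAGE_RATIO = 0.03

def required_capital(own_exposure, client_margin, margin_counts):
    """Capital = ratio times the exposure measure; margin may or may not count."""
    exposure = own_exposure + (client_margin if margin_counts else 0.0)
    return LEVERAGE_RATIO * exposure

own = 10e9     # clearer's own balance-sheet exposure (assumed)
margin = 40e9  # segregated client margin it cannot touch (assumed)

with_margin = required_capital(own, margin, margin_counts=True)
without = required_capital(own, margin, margin_counts=False)
print(f"capital if margin counts:  ${with_margin / 1e9:.2f}bn")
print(f"capital if it is excluded: ${without / 1e9:.2f}bn")
# The difference is capital held against assets the clearer cannot use or
# lose in the ordinary course: cost without a matching risk.
```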

It’s perverse, and is emblematic of the mixed signals in Frankendodd: CLEAR SWAPS! CLEARING SWAPS IS EXTREMELY CAPITAL INTENSIVE SO YOU WON’T MAKE ANY MONEY DOING IT! Yeah. That will work out swell.

Of course Deutsche Bank has its own issues, and because of those issues it faces more acute capital concerns than other institutions (especially American ones). But here is a case where the capital cost does not at all match up with risk (and remember that capital is intended to be a risk absorber). So, looking for ways to economize on capital, Deutsche exited a business where the capital charge did not generate any commensurate return, and furthermore was unrelated to the actual risk of the business. If the pricing of risk had been more sensible, Deutsche might instead have scaled back other businesses where capital charges reflected risk more accurately. Here, the effect of the leverage ratio is all pain, no gain.

When interviewed by Risk Magazine about the Fundamental Review of the Trading Book, I said: “The FRTB’s standardised approach is basically central planning of risk pricing, and it will produce Gosplan-like results.” The leverage ratio, especially as applied to swaps margins, is another example of central planning of risk pricing, and here indeed it has produced Gosplan-like results.

And in the case of clearing, these results are exactly contrary to a crucial ostensible purpose of DFA: reducing size and concentration in banking generally, and in derivatives markets in particular. For as the FT notes:

The bank’s exit will reignite concerns that the swaps clearing business is too concentrated among a handful of large players. The top three swaps clearers account for more than half the market by client collateral required, while the top five account for over 75 per cent.

So swaps clearing is now hyper-concentrated, and dominated by a handful of systemically important banks (e.g., Citi, Goldman). It is more concentrated than the bilateral swaps dealer market was. Trouble at one of these dominant swaps clearers would create serious risks for the CCPs they clear for (which, by the way, are all interconnected, because the same clearing members dominate all the major CCPs). Moreover, concentration dramatically reduces the benefits of mutualizing risk: because there are so few clearers, the risk of a big CM failure will be borne by a small number of firms. This isn’t insurance in any meaningful way, and does not achieve the benefits of risk pooling, even if, in the first instance, only a single big clearing member runs into trouble due to a shock idiosyncratic to it.
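
The arithmetic of mutualization among a handful of firms is stark. A sketch, with an assumed $5 billion uncovered loss:

```python
# Illustrative sketch: the per-survivor hit when one clearing member
# defaults and an uncovered loss L is mutualized across the rest.
L = 5e9  # assumed uncovered default loss

for n_members in (5, 20, 50):
    per_survivor = L / (n_members - 1)
    print(f"{n_members:2d} members: each survivor absorbs ${per_survivor / 1e9:.2f}bn")
# With 50 members the hit is about $0.10bn each; with 5 it is $1.25bn each.
# A handful of clearers concentrates the loss rather than pooling it, and
# the same handful are members of every major CCP.
```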

At present, there is much gnashing of teeth and rending of garments at the prospect of even tweaks to Dodd-Frank. Evidently, the clearing mandate is not even on the table. But this one vignette demonstrates that Frankendodd, and banking regulation generally, are shot through with provisions intended to reduce systemic risk that do not have that effect, and indeed likely have the perverse effect of creating some systemic risks. Viewing Dodd-Frank as a sacred cow, and any proposed change to it as a threat to the financial system, is utterly wrongheaded, and will lead to bad outcomes.

Barney and Chris did not come down Mount Sinai with tablets containing commandments written by the finger of God. They sat on Capitol Hill and churned out hundreds of pages of laws based on a cartoonish understanding of the financial system, information provided by highly interested parties, and a frequently false narrative of the financial crisis. These laws, in turn, have spawned thousands of pages of regulation, good, bad, and very ugly. What is happening in swaps clearing is very ugly indeed, and provides a great example of how major portions of Dodd-Frank and the regulations emanating from it need a thorough review and in some cases a major overhaul.

And if Elizabeth Warren loses her water over this: (a) so what else is new? and (b) good! Her Manichean view of financial regulation is a major impediment to getting the regulation right. What is happening in swaps clearing is a perfect illustration of why a major midcourse correction in the trajectory of financial regulation is imperative.
