Streetwise Professor

April 14, 2017

SWP Climbs The Hill

Filed under: Derivatives,Economics,Exchanges,Regulation — The Professor @ 10:40 am

I have become a regular contributor to The Hill. My inaugural column on the regulation of spoofing is here. The argument in a nutshell: (a) spoofing involves large numbers of cancellations, but so do legitimate market making strategies, so there is a risk that aggressive policing of spoofing will wrongly penalize market makers, thereby raising the costs of supplying liquidity; (b) the price impacts of spoofing are very, very small, and transitory; (c) enforcement authorities sometimes fail to pursue manipulations that have far larger price impacts; therefore (d) a focus on spoofing is a misdirection of scarce enforcement resources.

My contributions will focus on finance and regulatory issues. So those looking for my trenchant political commentary will have to keep coming here 😉

Click early! Click often!


April 4, 2017

The Unintended Consequences of Blockchain Are Not Unpredictable: Respond Now Rather Than Repent Later*

Filed under: Clearing,Commodities,Derivatives,Economics,Regulation — The Professor @ 3:39 pm

In the past week the WSJ and the FT have run articles about a new bank-led initiative to move commodity trading onto a blockchain. In many ways, this makes great sense. By its nature, the process of trading commodities and recording trades and shipments (a) involves large numbers of spatially dispersed counterparties, (b) entails myriad terms, and (c) can give rise to costly disputes. As a result of these factors, the process is currently very labor intensive, fraught with operational risk (e.g., inadvertent errors), and vulnerable to fraud (cf. the Qingdao metals warehouse scandal of 2014). In theory, blockchain has the ability to reduce costs, errors, and fraud. Thus, it is understandable that traders and banks are quite keen on its potential to cut costs and perhaps even revolutionize the trading business.

But before you get too excited, a remark by my friend Christophe Salmon at Trafigura is laden with deep implications that should lead you to pause and consider the likely consequences of widespread adoption of blockchain:

Christophe Salmon, Trafigura’s chief financial officer, said there would need to be widespread adoption by major oil traders and refiners to make blockchain in commodity trading viable in the long term.

This seemingly commonsense and innocuous remark is actually laden with implications of unintended consequences that should be recognized and considered now, before the blockchain train gets too far down the track.

In essence, Christophe’s remark means that to be viable blockchain has to scale. If it doesn’t scale, it won’t reduce cost. But if it does scale, a blockchain for a particular application is likely to be a natural monopoly, or at most a natural duopoly. (Issues of scope economies are also potentially relevant, but I’ll defer discussion of that for now.)

Indeed, if there are no technical impediments to scaling (which in itself is an open question–note the block size debate in Bitcoin), the “widespread adoption” feature that Christophe identifies as essential means that network effects create scale economies that are likely to result in the dominance of a single platform. Traders will want to record their business on the blockchain that their counterparties use. Since many trade with many, this creates a centripetal force that will tend to draw everyone to a single blockchain.
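To see how strong this centripetal force is, here is a minimal simulation (my own illustrative sketch, not anything from the articles; the logistic response curve and the 52/48 starting split are assumptions): traders repeatedly choose between two hypothetical chains, favoring the one more of their counterparties already use, and even a slight initial edge tips the whole market onto a single platform.

```python
# Toy model of platform tipping under network effects. All parameters are
# illustrative assumptions, not data.
import math
import random

def simulate(n_traders=1000, share_a=0.52, rounds=12, k=10.0, seed=7):
    random.seed(seed)
    for _ in range(rounds):
        # The more counterparties already record on chain A, the more likely
        # each trader is to record its business there (logistic best response).
        p = 1.0 / (1.0 + math.exp(-k * (share_a - 0.5)))
        share_a = sum(random.random() < p for _ in range(n_traders)) / n_traders
    return share_a

print(simulate())  # approaches 1.0: a 52/48 split snowballs into one chain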

I can hear you say: “Well, if there is a public blockchain, that happens automatically because everyone has access to it.” But the nature of public blockchain means that it faces extreme obstacles that make it wildly impractical for commercial adoption on the scale being considered, not just in commodity markets, but in virtually every aspect of the financial markets. Commercial blockchains will be centrally governed, limited-access, private systems rather than a radically decentralized, open-access commons.

The “forking problem” alone is a difficulty. As demonstrated by Bitcoin in 2013 and Ethereum in 2016, public blockchains based on open source are vulnerable to “forking,” whereby uncoordinated changes in the software (inevitable in an open source system that lacks central governance and coordination) result in the simultaneous existence of multiple, parallel blockchains. Such forking would destroy the network economy/scale effects that make the idea of a single database attractive to commercial participants.

Prevention of forking requires central governance to coordinate changes in the code–something that offends the anarcho-libertarian spirits who view blockchain as a totally decentralized mechanism.

Other aspects of the pure version of an open, public blockchain make it inappropriate for most financial and commercial applications. For instance, public blockchain is touted because it does not require trust in the reputation of large entities such as clearing networks or exchanges. But the ability to operate without trust does not come for free.

Trust and reputation are indeed costly: as Becker and Stigler first noted decades ago, and others have formalized since, reputation is a bonding mechanism that requires the trusted entity to incur sunk costs that would be lost if it violates trust. (Alternatively, the trusted entity has to have market power–which is costly–that generates a stream of rents that is lost when trust is violated. That is, to secure trust prices have to be higher and output lower than would be necessary in a zero transactions cost world.)
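The Becker–Stigler logic can be captured in one line (a stylized sketch in my notation, not theirs): a trusted entity earning a per-period rent $R$, facing discount rate $r$, and able to pocket a one-time gain $G$ by cheating stays honest only if

$$\frac{R}{r} \ge G,$$

so the market must leave the trusted party a rent stream worth at least $rG$ per period. That rent is the price of trust.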

But public blockchains have not been able to eliminate trust without cost. In Bitcoin, trust is replaced with “proof of work.” Well, work means cost. The blockchain mining industry consumes vast amounts of electricity and computing power in order to prove work. It is highly likely that the cost of creating trusted entities is lower than the cost of proof of work or alternative ways of eliminating the need for trust. Thus, a (natural monopoly) commercial blockchain is likely to have to be a trusted centralized institution, rather than a decentralized anarchist’s wet-dream.
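The same point as a stylized free-entry condition (my notation, for illustration): if a block pays reward $R$ and each miner wins with probability proportional to the resources it burns, entry continues until total mining expenditure roughly exhausts the reward,

$$\sum_i c_i \approx R,$$

so the resource cost of proof of work dissipates more or less the full value of the rewards. That is the price of dispensing with a trusted party.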

Blockchain is also touted as permitting “smart contracts,” which automatically execute certain actions when certain pre-defined (and coded) contingencies are met. But “smart contracts” is not a synonym for “complete contracts,” i.e., contracts in which every possible contingency is anticipated, and each party’s actions under each contingency are specified. Thus, even with smart (but incomplete) contracts, unanticipated contingencies will inevitably arise.

Parties will have to negotiate what to do under these contingencies. Given that this will usually be a bilateral bargaining situation under asymmetric information, the bargaining will be costly and sometimes negotiations will break down. Moreover, under some contingencies the smart contracts will automatically execute actions that the parties do not expect and would like to change: here, self-execution prevents such contractual revisions, or at least makes them very difficult.
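A toy sketch makes the point (plain Python standing in for a smart-contract language; the contract and its parameters are hypothetical): the code executes exactly what was written, with no hook for renegotiation when an unanticipated contingency hits.

```python
# Hypothetical "smart" margin agreement: it fires on a hard-coded trigger,
# even in states of the world the drafters never contemplated (a bad print,
# a disrupted market), and offers no mechanism to renegotiate or unwind.

class SmartMarginContract:
    def __init__(self, trigger_price: float, margin_call: float):
        self.trigger_price = trigger_price
        self.margin_call = margin_call

    def on_price_update(self, price: float) -> float:
        """Cash the payer must transfer immediately on this price update."""
        if price < self.trigger_price:
            return self.margin_call  # self-executes: no appeal, no amendment
        return 0.0

contract = SmartMarginContract(trigger_price=50.0, margin_call=1_000_000.0)
print(contract.on_price_update(49.99))  # 1000000.0 -- fires even if both
                                        # parties would prefer to waive it
```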

Indeed, it may be the execution of the contractual feature that first makes the parties aware that something has gone horribly wrong. Here another touted feature of pure blockchain–immutability–can become a problem. The revelation of information ex post may lead market participants to desire to change the terms of their contract. Can’t do that if the contracts are immutable.

Paper and ink contracts are inherently incomplete too, and this is why there are centralized mechanisms to address incompleteness. These include courts, but historically bodies like stock or commodity exchanges, or merchants’ associations (in diamonds, for instance), have also helped adjudicate disputes and re-do deals that turn out to be inefficient ex post. The existence of institutions to facilitate the efficient adaptation of parties to contractual incompleteness demonstrates that in the real world, man does not live (or transact) by contract alone.

Thus, the benefits of a mechanism for adjudicating and responding to contractual incompleteness create another reason for a centralized authority for blockchain, even–or especially–blockchains with smart contracts.

Further, the blockchain (especially with smart contracts) will be a complex interconnected system, in the technical sense of the term. There will be myriad possible interactions between individual transactions recorded on the system, and these interactions can lead to highly undesirable, and entirely unpredictable, outcomes. A centralized authority can greatly facilitate the response to such crises. (Indeed, years ago I posited this as one of the reasons for integration of exchanges and clearinghouses.)

And the connections are not only within a particular blockchain. There will be connections between blockchains, and between a blockchain and other parts of the financial system. Consider for example smart contracts that in a particular contingency dictate large cash flows (e.g., margin calls) from one group of participants to another. This will lead to a liquidity shock that will affect banks, funding markets, and liquidity supply mechanisms more broadly. Since the shock can be destabilizing and lead to actions that are individually rational but systemically destructive if uncoordinated, central coordination can improve efficiency and reduce the likelihood of a systemic crisis. That’s not possible with a radically decentralized blockchain.

I could go on, but you get the point: there are several compelling reasons for centralized governance of a commercial blockchain like that envisioned for commodity trading. Indeed, many of the features that attract blockchain devotees are bugs–and extremely nasty ones–in commercial applications, especially if adopted at large scale as is being contemplated. As one individual who works on commercializing blockchain told me: “Commercial applications of blockchain will strip out all of the features that the anarchists love about it.”

So step back for a minute. First, Christophe’s point about “widespread adoption,” combined with an understanding of the network economies inherent in the financial and commercial applications of blockchain, means that blockchain is likely to be a natural monopoly in a particular application (e.g., physical oil trading), and likely across applications due to economies of scope (which plausibly exist because major market participants will transact in multiple segments, and because of the ability to use common coding across different applications, to name just two factors). Second, a totally decentralized, open access, public blockchain has numerous disadvantages in large-scale commercial applications: central governance creates value.

Therefore, commercial blockchains will be “permissioned” in the lingo of the business. That is, unlike public blockchain, entry will be limited to privileged members and their customers. Moreover, the privileged members will govern and control the centralized entity. It will be a private club, not a public commons. (And note that even the Bitcoin blockchain is not ungoverned. Everyone is equal, but the big miners–and there are now a relatively small number of big miners–are more equal than others. The Iron Law of Oligarchy applies in blockchain too.)

Now add another factor: the natural monopoly blockchain will likely not be contestable, for reasons very similar to the ones I have written about for years to demonstrate why futures and equity exchanges are typically natural monopolies that earn large rents because they are largely immune from competitive entry. Once a particular blockchain gets critical mass, there will be the lock-in problem from hell: a coordinated movement of a large set of users from the incumbent to a competitor will be necessary for the entrant to achieve the scale necessary to compete. This is difficult, if not impossible, to arrange. Three Finger Brown could count the number of times that has happened in futures trading on his bad hand.

Now do you understand why banks are so keen on the blockchain? Yes, they couch it in terms of improving transactional efficiency, and it does that. But it also presents the opportunity to create monopoly financial market infrastructures that are immune from competitive entry. The past 50 years have seen an erosion of bank dominance–“disintermediation”–that has also eroded their rents. Blockchain gives the empire a chance to strike back. A coalition of banks (and note that most blockchain initiatives are driven by a bank-led cooperative, sometimes in partnership with a technology provider or providers) can form a blockchain for a particular application or applications, exploit the centripetal force arising from network effects, and gain a natural monopoly largely immune from competitive entry. Great work if you can get it. And believe me, the banks are trying. Very hard.

Left to develop on its own, therefore, the blockchain ecosystem will evolve to look like the exchange ecosystem of the 19th or early-20th centuries. Monopoly coalitions of intermediaries–“clubs” or “cartels”–offering transactional services, with member governance, and with the members reaping economic rents.

Right now regulators are focused on the technology, and (like many others) seem to be smitten with the potential of the technology to reduce certain costs and risks. They really need to look ahead and consider the market structure implications of that technology. Just as the natural monopoly nature of exchanges eventually led to intense disputes over the distribution of the benefits that they created, which in turn led to regulation (after bitter political battles), the fundamental economics of blockchain are likely to result in similar conflicts.

The law and regulation of blockchain is likely to be complicated and controversial precisely because natural monopoly regulation is inherently complicated and controversial. The yin and yang of financial infrastructure in particular is that the technology likely makes monopoly efficient, but also creates the potential for the exercise of market power (and, I might add, the exercise of political power to support and sustain market power, and to influence the distribution of rents that result from that market power). Better to think about those things now when things are still developing, than when the monopolies are developed, operating, and entrenched–and can influence the political and regulatory process, as monopolies are wont to do.

The digital economy is driven by network effects: think Google, Facebook, Amazon, and even Twitter. In addition to creating new efficiencies, these dominant platforms create serious challenges for competition, as scholars like Ariel Ezrachi and Maurice Stucke have shown:

Peter Thiel, the successful venture capitalist, famously noted that ‘Competition Is for Losers.’ That useful phrase captures the essence of many technology markets. Markets in which the winner of the competitive process is able to cement its position and protect it. Using data-driven network effects, it can undermine new entry attempts. Using deep pockets and the nowcasting radar, the dominant firm can purchase disruptive innovators.

Our new economy enables the winners to capture much more of the welfare. They are able to affect downstream competition as well as upstream providers. Often, they can do so with limited resistance from governmental agencies, as power in the online economy is not always easily captured using traditional competition analysis. Digital personal assistants, as we explore, have the potential to strengthen the winner’s gatekeeper power.

Blockchain will do the exact same thing.

You’ve been warned.

*My understanding of these issues has benefited greatly from many conversations over the past year with Izabella Kaminska, who saw through the hype well before pretty much anyone. Any errors herein are of course mine.


March 27, 2017

Seeing the OTC Derivatives Markets (and the Financial Markets) Like a State

Filed under: Clearing,Derivatives,Economics,Regulation — The Professor @ 12:07 pm

In the years since the financial crisis, and in particular the period preceding and immediately following the passage of Frankendodd, I can’t tell you how many times I saw diagrams that looked like this:

[Diagrams: a tangle of bilateral OTC counterparty connections (top) and a hub-and-spoke network with a CCP at the center (bottom)]

The top diagram is a schematic representation of an OTC derivatives market, with a tangle of bilateral connections between counterparties. The second is a picture of a hub-and-spoke trading network with a CCP serving as the hub. (These particular versions of this comparison are from a 2013 Janet Yellen speech.)

These diagrams came to mind when re-reading James Scott’s Seeing Like a State and his Two Cheers for Anarchism. Scott argues that states have an obsession with making the societies they rule over “legible” in order to make them easier to tax, regulate, and control. States are confounded by evolved complexity and emergent orders: such systems are difficult to comprehend, and what cannot be comprehended cannot be easily ruled. So states attempt to impose schemes to simplify such complex orders. Examples that Scott gives include standardization of language and suppression of dialects; standardization of land tenure, measurements, and property rights; cadastral censuses; population censuses; the imposition of familial names; and urban renewal (e.g., Haussmann’s/Napoleon III’s massive reconstruction of Paris). These things make a populace easier to tax, conscript, and control.

Complex realities of emergent orders are too difficult to map. So states conceive of a mental map that is legible to them, and then impose rules on society to force it to conform with this mental map.

Looking back at the debate over OTC markets generally, and clearing, centralized execution, and trade reporting in particular, it is clear that legislators and regulators (including central banks) found these markets to be illegible. Figures like the first one–which are themselves a greatly simplified representation of OTC reality–were bewildering and disturbing to them. The second figure was much more comprehensible, and much more comforting: not just because they could comprehend it better, but because it gave them the sense that they could impose an order that would be easier to monitor and control. The emergent order was frightening in its wildness: the sense of imposing order and control was deeply comforting.

But as Scott notes, attempts to impose control on emergent orders (which in Scott’s books include both social and natural orders, e.g., forests) themselves carry great risks because although hard to comprehend, these orders evolved the way they did for a reason, and the parts interact in poorly understood–and sometimes completely not understood–ways. Attempts to make reality fit a simple mental map can cause the system to react in unpredicted and unpredictable ways, many of which are perverse.

My criticism of the attempts to “reform” OTC markets was largely predicated on my view that the regulators’ simple mental maps did great violence to complex reality. Even though these “reform” efforts were framed as ways of reducing systemic risk, they were fatally flawed because they were profoundly unsystemic in their understanding of the financial system. My critique focused specifically on the confident assertions based on the diagrams presented above. By focusing only on the OTC derivatives market, and ignoring the myriad connections of this market to other parts of the financial market, regulators could not have possibly comprehended the systemic implications of what they were doing. Indeed, even the portrayal of the OTC market alone was comically simplistic. The fallacy of composition played a role here too: the regulators thought they could reform the system piece-by-piece, without thinking seriously about how these pieces interacted in non-linear ways.

The regulators were guilty of the hubris illustrated beautifully by the parable of Chesterton’s Fence:

In the matter of reforming things, as distinct from deforming them, there is one plain and simple principle; a principle which will probably be called a paradox. There exists in such a case a certain institution or law; let us say, for the sake of simplicity, a fence or gate erected across a road. The more modern type of reformer goes gaily up to it and says, “I don’t see the use of this; let us clear it away.” To which the more intelligent type of reformer will do well to answer: “If you don’t see the use of it, I certainly won’t let you clear it away. Go away and think. Then, when you can come back and tell me that you do see the use of it, I may allow you to destroy it.”

In other words, the regulators should have understood the system and why it evolved the way that it did before leaping in to “reform” it. As Chesterton says, such attempts at reformation quite frequently result in deformation.

Somewhat belatedly, there are efforts underway to map the financial system more accurately. The work of Richard Bookstaber and various colleagues under the auspices of the Office of Financial Research to create multilayer maps of the financial system is certainly a vast improvement on the childish stick figure depictions of Janet Yellen, Gary Gensler, Timmy Geithner, Chris Dodd, Barney Frank et al. But even these more sophisticated maps are extreme abstractions, not least because they cannot capture incentives, the distribution of information among myriad market participants, and the motivations and behaviors of these participants. Think of embedding these maps in the most complicated extensive form large-N player game you can imagine, and you might have some inkling of how inadequate any schematic representation of the financial system is likely to be. When you combine this with the fact that in complex systems, even slight changes in initial conditions can result in completely different outcomes, the futility of “seeing like a state” in this context becomes apparent. The map of initial conditions is inevitably crude, making it an unreliable guide to understanding the system’s future behavior.
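The sensitivity point is easy to demonstrate (a standard chaos illustration of my own, not a model of any actual market): run the logistic map from two starting points that differ by one part in a billion and watch the trajectories decorrelate.

```python
# Two logistic-map trajectories (r = 4, the chaotic regime) starting 1e-9
# apart: the gap roughly doubles each step and reaches order one within a
# few dozen iterations.

def logistic(x: float, r: float = 4.0) -> float:
    return r * x * (1.0 - x)

x, y = 0.400000000, 0.400000001
for step in range(1, 51):
    x, y = logistic(x), logistic(y)
    if step % 10 == 0:
        print(f"step {step:2d}: |x - y| = {abs(x - y):.3e}")
# By step ~40 the two paths bear no relation to each other: a crude map of
# initial conditions is an unreliable guide to future behavior.
```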

In my view, Scott goes too far. There is no doubt that some state-driven standardization has dramatically reduced transactions costs and opened up new possibilities for wealth-enhancing exchanges (at some cost, yes, but these costs are almost certainly less than the benefits), but Scott looks askance at virtually all such interventions. Thus, I do not exclude the possibility of true reform. But Scott’s warning about the dangers of forcing complex emergent orders to conform to simplified, “legible” mental constructs must be taken seriously, and should inform any attempt to intervene in something like the financial system. Alas, this did not happen when legislators and regulators embarked on their crusade to reorganize wholesale the world financial system. It is frightening indeed to contemplate that this crusade was guided by mental maps as crude as those supposedly illustrating the virtues of moving from an emergent bilateral OTC market to a tamed hub-and-spoke cleared one.

PS. I was very disappointed by this presentation by James Scott. He comes off as a doctrinaire leftist anthropologist (but I repeat myself), which is definitely not the case in his books. Indeed, the juxtaposition of Chesterton and Scott shows how deeply conservative Scott is (in the literal sense of the word).


March 24, 2017

Creative Destruction and Industry Life Cycles, HFT Edition

Filed under: Derivatives,Economics,Exchanges,Regulation — The Professor @ 11:56 am

No worries, folks: I’m not dead! Just a little hiatus while in Geneva for my annual teaching gig at Université de Genève, followed by a side trip for a seminar (to be released as a webinar) at ESSEC. The world didn’t collapse without my close attention, but at times it looked like a close-run thing. But then again, I was restricted to watching CNN, so my perception may be a little bit warped. Well, not a little bit: I have to say that I knew CNN was bad, but I didn’t know how bad until I watched a bit while on the road. Appalling doesn’t even come close to describing it. Strident, tendentious, unrelentingly biased, snide. I switched over to RT to get more reasonable coverage. Yes. It was that bad.

There are so many allegations regarding surveillance swirling about that only fools would rush in to comment on that now. I’ll be an angel for once in the hope that some actual verifiable facts come out.

So for my return, I’ll just comment on a set of HFT-related stories that came out during my trip. One is Alex Osipovich’s story on HFT traders falling on hard times. Another is that Virtu is bidding for KCG. A third one is that Quantlabs (a Houston outfit) is buying one-time HFT high flyer Teza. And finally, one that pre-dates my trip, but fits the theme: Thomas Peterffy’s Interactive Brokers Group is exiting options market making.

Alex’s story cites Tabb Group data documenting a roughly 85 percent drop in HFT revenues in US equity trading. The proposed Virtu-KCG tie-up and the consummated Quantlabs-Teza one are indications of the consolidation that is typical of maturing industries, and of a shift in the business model of these firms. The Quantlabs-Teza story is particularly interesting. It suggests that it is no longer possible (or at least remunerative) to get a competitive edge via speed alone. Instead, the focus is shifting to extracting information from the vast flow of data generated in modern markets. Speed will matter here–he who analyzes faster, all else equal, will have an edge. But the margin for innovation will shift from hardware to data analytics software (presumably paired with specialized hardware optimized to use it).

None of these developments is surprising. They are part of the natural life cycle of a new industry. Indeed, I discussed this over two years ago:

In fact, HFT has followed the trajectory of any technological innovation in a highly competitive environment. At its inception, it was a dramatically innovative way of performing longstanding functions undertaken by intermediaries in financial markets: market making and arbitrage. It did so much more efficiently than incumbents did, and so rapidly it displaced the old-style intermediaries. During this transitional period, the first-movers earned supernormal profits because of cost and speed advantages over the old school intermediaries. HFT market share expanded dramatically, and the profits attracted expansion in the capital and capacity of the first-movers, and the entry of new firms. And as day follows night, this entry of new HFT capacity and the intensification of competition dissipated these profits. This is basic economics in action.

. . . .

Whether it is by the entry of a new destructively creative technology, or the inexorable forces of entry and expansion in a technologically static setting, one expects profits earned by firms in one wave of creative destruction to decline.  That’s what we’re seeing in HFT.  It was definitely a disruptive technology that reaped substantial profits at the time of its introduction, but those profits are eroding.

That shouldn’t be a surprise.  But it no doubt is to many of those who have made apocalyptic predictions about the machines taking over the earth.  Or the markets, anyways.

Or, as Herb Stein famously said as a caution against extrapolating from current trends, “If something cannot go on forever, it will stop.” Those making dire predictions about HFT were largely extrapolating from the events of 2008-2010, and ignored the natural economic forces that constrain growth and dissipate profits. HFT is now a normal, competitive business earning normal, competitive profits.  And hopefully this reality will eventually sink in, and the hysteria surrounding HFT will fade away just as its profits did.

The rise and fall of Peterffy/Interactive illustrates Schumpeterian creative destruction in action. Interactive was part of a wave of innovation that displaced the floor. Now it can’t compete against HFT. And as the other articles show, HFT is in the maturation stage during which profits are competed away (ironically, a phenomenon that was central to Marx’s analysis, and which Schumpeter’s theory was specifically intended to address).

This reminds me of a set of conversations I had with a very prominent trader. In the 1990s he said he was glad to see that the markets were becoming computerized because he was “tired of being fucked by the floor.” About 10 years later, he lamented to me how he was being “fucked by HFT.” Now HFT is an industry earning “normal” profits (in the economics lexicon) due to intensifying competition and technological maturation: the fuckers are fucking each other now, I guess.

One interesting public policy issue in the Peterffy story is the role played by internalization of order flow in undermining the economics of Interactive: there is also an internalization angle to the Virtu-KCG story, because one reason for Virtu to buy KCG is to obtain the latter’s juicy retail order flow. I’ve been writing about this subject (and related ones) for going on 20 years, and it’s complicated.

Internalization (and other trading in non-lit/exchange venues) reduces liquidity on exchanges, which raises trading costs there and reduces the informativeness of prices. Those factors are usually cited as criticisms of off-exchange execution, but there are other considerations. Retail order flow (likely uninformed) gets executed more cheaply, as it should, because it is less costly to serve (it poses less of an adverse selection risk). (Who benefits from this cheaper execution is a matter of controversy.) Furthermore, as I pointed out in a 2002 Journal of Law, Economics and Organization paper, off-exchange venues provide competition for exchanges that often have market power (though this is less likely post-RegNMS, which made inter-exchange competition much more intense). Finally, some (and arguably a lot of) informed trading is rent seeking: by reducing the ability of informed traders to extract rents from uninformed traders, internalization (and dark markets) reduce the incentives to invest excessively in information collection (an incentive Hirshleifer the Elder noted in the 1970s).
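A stylized zero-profit condition shows why uninformed flow is cheap to execute (my notation, a textbook-style sketch rather than anything from the paper): if a fraction $\mu$ of incoming orders is informed and each informed trade costs the market maker $L$, while every uninformed trade pays the half-spread $s/2$, breaking even requires

$$(1-\mu)\,\frac{s}{2} = \mu L \quad\Longrightarrow\quad s = \frac{2\mu L}{1-\mu}.$$

Retail flow with $\mu \approx 0$ can profitably be filled at a near-zero spread, which is why internalizers bid for it, and why siphoning it off widens the spread faced by the more informed flow left on exchanges.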

Securities and derivatives market structure is fascinating, and it presents many interesting analytical challenges. But these markets, and the firms that operate in them, are not immune to the basic forces of innovation, imitation, and entry that economists have understood for a long time (but which too many have forgotten, alas). We are seeing those forces at work in real time, and the fates of firms like Interactive and Teza, and the HFT sector overall, are living illustrations.

 


February 25, 2017

Should Social Media Be Regulated as Common Carriers?

Filed under: Economics,Politics,Regulation — The Professor @ 6:43 pm

Major social media, notably Twitter and Facebook, are gradually moving to censor what is communicated on them. In Twitter’s case, the primary stated rationale is to “protect its users from abuse and harassment.” It has also taken it upon itself to “[identify] and [collapse] potentially abusive and low-quality replies so the most relevant conversations are brought forward.” There are widespread reports that Twitter engages in “shadowbanning”, i.e., hiding the Tweets of those users it identifies as objectionable, and making these Tweets inaccessible in searches.

Further, there are suspicions that there is a political and ideological component to the filters that Twitter applies, with conservative (and especially alt-right) content and users being more likely to fall afoul of these restrictions: the relentlessly leftist tilt of CEO Jack Dorsey (and of most Twitter employees) gives considerable credence to these suspicions.

For its part, Facebook is pursuing ways to constrain users from posting what it deems as “misinformation” (aka “fake news”). This includes various measures such as cooperating with “third party fact-checking organizations“. Given the clear leftist tilt of Mark Zuckerberg and Facebook’s workforce, and the almost laughably leftist slant of the “fact-checkers”, there is also considerable reason for concern that the restrictions will not be imposed in a politically neutral way.

The off-the-top classical liberal/libertarian response to this is likely to be “well, this is unfortunate, but these are private corporations, and they can do what they want with their property.” But however superficially plausible this position appears to be, there is in fact a principled classical liberal/libertarian response that arrives at a very different conclusion. In particular, as arch-libertarian Richard Epstein (who styles himself as The Libertarian in his Hoover Institution podcast) has consistently pointed out, even during the heyday of small government, classical liberal government and law, the common law recognized that restrictions on the autonomy of certain entities were not only justifiable, but desirable. In particular, natural monopolies and near-monopolies were deemed to be “common carriers” upon whom the law imposed a duty of providing access on a non-discriminatory basis. The (classically liberal) common law of that era recognized that such entities could exercise market power, or engage in discriminatory conduct, without fear of competitive check. Thus, it imposed the obligation to serve all on a non-discriminatory basis in order to constrain the exercise of market power, or invidious discrimination based on the preferences of the owner of the common carrier.

Major social media (and Google as well–perhaps most of all) clearly have market power, and the ability to discriminate without fear of losing business to competitors. The network nature of social media (and search engines) leads to the dominance of a small number of platforms, or even one platform. Yes, there are competitors to Facebook, Twitter, and Google, but these companies are clearly dominant in their spaces, and network effects make them largely immune to competitive entry. Imposition of a common carrier-inspired obligation to provide non-discriminatory access is therefore quite reasonable, and has a substantial economic and legal foundation. Thus, libertarians and classical liberals and conservatives and even fringe voices should not resign themselves to being second or third class citizens on social media, merely because these are private entities, rather than government ones. (Indeed, the analogy should go the other direction. A major reason for limiting the ability of the government to control speech is because of its monopoly of legal violence. It is monopoly power, regardless of whether in a market or political setting, that needs to be constrained through things like rights to free speech, or non-discriminatory access to common carriers.)

Further, insofar as leftists (including the managements of the major social media companies) are concerned, it is utterly incoherent for them to assert that as private entities they are perfectly free to restrict access according to their whims, given that leftists also adamantly (indeed, obnoxiously) insist that anti-discrimination laws should be imposed on small entities operating in highly competitive environments. Specifically, leftists believe that bakers or caterers or pizzerias with zero market power should be required to serve all, even if they have religious (or other) objections to doing so. But a baker refusing to sell a wedding cake to a gay couple does not meaningfully deprive said couple of the opportunity to get a cake: there are many other bakeries, and given the trivial costs of entry, even if most incumbent bakers don’t want to serve gays, this only provides a commercial opportunity for entrant bakers to cater to the excluded clientele. Thus, discrimination by Baker A does not impose large costs on those s/he would prefer not to serve (even though forcing A to serve them might impose high costs on A, due to his/her sincere religious beliefs).

The same cannot be said of Twitter or Facebook. Given the nature of networks, social and otherwise, entrants or existing competitors are very poor substitutes for the dominant firms, which gives them the power to exclude, and which makes their exercise of this power extremely costly to the excluded.  In other words, if one believes that firms in highly competitive markets should be obligated to provide service/access to all on a non-discriminatory basis, one must concede that the Twitters, Facebooks, and Googles of the world should be similarly obligated, and that given their market power their conduct should be subject to a substantially higher degree of scrutiny than a small firm in a competitive market.

Of course, it is one thing to impose de jure an obligation on Twitter et al to provide equal access and equal treatment to all, regardless of political beliefs, and quite another to enforce it de facto. Of course Jack and Mark or Sergey don’t say “we discriminate against those holding contrary political opinions.” No, they couch their actions in terms of “protecting against abusive behavior and hate speech” or “stamping out disinformation.” But they retain the discretion to interpret what is abusive, hateful, and false–and it is clear that they consider much mainstream non-leftist belief as beyond the pale. Hence, enforcement of an open non-discriminatory access obligation would be difficult, and would inevitably involve estimation of discriminatory outcomes using statistical measures, a fraught exercise (as employment discrimination law demonstrates). Given the very deep pockets that these firms have, moreover, prevailing in a legal battle would be very difficult.

But this is a practical obstacle to treating social media like common carriers with a duty to provide non-discriminatory access. It is not a reason for classical liberals and libertarians to concede to dominant social network operators that they have an unrestricted right to restrict access as a matter of principle. In fact, the classical liberal/libertarian principle cuts quite the other way. And at the very least, imposing a common carrier-like obligation would substantially raise the cost that social network operators would pay to indulge in discrimination based on politics, beliefs, or ideology, and this could go a long way to make these places safe for the expression of political opinions that drive Jack, Mark, et al, nuts.

 


February 20, 2017

Trolling Brent

Filed under: Commodities,Derivatives,Economics,Energy,Regulation — The Professor @ 10:14 am

Platts has announced the first major change in the Brent crude assessment process in a decade, adding Troll crude to the “Brent” stream:

A decline in supply from North Sea fields has led to concerns that physical volumes could become too thin and hence at times could be accumulated in the hands of just a few players, making the benchmark vulnerable to manipulation.

Platts said on Monday it would add Norway’s Troll crude to the four British and Norwegian crudes it already uses to assess dated Brent from Jan. 1, 2018. This will join Brent, Forties, Oseberg and Ekofisk, or BFOE as they are known.

This is likely a stopgap measure, and Platts is considering more radical moves in the future:

It is also investigating a more radical plan to account for a possible larger drop-off in North Sea output over the next decade that would allow oil delivered from as far afield as west Africa and Central Asia to contribute to setting North Sea prices.

But the move is controversial, as this from the FT article shows:

If this is not addressed first, one source at a big North Sea trader said, the introduction of another grade to BFOE could make “an assessment that is unhedgeable, hence not fit for purpose”. “We don’t see any urgency to add grades today,” he added.

Changes to Brent shift the balance of power in North Sea trading. The addition of Troll makes Statoil the biggest contributor of supplies to the grades supporting Brent, overtaking Shell. Some big North Sea traders had expressed concern Statoil would have an advantage in understanding the balance of supply and demand in the region as it sends a large amount of Troll crude to its Mongstad refinery, Norway’s largest.

The statement about “an assessment that is unhedgeable, hence not fit for purpose” is BS, and exactly the kind of thing one always hears when contracts are redesigned. The fact is that contract redesigns have distributive effects, even if they improve a contract’s functioning, and the losers always whinge. Part of the distributive effect relates to issues like giving a company like Statoil an edge . . . that previously Shell and the other big North Sea producers had. But part of the distributive effect is that a contract with inadequate deliverable supply is a playground for big traders, who can more easily corner, squeeze, and hug such a contract.

Insofar as hedging is concerned, the main issue is how well the Brent contract performs as a hedge (and a pricing benchmark) for out-of-position (i.e., non-North Sea) crude, which represents the main use of Brent paper trades. Reducing deliverable supply constraints which contribute to pricing anomalies (and notably, anomalous moves in the basis) unambiguously improves the functioning of the contract for out-of-position players. Yeah, those hedging BFOE get slightly worse hedging performance, but that is a trivial consideration given that the very reason for changing the benchmark is the decline in BFOE production–which now represents less than 1 percent of world output. Why should the hair on the end of the tail wag the dog?

Insofar as the competition with WTI is concerned, the combination of larger US supplies, the construction of pipelines to move supplies from the Midcontinent (PADD II) to the Gulf (PADD III), and the lifting of the export ban have restored and in fact strengthened the connection of WTI prices to seaborne crude prices. US barrels are now going to both Europe and Asia, and US crude has effectively become the marginal barrel in most major markets, meaning that it is determining price and that WTI is an effective hedge (especially for the lighter grades). And by the way, the WTI delivery mechanism is much more robust and transparent than the baroque (and at times broken) Brent pricing mechanism.

As if to add an exclamation point to the story, Bloomberg reports that in recent months Shell has been bigfooting–or would that be trolling?–the market with big trades that have arguably distorted spreads. It got to the point that even firms like Vitol (which are notoriously loath to call foul, lest someone point fingers at them) raised the issue with Shell:

While none of those interviewed said Shell did anything illegal, they said the company violated the unspoken rules governing the market, which is lightly regulated. Executives of several trading rivals, including Vitol Group BV, the world’s top independent oil merchant, raised objections with counterparts at Shell last year, according to market participants.

What are the odds that Mr. Fit for Purpose is a Shell trader?

All of this is as I predicted, almost six years ago, when everyone was shoveling dirt on WTI and declaring Brent the Benchmark of the Forever Future:

Which means that those who are crowing about Brent today, and heaping scorn on WTI, will be begging for WTI’s problems in a few years.  For by then, WTI’s issues will be fixed, and it will be sitting astride a robust flow of oil tightly interconnected with the nexus of world oil trading.  But the Brent contract will be an inverted paper pyramid, resting on a thinner and thinner point of crude production.  There will be gains from trade–large ones–from redesigning the contract, but the difficulties of negotiating an agreement among numerous big players will prove nigh on to impossible to surmount.  Moreover, there will be no single regulator in a single jurisdiction that can bang heads together (for yes, that is needed sometimes) and cajole the parties toward agreement.

So Brent boosters, enjoy your laugh while it lasts.  It won’t last long, and remember, he who laughs last laughs best.

That’s exactly how things have worked out, even down to the point about the difficulties of getting the big boys to play together (a lesson gained through extensive personal experience, some of which is detailed in the post). Just call me Craignac the Magnificent. At least when it comes to commodity contract design 😉


February 14, 2017

“First, Kill All the Economists!” Sounds Great to Some, But It Won’t Fix Monetary Policy

Filed under: Economics,Financial crisis,Financial Crisis II,History,Regulation — The Professor @ 9:00 pm

A former advisor to the Dallas Fed has penned a book blasting the Fed for being ruled by a “tribe” of insular egghead economics PhDs:

In her book, Ms. Booth describes a tribe of slow-moving Fed economists who dismiss those without high-level academic credentials. She counts Fed Chairwoman Janet Yellen and former Fed leader Ben Bernanke among them. The Fed, Mr. Bernanke and the Dallas Fed declined to comment.

The Fed’s “modus operandi” is defined by “hubris and myopia,” Ms. Booth writes in an advance copy of the book. “Central bankers have invited politicians to abdicate leadership authority to an inbred society of PhD academics who are infected to their core with groupthink, or as I prefer to think of it: ‘groupstink.’”

“Global systemic risk has been exponentially amplified by the Fed’s actions,” Ms. Booth writes, referring to the central bank’s policies holding interest rates very low since late 2008. “Who will pay when this credit bubble bursts? The poor and middle class, not the elites.”

Ms. Booth is an acolyte of her former boss, Dallas Fed president Richard Fisher, who said: “If you rely entirely on theory, you are not going to conduct the right policy, because policies have consequences.”

I have very mixed feelings about this. There is no doubt that under the guidance of academics, including (but not limited to) Ben Bernanke, the Fed has made some grievous errors. But it is a false choice to claim that Practical People can do better without a coherent theoretical framework. For what is the alternative to theory? Heuristics? Rules of thumb? Experience?

Two thinkers usually in conflict–Keynes and Hayek–were of one mind on this issue. Keynes famously wrote:

Practical men who believe themselves to be quite exempt from any intellectual influence, are usually the slaves of some defunct economist. Madmen in authority, who hear voices in the air, are distilling their frenzy from some academic scribbler of a few years back.

For his part, Hayek said “without a theory the facts are silent.”

Everybody–academic economist or no–is beholden to some theory or another. It is a conceit of non-academics to believe that they are “exempt from any intellectual influence.” Indeed, the advantage of following an explicit theoretical framework is that its assumptions and implications are transparent and (usually) testable, and therefore can be analyzed, challenged, and improved. An inchoate and largely informal “practical” mindset (which often is a hodgepodge of condensed academic theories) is far more amorphous and difficult to understand or challenge. (Talk to a trader about monetary policy sometime if you doubt me.)

Indeed, Ms. Booth gives evidence of this. Many have been prophesying doom as a result of the Fed’s (and the ECB’s) post-2008 policies: Ms. Booth is among them. I will confess to having harbored such concerns, and indeed, I challenged Ben Bernanke on this at a Fed conference on Jekyll Island in May 2009. It may happen sometime, and I believe that ZIRP has indeed distorted the economy, but my fears (and Ms. Booth’s) have not been realized in eight-plus years.

Ms. Booth’s critique of pre-crisis Fed policy is also predicated on a particular theoretical viewpoint, namely, that the Fed fueled a credit bubble prior to the Crash. But as scholars as diverse as Scott Sumner and John Taylor have argued, Fed policy was actually too tight prior to the crisis.

Along these lines, one could argue that the Fed’s most egregious errors are not the consequence of deep DSGE theorizing, but instead result from the use of rules of thumb and a failure to apply basic economics. As Scott Sumner never tires of saying (and sadly, must keep repeating, because those who are slaves to the rule of thumb are hard of hearing and learning), the near-universal practice of using interest rates as a measure of the state of monetary policy is a category error. Befitting a Chicago-trained economist, Scott cautions never to argue from a price change, but to look for the fundamental supply and demand forces that cause a price (e.g., an interest rate) to be high or low. (As a Chicago guy, I have been beating the same drum for more than 30 years.)

And some historical perspective is in order. The Fed’s history is a litany of fumbles, some relatively minor, others egregious. Blame for the Great Depression and the Great Inflation can be laid directly at the Fed’s feet. Its most notorious failings were not driven by the prevailing academic fashion, but occurred under the leadership of practical people, mainly people with a banking background,  who did quite good impressions of madmen in authority. Ms. Booth bewails the “hubris of Ph.D. economists who’ve never worked on the Street or in the City,” but people who have worked there have screwed up monetary policy when they’ve been in charge.

As tempting as it may sound, “First, kill all the economists!” is not a prescription for better monetary policy. Economists may succumb to hubris (present company excepted, of course!) but the real hubris is rooted in the belief that central banks can overcome the knowledge problem, and can somehow manage entire economies (and the stability of the financial system). Hayek pointedly noted the “fatal conceit” of central planning. That conceit is inherent in central banking, too, and is not limited to professionally trained economists. Indeed, I would venture that academics are less vulnerable to it.

The problem, therefore, is not who captains the monetary ship. The question is whether anyone is capable of keeping such a huge and unwieldy vessel off the shoals. Experience–and theory!–suggests no.

 


February 11, 2017

Risk Gosplan Works Its Magic in Swaps Clearing

Filed under: Clearing,Commodities,Derivatives,Economics,Politics,Regulation — The Professor @ 4:18 pm

Deutsche Bank quite considerately provided a real time example of an unintended consequence of Frankendodd, specifically, capital requirements causing firms to exit from clearing. The bank announced it is continuing to provide futures clearing, but is exiting US swaps clearing, due to capital cost concerns.

Although Deutsche was not specific in citing the treatment of margins under the leverage ratio as the reason for its exit, this is the most likely culprit. Recall that even segregated margins (which a bank has no access to) are treated as bank assets under the leverage rule, so a swaps clearer must hold capital against assets over which it has no control (because all swap margins are segregated), which it cannot use to fund its own activities, and which are not funded by a liability issued by the clearer.
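To put illustrative numbers on it (mine, not Deutsche’s): under a supplementary leverage ratio of 5%, counting segregated client margin as exposure means that every $10 billion of margin forces the clearer to hold

$$\Delta K = 0.05 \times \$10\text{B} = \$500\text{M}$$

of additional capital against money it cannot touch, cannot invest, and did not fund with its own liabilities.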

It’s perverse, and is emblematic of the mixed signals in Frankendodd: CLEAR SWAPS! CLEARING SWAPS IS EXTREMELY CAPITAL INTENSIVE SO YOU WON’T MAKE ANY MONEY DOING IT! Yeah. That will work out swell.

Of course Deutsche Bank has its own issues, and because of those issues it faces more acute capital concerns than other institutions (especially American ones). But here is a case where the capital cost does not at all match up with risk (and remember that capital is intended to be a risk absorber). So, looking for ways to economize on capital, Deutsche exited a business where the capital charge did not generate any commensurate return, and furthermore was unrelated to the actual risk of the business. If the pricing of risk had been more sensible, Deutsche might instead have scaled back other businesses where capital charges reflected risk more accurately. Here, the effect of the leverage ratio is all pain, no gain.

When interviewed by Risk Magazine about the Fundamental Review of the Trading Book, I said: “The FRTB’s standardised approach is basically central planning of risk pricing, and it will produce Gosplan-like results.” The leverage ratio, especially as applied to swaps margins, is another example of central planning of risk pricing, and here indeed it has produced Gosplan-like results.

And in the case of clearing, these results are exactly contrary to a crucial ostensible purpose of DFA: reducing size and concentration in banking generally, and in derivatives markets in particular. For as the FT notes:

The bank’s exit will reignite concerns that the swaps clearing business is too concentrated among a handful of large players. The top three swaps clearers account for more than half the market by client collateral required, while the top five account for over 75 per cent.

So swaps clearing is now hyper-concentrated, and dominated by a handful of systemically important banks (e.g., Citi, Goldman). It is more concentrated than the bilateral swaps dealer market was. Trouble at one of these dominant swaps clearers would create serious risks for the CCPs they clear for (which, by the way, are all interconnected because the same clearing members dominate all the major CCPs). Moreover, concentration dramatically reduces the benefits of mutualizing risk: because of the small number of clearers, the risk of a big CM failure will be borne by a small number of firms. This isn’t insurance in any meaningful way, and does not achieve the benefits of risk pooling, even if in the first instance only a single big clearing member runs into trouble due to a shock idiosyncratic to it.
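A back-of-the-envelope way to see the pooling point (stylized, with independence assumed purely for illustration): if $N$ surviving members split a default loss equally, each bears $L/N$, and with independent member shocks the per-member risk falls like

$$\sigma\left(\frac{1}{N}\sum_{j=1}^{N} L_j\right) = \frac{\sigma_L}{\sqrt{N}},$$

so mutualization diversifies meaningfully when $N$ is in the dozens. With the three to five firms that now dominate clearing, each survivor’s share of a big hit is enormous, and the same few balance sheets stand behind every major CCP.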

At present, there is much gnashing of teeth and rending of garments at the prospect of even tweaks in Dodd-Frank. Evidently, the clearing mandate is not even on the table. But this one vignette demonstrates that Frankendodd and banking regulation generally is shot through with provisions intended to reduce systemic risk which do not have that effect, and indeed, likely have the perverse effect of creating some systemic risks. Viewing Dodd-Frank as a sacred cow and any proposed change to it as a threat to the financial system is utterly wrongheaded, and will lead to bad outcomes.

Barney and Chris did not come down Mount Sinai with tablets containing commandments written by the finger of God. They sat on Capitol Hill and churned out hundreds of pages of laws based on a cartoonish understanding of the financial system, information provided by highly interested parties, and a frequently false narrative of the financial crisis. These laws, in turn, have spawned thousands of pages of regulation, good, bad, and very ugly. What is happening in swaps clearing is very ugly indeed, and provides a great example of how major portions of Dodd-Frank and the regulations emanating from it need a thorough review and in some cases a major overhaul.

And if Elizabeth Warren loses her water over this: (a) so what else is new? and (b) good! Her Manichean view of financial regulation is a major impediment to getting the regulation right. What is happening in swaps clearing is a perfect illustration of why a major midcourse correction in the trajectory of financial regulation is imperative.


February 4, 2017

The Regulatory Road to Hell

One of the most encouraging aspects of the new administration is its apparent commitment to rolling back a good deal of regulation. Pretty much the entire gamut of regulation is under examination, and even Trump’s nominee for the Supreme Court, Neil Gorsuch, represents a threat to the administrative state due to his criticism of Chevron Deference (under which federal courts are loath to question the substance of regulations issued by US agencies).

The coverage of the impending regulatory rollback is less than informative, however. Virtually every story about a regulation under threat frames the issue around the regulation’s intent. The Fiduciary Rule “requires financial advisers to act in the best interests of their clients.” The Stream Protection Rule prevents companies from “dumping mining waste into streams and waterways.” The SEC rule on reporting of payments to foreign governments by energy and minerals firms “aim[s] to address the ‘resource curse,’ in which oil and mineral wealth in resource-rich countries flows to government officials and the upper classes, rather than to low-income people.” Dodd-Frank is intended to prevent another financial crisis. And on and on.

Who could be against any of these things, right? This sort of framing therefore makes those questioning the regulations out to be ogres, or worse, favoring financial skullduggery, rampant pollution, bribery and corruption, and reckless behavior that threatens the entire economy.

But as the old saying goes, the road to hell is paved with good intentions, and that is definitely true of regulation. Regulations often have unintended consequences–many of which are directly contrary to the stated intent. Furthermore, regulations entail costs as well as benefits, and just focusing on the benefits gives a completely warped understanding of the desirability of a regulation.

Take Frankendodd. It is bursting with unintended consequences. Most notably, and quite predictably (and predicted here, early and often), the huge increase in regulatory overhead actually favors consolidation in the financial sector, and reinforces the TBTF problem. It has also been devastating to smaller community banks.

DFA also works at cross purposes. Consider the interaction between the leverage ratio, which is intended to ensure that banks are sufficiently capitalized, and the clearing mandate, which is intended to reduce systemic risk arising from the derivatives markets. The interpretation of the leverage ratio (notably, treating customer margins held by FCMs as an FCM asset, which increases the amount of capital the FCM must hold under the leverage ratio) makes offering clearing services more expensive. This is exacerbating the marked consolidation among FCMs, which is contrary to the stated purpose of Dodd-Frank. Moreover, it means that some customers will not be able to find clearing firms, or will find using derivatives to manage risk prohibitively expensive. This undermines the ability of the derivatives markets to allocate risk efficiently.
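
To make the capital arithmetic concrete, here is a stylized sketch. The 5 percent ratio and the dollar figures are assumptions chosen for illustration (the applicable leverage ratio varies by jurisdiction and institution); they are not any actual FCM’s numbers:

```python
# Stylized capital arithmetic for an FCM under the leverage ratio.
# Both the 5% ratio and the dollar figures are assumptions for
# illustration only, not any actual institution's numbers.

MIN_LEVERAGE_RATIO = 0.05        # assumed supplementary leverage ratio

own_exposures   = 40e9           # hypothetical FCM balance-sheet exposures
customer_margin = 10e9           # hypothetical segregated client margin held

# Capital required if client margin is excluded from the exposure measure
capital_excl = MIN_LEVERAGE_RATIO * own_exposures
# Capital required if client margin counts as an FCM asset/exposure
capital_incl = MIN_LEVERAGE_RATIO * (own_exposures + customer_margin)

extra = capital_incl - capital_excl
print(f"Capital, margin excluded: ${capital_excl / 1e9:.2f}bn")
print(f"Capital, margin included: ${capital_incl / 1e9:.2f}bn")
print(f"Extra capital from client clearing: ${extra / 1e9:.2f}bn "
      f"({extra / capital_excl:.0%} more)")
```

The extra capital is attributable solely to holding client collateral, arguably without a commensurate increase in the FCM’s own risk, which is why this interpretation makes client clearing more expensive to offer.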

Therefore, to describe regulations by their intentions, rather than their effects, is highly misleading. Many of the effects are unintended, and directly contrary to the explicit intent.

One of the effects of regulations is that they impose costs, both direct and indirect. A realistic appraisal of regulation requires a thorough evaluation of both benefits and costs. Such evaluations are almost completely lacking in the media coverage, except to cite some industry source complaining about the cost burden. But in the context of most articles, this comes off as special pleading, and therefore suspect.

Unfortunately, much cost-benefit analysis–especially that carried out by the regulatory agencies themselves–is a bad joke. Indeed, since the agencies in question often have an institutional or ideological interest in their regulations, their “analyses” should be treated as a form of special pleading, of little more reliability than the complaints of the regulated. The proposed position limits regulation provides one good example: costs are defined extremely narrowly, benefits very broadly, and indirect impacts are almost completely ignored.
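
A toy calculation shows how much these scoping choices matter; every number below is invented purely for illustration:

```python
# Toy cost-benefit calculation showing how scoping drives the bottom
# line. Every figure below is invented purely for illustration.

direct_costs     = 100   # compliance costs the agency counts
indirect_costs   = 250   # liquidity, pass-through, and exit costs often ignored
direct_benefits  = 120   # benefits that can actually be verified
diffuse_benefits = 200   # broadly defined, hard-to-verify benefits

# Agency-style scoping: narrow costs, broad benefits
agency_net = (direct_benefits + diffuse_benefits) - direct_costs

# Skeptic's scoping: verifiable benefits, full costs
skeptic_net = direct_benefits - (direct_costs + indirect_costs)

print(f"Agency scoping:  net benefit = {agency_net:+d}")   # +220: rule 'passes'
print(f"Skeptic scoping: net benefit = {skeptic_net:+d}")  # -230: rule fails
```

Same rule, same world; the sign of the bottom line is entirely an artifact of what the analyst chooses to count.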

As another example, Tyler Cowen takes a look into the risible cost-benefit analysis behind the Stream Protection Rule, and finds it seriously wanting. Even though he is sympathetic to the goals of the regulation, and even to its largely tacit but very real meta-intent (reducing the use of coal in order to advance the climate change agenda), he is repelled by the shoddiness of the analysis.

Most agency cost-benefit analysis is analogous to asking pupils to grade their own work, and gosh darn it, wouldn’t you know, everybody’s an A student!

This is particularly problematic under Chevron Deference, because courts seldom evaluate the substance of the regulations or the regulators’ analyses. There is no real judicial check and balance on regulators.

The metastasizing regulatory and administrative state is a very real threat to economic prosperity and growth, and to individual freedom. The lazy habit of describing regulations and regulators by their intent, rather than their effects, shields them from the skeptical scrutiny that they deserve, and facilitates this dangerous growth. If the Trump administration and Congress proceed with their stated plans to pare back the Obama administration’s myriad and massive regulatory expansion, this intent-focused coverage will be one of the biggest obstacles that they will face. The media is the regulators’ most reliable paving contractor for the highway to hell.


January 24, 2017

Two Contracts With No Future

Filed under: China,Commodities,Derivatives,Economics,Energy,Exchanges,Politics,Regulation — The Professor @ 7:14 pm

Over the past couple of days two major futures exchanges have pulled the plug on contracts. I predicted these outcomes when the contracts were first announced, and the reasons I gave turned out to be the reasons given for the decisions.

First, the CME announced that it is suspending trading in its new cocoa contract, due to lack of volume/liquidity. I analyzed that contract here. This is just another example of failed entry by a futures contract. Not really news.

Second, the Shanghai Futures Exchange has quietly shelved plans to launch a China-based oil contract. When it was first mooted, I expressed extreme skepticism, due mainly to China’s overwhelming tendency to intervene in markets that send the wrong signal–wrong from the government’s perspective, that is:

Then the crash happened, and China thrashed around looking for scapegoats, and rounded up the usual suspects: Speculators! And it suspected that the CSI 300 Index and CSI 500 Index futures contracts were the speculators’ weapons of mass destruction of choice. So it labeled trades bigger than 10 (!) contracts “abnormal”–and we know what happens to people in China who engage in unnatural financial practices! It also increased fees four-fold, and bumped up margin requirements.

The end result? Success! Trading volumes declined 99 percent. You read that right. 99 percent. Speculation problem solved! I’m guessing that the fear of prosecution for financial crimes was by far the biggest contributor to that drop.

. . . .

And the crushing of the CSI300 and CSI500 contracts will impede development of a robust oil futures market. The brutal killing of these contracts will make market participants think twice about entering positions in a new oil futures contract, especially long-dated ones (which are an important part of the CME/NYMEX and ICE markets). Who wants to get into a position in a market that may be all but shut down when the market sends the wrong message? This could be the ultimate roach motel: traders can check in, but they can’t check out. Or the Chinese equivalent of Hotel California: traders can check in, but they can never leave. So traders will be reluctant to check in in the first place. Ironically, moreover, this will encourage the in-and-out day trading that the Chinese authorities say they condemn: you can’t get stuck in a position if you don’t hold a position.

In other words, China has a choice. It can choose to allow markets to operate in fair economic weather or foul, and thereby encourage the growth of robust contracts in oil or equities. Or it can choose to squash markets during economic storms, and impede their development even in good times.

I do not see how, given the absence of the rule of law and the just-demonstrated willingness to intervene ruthlessly, China can credibly commit to a policy of non-intervention going forward. And because of this, it will stunt the development of its financial markets, and its economic growth. Unfettered power and control have a price. [Emphasis added.]

And that’s exactly what has happened. Per Reuters’ Clyde Russell:

The quiet demise of China’s plans to launch a new crude oil futures contract shows the innate conflict of wanting the financial clout that comes with being the world’s biggest commodity buyer, but also seeking to control the market.

. . . .

The main issues were concerns by international players about trading in yuan, given issues surrounding convertibility back to dollars, and also the risks associated with regulation in China.

The authorities in Beijing have established a track record of clamping down on commodity trading when they feel the market pricing is driven by speculation and has become divorced from supply and demand fundamentals.

On several occasions last year, the authorities took steps to crack down on trading in then hot commodities such as iron ore, steel and coal.

While these measures did have some success in cooling markets, they are generally anathema to international traders, who prefer to accept the risk of rapid reversals in order to enjoy the benefits of strong rallies.

It’s likely that while the INE could design a crude futures contract that would on paper tick all the right boxes, it would battle to overcome the trust deficit that exists between the global financial community and China.

What international banks and trading houses will want to see before they throw their weight behind a new futures contract is evidence that Beijing won’t interfere in the market to achieve outcomes in line with its policy goals.

It will be hard, but not impossible, to guarantee this, with the most plausible solution being the establishment of some sort of free trade zone in which the futures contract could be legally housed.

Don’t hold your breath.

It is also quite interesting to contemplate this after all the slobbering over Xi’s Davos speech. China is protectionist and has an overwhelming predilection to intervene in markets when they don’t give the outcomes desired by the government/Party. It is not going to be a leader in openness and markets. Anybody whose obsession with Trump leads them to ignore this fundamental fact is truly a moron.