Streetwise Professor

April 14, 2017

SWP Climbs The Hill

Filed under: Derivatives,Economics,Exchanges,Regulation — The Professor @ 10:40 am

I have become a regular contributor to The Hill. My inaugural column on the regulation of spoofing is here. The argument in a nutshell is that: (a) spoofing involves large numbers of cancellations, but so do legitimate market making strategies, so there is a risk that aggressive policing of spoofing will wrongly penalize market makers, thereby raising the costs of supplying liquidity; (b) the price impacts of spoofing are very, very small, and transitory; (c) enforcement authorities sometimes fail to pursue manipulations that have far larger price impacts; therefore (d) a focus on spoofing is a misdirection of scarce enforcement resources.

My contributions will focus on finance and regulatory issues. So those looking for my trenchant political commentary will have to keep coming here 😉

Click early! Click often!

Print Friendly

April 4, 2017

The Unintended Consequences of Blockchain Are Not Unpredictable: Respond Now Rather Than Repent Later*

Filed under: Clearing,Commodities,Derivatives,Economics,Regulation — The Professor @ 3:39 pm

In the past week the WSJ and the FT have run articles about a new bank-led initiative to move commodity trading onto a blockchain. In many ways, this makes great sense. By their nature, commodity trades and shipments (a) collectively involve large numbers of spatially dispersed counterparties, (b) have myriad terms, and (c) can give rise to costly disputes. As a result, the process of recording and trading them is currently very labor intensive, fraught with operational risk (e.g., inadvertent errors), and vulnerable to fraud (cf. the Qingdao metals warehouse scandal of 2014). In theory, blockchain can reduce costs, errors, and fraud. Thus, it is understandable that traders and banks are quite keen on its potential to reduce costs and perhaps even revolutionize the trading business.

But before you get too excited, a remark by my friend Christophe Salmon at Trafigura carries deep implications that should give you pause, and lead you to consider the likely consequences of widespread adoption of blockchain:

Christophe Salmon, Trafigura’s chief financial officer, said there would need to be widespread adoption by major oil traders and refiners to make blockchain in commodity trading viable in the long term.

This seemingly commonsense and innocuous remark is actually laden with implications of unintended consequences that should be recognized and considered now, before the blockchain train gets too far down the track.

In essence, Christophe’s remark means that to be viable blockchain has to scale. If it doesn’t scale, it won’t reduce cost. But if it does scale, a blockchain for a particular application is likely to be a natural monopoly, or at most a natural duopoly. (Issues of scope economies are also potentially relevant, but I’ll defer discussion of that for now.)

Indeed, if there are no technical impediments to scaling (which in itself is an open question–note the block size debate in Bitcoin), the “widespread adoption” feature that Christophe identifies as essential means that network effects create scale economies that are likely to result in the dominance of a single platform. Traders will want to record their business on the blockchain that their counterparties use. Since many trade with many, this creates a centripetal force that will tend to draw everyone to a single blockchain.
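The centripetal dynamic can be illustrated with a toy adoption model (my sketch, not anything from the articles): if each trader simply joins the platform where most of the others already are, with a little random experimentation thrown in, almost any starting point tips quickly to a single dominant platform.

```python
import random

def simulate_adoption(n_traders=200, n_platforms=3, rounds=50, seed=7):
    """Toy network-effects model: each round, every trader re-picks the
    platform with the largest existing pool of counterparties (the network
    benefit), with a small chance of experimenting. Illustrates tipping
    toward a single platform; parameters are arbitrary assumptions."""
    rng = random.Random(seed)
    choice = [rng.randrange(n_platforms) for _ in range(n_traders)]
    for _ in range(rounds):
        counts = [choice.count(p) for p in range(n_platforms)]
        for i in range(n_traders):
            if rng.random() < 0.05:
                # occasional experimentation keeps rivals barely alive
                choice[i] = rng.randrange(n_platforms)
            else:
                # join the platform where the most counterparties are
                choice[i] = max(range(n_platforms), key=lambda p: counts[p])
    return [choice.count(p) for p in range(n_platforms)]

shares = simulate_adoption()
```

Even starting from a roughly even three-way split, the overwhelming majority of traders end up on one platform; only the experimenters keep the others from emptying entirely.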

I can hear you say: “Well, if there is a public blockchain, that happens automatically because everyone has access to it.” But the nature of public blockchains means that they face extreme obstacles that make them wildly impractical for commercial adoption on the scale being considered, not just in commodity markets, but in virtually every aspect of the financial markets. Commercial blockchains will be centrally governed, limited-access, private systems rather than radically decentralized, open-access commons.

The “forking problem” alone is a difficulty. As demonstrated by Bitcoin in 2013 and Ethereum in 2016, public blockchains based on open source are vulnerable to “forking,” whereby uncoordinated changes in the software (inevitable in an open source system that lacks central governance and coordination) result in the simultaneous existence of multiple, parallel blockchains. Such forking would destroy the network economy/scale effects that make the idea of a single database attractive to commercial participants.

Prevention of forking requires central governance to coordinate changes in the code–something that offends the anarcho-libertarian spirits who view blockchain as a totally decentralized mechanism.

Other aspects of the pure version of an open, public blockchain make it inappropriate for most financial and commercial applications. For instance, public blockchain is touted because it does not require trust in the reputation of large entities such as clearing networks or exchanges. But the ability to operate without trust does not come for free.

Trust and reputation are indeed costly: as Becker and Stigler first noted decades ago, and others have formalized since, reputation is a bonding mechanism that requires the trusted entity to incur sunk costs that would be lost if it violates trust. (Alternatively, the trusted entity has to have market power–which is costly–that generates a stream of rents that is lost when trust is violated. That is, to secure trust prices have to be higher and output lower than would be necessary in a zero transactions cost world.)
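The Becker-Stigler bonding logic reduces to a one-line condition: the trusted entity stays honest only if the present value of its rent stream (rent per period over the discount rate, treating it as a perpetuity) exceeds the one-shot gain from violating trust. A minimal sketch with hypothetical numbers:

```python
def honesty_sustainable(rent_per_period, discount_rate, cheating_gain):
    """Becker-Stigler style condition: cheating forfeits the future rent
    stream, so honesty holds iff the PV of rents (a perpetuity,
    rent/discount_rate) is at least the one-shot gain from cheating."""
    return rent_per_period / discount_rate >= cheating_gain

# Hypothetical numbers: $2m/yr of rents at a 10% discount rate is a
# $20m bond, which deters one-shot gains up to $20m but no more.
ok = honesty_sustainable(2.0, 0.10, 15.0)    # deterred
bad = honesty_sustainable(2.0, 0.10, 25.0)   # not deterred
```

The condition makes the cost explicit: supporting trust requires a rent stream, i.e., prices above (and output below) the zero-transactions-cost benchmark.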

But public blockchains have not been able to eliminate trust without cost. In Bitcoin, trust is replaced with “proof of work.” Well, work means cost. The blockchain mining industry consumes vast amounts of electricity and computing power in order to prove work. It is highly likely that the cost of creating trusted entities is lower than the cost of proof of work or alternative ways of eliminating the need for trust. Thus, a (natural monopoly) commercial blockchain is likely to have to be a trusted centralized institution, rather than a decentralized anarchist’s wet-dream.
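To see why “work means cost,” here is a stripped-down proof-of-work loop of the Bitcoin variety (a sketch, not Bitcoin’s actual mining code): finding a nonce whose hash clears the difficulty target takes, on average, a number of hash evaluations that grows exponentially in the difficulty, and each evaluation burns real compute and electricity.

```python
import hashlib

def mine(payload: bytes, difficulty_bits: int) -> int:
    """Find a nonce whose SHA-256 hash of payload+nonce falls below a
    target with `difficulty_bits` leading zero bits. Expected number of
    tries is about 2**difficulty_bits: the 'work' that replaces trust."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(payload + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# ~2**10 = ~1,000 expected hash evaluations even for this trivial setting;
# real networks use difficulties requiring quintillions of hashes per block.
nonce = mine(b"cargo manifest", 10)
```

Doubling the difficulty by one bit doubles the expected work, which is exactly why aggregate mining cost scales with the value being secured rather than vanishing.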

Blockchain is also touted as permitting “smart contracts,” which automatically execute certain actions when certain pre-defined (and coded) contingencies are met. But “smart contracts” is not a synonym for “complete contracts,” i.e., contracts in which every possible contingency is anticipated, and each party’s actions under each contingency are specified. Thus, even with smart (but incomplete) contracts, unanticipated contingencies will inevitably arise.

Parties will have to negotiate what to do under these contingencies. Given that this will usually be a bilateral bargaining situation under asymmetric information, the bargaining will be costly and sometimes negotiations will break down. Moreover, under some contingencies the smart contracts will automatically execute actions that the parties do not expect and would like to change: here, self-execution prevents such contractual revisions, or at least makes them very difficult.

Indeed, it may be the execution of the contractual feature that first makes the parties aware that something has gone horribly wrong. Here another touted feature of pure blockchain–immutability–can become a problem. The revelation of information ex post may lead market participants to desire to change the terms of their contract. Can’t do that if the contracts are immutable.
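A toy example (entirely hypothetical, and in Python rather than an actual smart-contract language) makes the rigidity concrete: the coded clause fires mechanically when its trigger is hit, and there is deliberately no amendment mechanism for the parties to invoke when that turns out to be a mistake.

```python
class TooRigid(Exception):
    """Raised when the parties try to renegotiate immutable code."""

class SmartContract:
    """Toy self-executing margin clause: if the reference price breaches
    the trigger, margin transfers automatically. There is no workable
    amend() method, mimicking immutability: the coded terms are all
    there is, however inefficient they prove ex post."""
    def __init__(self, trigger_price, margin_call):
        self.trigger_price = trigger_price
        self.margin_call = margin_call
        self.transferred = 0.0

    def on_price(self, price):
        if price < self.trigger_price:        # the coded contingency
            self.transferred += self.margin_call   # fires mechanically
        return self.transferred

    def amend(self, *args, **kwargs):
        raise TooRigid("immutable code: renegotiation not supported")

c = SmartContract(trigger_price=50.0, margin_call=10.0)
c.on_price(49.0)   # the clause executes whether or not the parties want it to
```

The execution itself may be the first notice the parties get that the contingency was mis-specified, and by then the cash has already moved.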

Paper and ink contracts are inherently incomplete too, and this is why there are centralized mechanisms to address incompleteness. These include courts, but historically bodies like stock and commodity exchanges, and merchants’ associations (in diamonds, for instance), have also helped adjudicate disputes and re-do deals that turn out to be inefficient ex post. The existence of institutions that facilitate efficient adaptation to contractual incompleteness demonstrates that in the real world, man does not live (or transact) by contract alone.

Thus, the benefits of a mechanism for adjudicating and responding to contractual incompleteness create another reason for a centralized authority for blockchain, even–or especially–blockchains with smart contracts.

Further, the blockchain (especially with smart contracts) will be a complex interconnected system, in the technical sense of the term. There will be myriad possible interactions between individual transactions recorded on the system, and these interactions can lead to highly undesirable, and entirely unpredictable, outcomes. A centralized authority can greatly facilitate the response to such crises. (Indeed, years ago I posited this as one of the reasons for integration of exchanges and clearinghouses.)

And the connections are not only within a particular blockchain. There will be connections between blockchains, and between a blockchain and other parts of the financial system. Consider for example smart contracts that in a particular contingency dictate large cash flows (e.g., margin calls) from one group of participants to another. This will lead to a liquidity shock that will affect banks, funding markets, and liquidity supply mechanisms more broadly. Since the shock can be destabilizing and lead to actions that are individually rational but systemically destructive if uncoordinated, central coordination can improve efficiency and reduce the likelihood of a systemic crisis. That’s not possible with a radically decentralized blockchain.

I could go on, but you get the point: there are several compelling reasons for centralized governance of a commercial blockchain like that envisioned for commodity trading. Indeed, many of the features that attract blockchain devotees are bugs–and extremely nasty ones–in commercial applications, especially if adopted at large scale as is being contemplated. As one individual who works on commercializing blockchain told me: “Commercial applications of blockchain will strip out all of the features that the anarchists love about it.”

So step back for a minute. First, Christophe’s point about “widespread adoption,” combined with an understanding of the network economies inherent in the financial and commercial applications of blockchain, means that blockchain is likely to be a natural monopoly in a particular application (e.g., physical oil trading), and likely across applications due to economies of scope (which plausibly exist because major market participants will transact in multiple segments, and because of the ability to use common coding across different applications, to name just two factors). Second, a totally decentralized, open access, public blockchain has numerous disadvantages in large-scale commercial applications: central governance creates value.

Therefore, commercial blockchains will be “permissioned” in the lingo of the business. That is, unlike public blockchain, entry will be limited to privileged members and their customers. Moreover, the privileged members will govern and control the centralized entity. It will be a private club, not a public commons. (And note that even the Bitcoin blockchain is not ungoverned. Everyone is equal, but the big miners–and there are now a relatively small number of big miners–are more equal than others. The Iron Law of Oligarchy applies in blockchain too.)

Now add another factor: the natural monopoly blockchain will likely not be contestable, for reasons very similar to the ones I have written about for years to demonstrate why futures and equity exchanges are typically natural monopolies that earn large rents because they are largely immune from competitive entry. Once a particular blockchain gets critical mass, there will be the lock-in problem from hell: a coordinated movement of a large set of users from the incumbent to a competitor will be necessary for the entrant to achieve the scale necessary to compete. This is difficult, if not impossible to arrange. Three Finger Brown could count the number of times that has happened in futures trading on his bad hand.

Now do you understand why banks are so keen on the blockchain? Yes, they couch it in terms of improving transactional efficiency, and it does that. But it also presents the opportunity to create monopoly financial market infrastructures that are immune from competitive entry. The past 50 years have seen an erosion of bank dominance–“disintermediation”–that has also eroded their rents. Blockchain gives the empire a chance to strike back. A coalition of banks (and note that most blockchain initiatives are driven by a bank-led cooperative, sometimes in partnership with a technology provider or providers) can form a blockchain for a particular application or applications, exploit the centripetal force arising from network effects, and gain a natural monopoly largely immune from competitive entry. Great work if you can get it. And believe me, the banks are trying. Very hard.

Left to develop on its own, therefore, the blockchain ecosystem will evolve to look like the exchange ecosystem of the 19th or early-20th centuries. Monopoly coalitions of intermediaries–“clubs” or “cartels”–offering transactional services, with member governance, and with the members reaping economic rents.

Right now regulators are focused on the technology, and (like many others) seem to be smitten with the potential of the technology to reduce certain costs and risks. They really need to look ahead and consider the market structure implications of that technology. Just as the natural monopoly nature of exchanges eventually led to intense disputes over the distribution of the benefits that they created, which in turn led to regulation (after bitter political battles), the fundamental economics of blockchain are likely to result in similar conflicts.

The law and regulation of blockchain is likely to be complicated and controversial precisely because natural monopoly regulation is inherently complicated and controversial. The yin and yang of financial infrastructure in particular is that the technology likely makes monopoly efficient, but also creates the potential for the exercise of market power (and, I might add, the exercise of political power to support and sustain market power, and to influence the distribution of rents that result from that market power). Better to think about those things now when things are still developing, than when the monopolies are developed, operating, and entrenched–and can influence the political and regulatory process, as monopolies are wont to do.

The digital economy is driven by network effects: think Google, Facebook, Amazon, and even Twitter. In addition to creating new efficiencies, these dominant platforms create serious challenges for competition, as scholars like Ariel Ezrachi and Maurice Stucke have shown:

Peter Thiel, the successful venture capitalist, famously noted that ‘Competition Is for Losers.’ That useful phrase captures the essence of many technology markets. Markets in which the winner of the competitive process is able to cement its position and protect it. Using data-driven network effects, it can undermine new entry attempts. Using deep pockets and the nowcasting radar, the dominant firm can purchase disruptive innovators.

Our new economy enables the winners to capture much more of the welfare. They are able to affect downstream competition as well as upstream providers. Often, they can do so with limited resistance from governmental agencies, as power in the online economy is not always easily captured using traditional competition analysis. Digital personal assistants, as we explore, have the potential to strengthen the winner’s gatekeeper power.

Blockchain will do the exact same thing.

You’ve been warned.

*My understanding of these issues has benefited greatly from many conversations over the past year with Izabella Kaminska, who saw through the hype well before pretty much anyone. Any errors herein are of course mine.


March 27, 2017

Seeing the OTC Derivatives Markets (and the Financial Markets) Like a State

Filed under: Clearing,Derivatives,Economics,Regulation — The Professor @ 12:07 pm

In the years since the financial crisis, and in particular the period preceding and immediately following the passage of Frankendodd, I can’t tell you how many times I saw diagrams that looked like this:

[Diagrams: a tangle of bilateral links in an OTC derivatives network; a hub-and-spoke network with a CCP at the center]

The top diagram is a schematic representation of an OTC derivatives market, with a tangle of bilateral connections between counterparties. The second is a picture of a hub-and-spoke trading network with a CCP serving as the hub. (These particular versions of this comparison are from a 2013 Janet Yellen speech.)
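Part of the second diagram’s visual appeal is simple arithmetic (my gloss, not Yellen’s): a fully bilateral web among N counterparties has N(N-1)/2 links, while a hub-and-spoke network has only N.

```python
def bilateral_links(n):
    # every pair of counterparties has its own bilateral relationship
    return n * (n - 1) // 2

def ccp_links(n):
    # each counterparty faces only the hub (the CCP)
    return n

# 16 dealers: 120 bilateral relationships collapse to 16 spokes
pairs = bilateral_links(16)
spokes = ccp_links(16)
```

The quadratic-to-linear collapse is exactly what makes the hub picture so legible to a regulator, and exactly what the post argues is a dangerously incomplete representation of the system being regulated.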

These diagrams came to mind when re-reading James Scott’s Seeing Like a State and his Two Cheers for Anarchism. Scott argues that states have an obsession with making the societies they rule over “legible” in order to make them easier to tax, regulate, and control. States are confounded by evolved complexity and emergent orders: such systems are difficult to comprehend, and what cannot be comprehended cannot be easily ruled. So states attempt to impose schemes to simplify such complex orders. Examples that Scott gives include standardization of language and suppression of dialects; standardization of land tenure, measurements, and property rights; cadastral censuses; population censuses; the imposition of familial names; and urban renewal (e.g., Haussmann’s/Napoleon III’s massive reconstruction of Paris). These things make a populace easier to tax, conscript, and control.

Complex realities of emergent orders are too difficult to map. So states conceive of a mental map that is legible to them, and then impose rules on society to force it to conform with this mental map.

Looking back at the debate over OTC markets generally, and clearing, centralized execution, and trade reporting in particular, it is clear that legislators and regulators (including central banks) found these markets to be illegible. Figures like the first one–which are themselves a greatly simplified representation of OTC reality–were bewildering and disturbing to them. The second figure was much more comprehensible, and much more comforting: not just because they could comprehend it better, but because it gave them the sense that they could impose an order that would be easier to monitor and control. The emergent order was frightening in its wildness: the sense of imposing order and control was deeply comforting.

But as Scott notes, attempts to impose control on emergent orders (which in Scott’s books include both social and natural orders, e.g., forests) themselves carry great risks because although hard to comprehend, these orders evolved the way they did for a reason, and the parts interact in poorly understood–and sometimes completely not understood–ways. Attempts to make reality fit a simple mental map can cause the system to react in unpredicted and unpredictable ways, many of which are perverse.

My criticism of the attempts to “reform” OTC markets was largely predicated on my view that the regulators’ simple mental maps did great violence to complex reality. Even though these “reform” efforts were framed as ways of reducing systemic risk, they were fatally flawed because they were profoundly unsystemic in their understanding of the financial system. My critique focused specifically on the confident assertions based on the diagrams presented above. By focusing only on the OTC derivatives market, and ignoring the myriad connections of this market to other parts of the financial market, regulators could not have possibly comprehended the systemic implications of what they were doing. Indeed, even the portrayal of the OTC market alone was comically simplistic. The fallacy of composition played a role here too: the regulators thought they could reform the system piece-by-piece, without thinking seriously about how these pieces interacted in non-linear ways.

The regulators were guilty of the hubris illustrated beautifully by the parable of Chesterton’s Fence:

In the matter of reforming things, as distinct from deforming them, there is one plain and simple principle; a principle which will probably be called a paradox. There exists in such a case a certain institution or law; let us say, for the sake of simplicity, a fence or gate erected across a road. The more modern type of reformer goes gaily up to it and says, “I don’t see the use of this; let us clear it away.” To which the more intelligent type of reformer will do well to answer: “If you don’t see the use of it, I certainly won’t let you clear it away. Go away and think. Then, when you can come back and tell me that you do see the use of it, I may allow you to destroy it.

In other words, the regulators should have understood the system and why it evolved the way that it did before leaping in to “reform” it. As Chesterton says, such attempts at reformation quite frequently result in deformation.

Somewhat belatedly, there are efforts underway to map the financial system more accurately. The work of Richard Bookstaber and various colleagues under the auspices of the Office of Financial Research to create multilayer maps of the financial system is certainly a vast improvement on the childish stick figure depictions of Janet Yellen, Gary Gensler, Timmy Geithner, Chris Dodd, Barney Frank et al. But even these more sophisticated maps are extreme abstractions, not least because they cannot capture incentives, the distribution of information among myriad market participants, and the motivations and behaviors of these participants. Think of embedding these maps in the most complicated extensive form large-N player game you can imagine, and you might have some inkling of how inadequate any schematic representation of the financial system is likely to be. When you combine this with the fact that in complex systems, even slight changes in initial conditions can result in completely different outcomes, the futility of “seeing like a state” in this context becomes apparent. The map of initial conditions is inevitably crude, making it an unreliable guide to understanding the system’s future behavior.

In my view, Scott goes too far. There is no doubt that some state-driven standardization has dramatically reduced transactions costs and opened up new possibilities for wealth-enhancing exchanges (at some cost, yes, but these costs are almost certainly less than the benefits), but Scott looks askance at virtually all such interventions. Thus, I do not exclude the possibility of true reform. But Scott’s warning about the dangers of forcing complex emergent orders to conform to simplified, “legible” mental constructs must be taken seriously, and should inform any attempt to intervene in something like the financial system. Alas, this did not happen when legislators and regulators embarked on their crusade to reorganize wholesale the world financial system. It is frightening indeed to contemplate that this crusade was guided by mental maps as crude as those supposedly illustrating the virtues of moving from an emergent bilateral OTC market to a tamed hub-and-spoke cleared one.

PS. I was very disappointed by this presentation by James Scott. He comes off as a doctrinaire leftist anthropologist (but I repeat myself), which is definitely not the case in his books. Indeed, the juxtaposition of Chesterton and Scott shows how deeply conservative Scott is (in the literal sense of the word).


March 24, 2017

Creative Destruction and Industry Life Cycles, HFT Edition

Filed under: Derivatives,Economics,Exchanges,Regulation — The Professor @ 11:56 am

No worries, folks: I’m not dead! Just a little hiatus while in Geneva for my annual teaching gig at Université de Genève, followed by a side trip for a seminar (to be released as a webinar) at ESSEC. The world didn’t collapse without my close attention, but at times it looked like a close-run thing. But then again, I was restricted to watching CNN, so my perception may be a little bit warped. Well, not a little bit: I have to say that I knew CNN was bad, but I didn’t know how bad until I watched a bit while on the road. Appalling doesn’t even come close to describing it. Strident, tendentious, unrelentingly biased, snide. I switched over to RT to get more reasonable coverage. Yes. It was that bad.

There are so many allegations regarding surveillance swirling about that only fools would rush in to comment on that now. I’ll be an angel for once in the hope that some actual verifiable facts come out.

So for my return, I’ll just comment on a set of HFT-related stories that came out during my trip. One is Alex Osipovich’s story on HFT traders falling on hard times. Another is that Virtu is bidding for KCG. A third is that Quantlabs (a Houston outfit) is buying one-time HFT high flyer Teza. And finally, one that pre-dates my trip, but fits the theme: Thomas Peterffy’s Interactive Brokers Group is exiting options market making.

Alex’s story repeats Tabb Group data documenting a roughly 85 percent drop in HFT revenues in US equity trading. The Virtu-KCG proposed tie-up and the Quantlabs-Teza consummated one are indications of consolidation that is typical of maturing industries, and of a shift in the business model of these firms. The Quantlabs-Teza story is particularly interesting. It suggests that it is no longer possible (or at least remunerative) to get a competitive edge via speed alone. Instead, the focus is shifting to extracting information from the vast flow of data generated in modern markets. Speed will still matter here–he who analyzes faster, all else equal, will have an edge–but the margin for innovation will shift from hardware to data analytics software (presumably paired with specialized hardware optimized to run it).

None of these developments is surprising. They are part of the natural life cycle of a new industry. Indeed, I discussed this over two years ago:

In fact, HFT has followed the trajectory of any technological innovation in a highly competitive environment. At its inception, it was a dramatically innovative way of performing longstanding functions undertaken by intermediaries in financial markets: market making and arbitrage. It did so much more efficiently than incumbents did, and so rapidly it displaced the old-style intermediaries. During this transitional period, the first-movers earned supernormal profits because of cost and speed advantages over the old school intermediaries. HFT market share expanded dramatically, and the profits attracted expansion in the capital and capacity of the first-movers, and the entry of new firms. And as day follows night, this entry of new HFT capacity and the intensification of competition dissipated these profits. This is basic economics in action.

. . . .

Whether it is by the entry of a new destructively creative technology, or the inexorable forces of entry and expansion in a technologically static setting, one expects profits earned by firms in one wave of creative destruction to decline.  That’s what we’re seeing in HFT.  It was definitely a disruptive technology that reaped substantial profits at the time of its introduction, but those profits are eroding.

That shouldn’t be a surprise.  But it no doubt is to many of those who have made apocalyptic predictions about the machines taking over the earth.  Or the markets, anyways.

Or, as Herb Stein famously said as a caution against extrapolating from current trends, “If something cannot go on forever, it will stop.” Those making dire predictions about HFT were largely extrapolating from the events of 2008-2010, and ignored the natural economic forces that constrain growth and dissipate profits. HFT is now a normal, competitive business earning normal, competitive profits.  And hopefully this reality will eventually sink in, and the hysteria surrounding HFT will fade away just as its profits did.

The rise and fall of Peterffy/Interactive illustrates Schumpeterian creative destruction in action. Interactive was part of a wave of innovation that displaced the floor. Now it can’t compete against HFT. And as the other articles show, HFT is in the maturation stage during which profits are competed away (ironically, a phenomenon that was central to Marx’s analysis, and which Schumpeter’s theory was specifically intended to address).

This reminds me of a set of conversations I had with a very prominent trader. In the 1990s he said he was glad to see that the markets were becoming computerized because he was “tired of being fucked by the floor.” About 10 years later, he lamented to me how he was being “fucked by HFT.” Now HFT is an industry earning “normal” profits (in the economics lexicon) due to intensifying competition and technological maturation: the fuckers are fucking each other now, I guess.

One interesting public policy issue in the Peterffy story is the role played by internalization of order flow in undermining the economics of Interactive: there is also an internalization angle to the Virtu-KCG story, because one reason for Virtu to buy KCG is to obtain the latter’s juicy retail order flow. I’ve been writing about this (and related) subjects for going on 20 years, and it’s complicated.

Internalization (and other trading in non-lit/exchange venues) reduces liquidity on exchanges, which raises trading costs there and reduces the informativeness of prices. Those factors are usually cited as criticisms of off-exchange execution, but there are other considerations. Retail order flow (likely uninformed) gets executed more cheaply, as it should, because it is less costly to serve (due to the fact that it poses less of an adverse selection risk). (Who benefits from this cheaper execution is a matter of controversy.) Furthermore, as I pointed out in a 2002 Journal of Law, Economics and Organization paper, off-exchange venues provide competition for exchanges that often have market power (though this is less likely to be the case in the post-RegNMS world, which made inter-exchange competition much more intense). Finally, some (and arguably a lot of) informed trading is rent seeking: by reducing the ability of informed traders to extract rents from uninformed traders, internalization (and dark markets) reduce the incentive to invest excessively in information collection (an incentive Hirshleifer the Elder noted in the 1970s).
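The adverse-selection point behind cheaper retail execution can be put in toy, Glosten-Milgrom flavored arithmetic (my illustrative numbers, and a drastic simplification of any real model): a market maker’s breakeven half-spread is roughly the probability of facing an informed trader times that trader’s informational edge, so segregated retail flow supports a much tighter spread.

```python
def breakeven_half_spread(p_informed, informed_edge):
    """Toy zero-profit condition: expected loss per trade to informed
    traders (probability * edge) must be recovered by charging the
    half-spread to all order flow. A sketch, not the full model."""
    return p_informed * informed_edge

# Hypothetical numbers: 20% informed flow with a 50-cent edge vs.
# a segregated retail pool that is only 2% informed.
mixed_flow = breakeven_half_spread(0.20, 0.50)
retail_only = breakeven_half_spread(0.02, 0.50)
```

On these assumed numbers the retail-only spread is a tenth of the mixed-flow spread, which is the economic pull behind internalizing retail orders; who pockets the difference is, as noted above, the controversial part.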

Securities and derivatives market structure is fascinating, and it presents many interesting analytical challenges. But these markets, and the firms that operate in them, are not immune to the basic forces of innovation, imitation, and entry that economists have understood for a long time (but which too many have forgotten, alas). We are seeing those forces at work in real time, and the fates of firms like Interactive and Teza, and the HFT sector overall, are living illustrations.

 


March 10, 2017

US Shale Puts the Saudis and OPEC in Zugzwang

Filed under: Commodities,Derivatives,Economics,Energy,Politics — The Professor @ 2:55 pm

This was CERA Week in Houston, and the Saudis and OPEC provided the comedic entertainment for the assembled oil industry luminaries.

It is quite evident that the speed and intensity of the U-turn in US oil production has unsettled the Saudis, and they don’t know quite what to do about it. So they were left with making empty threats.

My favorite was when Saudi Energy Minister Khalid al-Falih said there would be no “free rides” for US shale producers (and non-OPEC producers generally). Further, he said OPEC “will not bear the burden of free riders,” and “[w]e can’t do what we did in the ’80s and ’90s by swinging millions of barrels in response to market condition.”

Um, what is OPEC going to do about US free riders? Bomb the Permian? If it cuts output, and prices rise as a result, US E&P activity will pick up, and damn quick. The resulting replacement of a good deal of the OPEC output cut will limit its price impact. The best place to be is outside a cartel that cuts output: you get the benefit of the higher prices, and produce to the max. That’s what is happening in the US right now. OPEC has no credible way of punishing, or threatening to punish, free riders.

As for not doing what they did in the ’80s, well that’s exactly OPEC’s problem. It’s not the ’80s anymore. Now if it tries to “swing millions of barrels” to raise price, there is a fairly elastic and rapidly responding source of supply that can replace a large fraction of those barrels, thereby limiting the price impact of the OPEC swingers, baby.

Falih’s advisers were also trying to scare the US producers. Or something:

“One of the advisors said that OPEC would not take the hit for the rise in U.S. shale production,” a U.S. executive who was at the meeting told Reuters. “He said we and other shale producers should not automatically assume OPEC will extend the cuts.”

Presumably they are threatening a return to their predatory pricing strategy (euphemistically referred to as “defending market share”) that worked out so well for them the last time. Or perhaps it is just a concession that US supply is so elastic that it makes the demand for OPEC oil so elastic that output cuts are a losing proposition and will not endure. Either way, it means that OPEC is coming to the realization that continuing output cuts are unlikely to work. Meaning they won’t happen.

OPEC also floated cooperation with US producers on output. Mr. al-Falih, meet Senator Sherman! And if the antitrust laws didn’t make US participation in an agreement a non-starter, it would be almost impossible to cartelize the US industry given the largely free entry into E&P and the fungibility of technology, human capital, land, services, and labor. Maybe OPEC should hold talks with the Texas Railroad Commission instead.

Finally, in another laugh riot, OPEC canoodled with hedge funds, apparently under the delusion that financial players play a material role in setting the price of physical barrels, rather than the price of risk. Disabling speculation could materially help OPEC only by raising the cost of hedging, which would tend to raise the costs of E&P firms, especially the more financially stretched ones. (Along these lines, I would argue that the big increase in net long speculative positions in recent months is not due to speculators pushing their way into the market; rather, they have been pulled into the market by increased hedging activity resulting from the increase in drilling activity in the US.)

Oil prices were down hard this week, from a $53 handle to a (at the time of this writing) $49.50 price. The first down-leg was due to the surprise spike in US inventories, but the continued weakness could well reflect the OPEC and Saudi messaging at CERA Week. The pathetic performance signaled deep strategic weakness, and suggests that the Saudis et al realize they are in zugzwang: regardless of what they do with regards to output, they are going to regret doing it.

My heart bleeds. Bleeds, I tells ya!


March 3, 2017

The Rocks Didn’t Go Anywhere. Go Figure.

Filed under: Commodities,Economics,Energy,Russia — The Professor @ 2:58 pm

The conventional wisdom during the oil price collapse that started in mid-2014, and which accelerated starting in November of that year when the Saudis decided not to cut output, was that the Kingdom was engaged in a predatory pricing strategy intended to drive out US shale producers. I mocked this in real time. Nothing really special about that analysis: economists have known for a long time that predatory strategies are almost never rational. They are irrational because the predator has to incur losses to cause its competitors to reduce production, but the competitors’ resources are unlikely to leave the industry permanently: they can come flooding back in when the predator attempts to restrict output to raise prices. Thus, the predator suffers all the pain of selling at low prices, but cannot recoup these losses by selling at higher prices later.
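The recoupment logic can be put in simple arithmetic. The numbers below are purely illustrative (nothing in the post specifies them); the point is only that predation pays if, and only if, the prey's capacity permanently exits:

```python
# Stylized two-period predation arithmetic; all numbers are assumed for illustration.
loss_per_bbl = 20            # period 1: predator sells this far below cost
volume = 10                  # predator's period-1 output (arbitrary units)
period1 = -loss_per_bbl * volume          # -200: the up-front pain of predation

# Case A: rivals' capacity is destroyed, so the predator earns a fat
# monopoly margin on restricted output in period 2.
period2_exit = 35 * 8                     # margin 35 on output 8 -> 280

# Case B: the rocks (and rigs, and know-how) are still there, so re-entry
# caps the period-2 margin near the competitive level.
period2_reentry = 2 * 8                   # margin 2 on output 8 -> 16

print("predation NPV if exit is permanent:", period1 + period2_exit)   # 80
print("predation NPV with re-entry:", period1 + period2_reentry)       # -184
```

With re-entry, the predator eats the period-1 losses and recoups essentially nothing: exactly the shale story.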

In the case of shale, the rocks weren’t going anywhere. Obviously. When prices fell, companies just drilled fewer wells–a lot fewer wells–but the rocks remained. The knowledge of where the right rocks were remained too. The knowledge of how to drill the rocks didn’t disappear. Idled rigs went into storage. Yes, some labor (including some skilled labor) left, but this resource is pretty flexible and can come back quickly when demand goes up. E&P companies incurred financial losses, and some experienced financial distress and even bankruptcy, but this did not drive them out of the industry permanently, and did not drive out the human and physical capital that these firms employed. New capital required to drill new wells is available to E&P firms based on future prospects, not past failures. Indeed, one of the functions of bankruptcy and restructuring of distressed firms is to clean up balance sheets so that old debt doesn’t impede the ability of firms to take on positive NPV projects.

In sum, even though drilling activity plummeted along with prices, the resources needed to ramp up production weren’t destroyed or driven out of the industry. They were only waiting for more favorable prices. The industry went into hibernation: it didn’t die.

OPEC’s decision to cut output to raise prices–and the Saudis going beyond their share of output cuts to strengthen OPEC’s effect–provided the opportunity the industry had been waiting for. It rapidly awoke from its slumbers. Rig counts did a U-turn, up 90 percent in 9 months. And so has output. John Kemp reports:

U.S. crude oil production appears to be rising strongly thanks to increased shale drilling as well as rising offshore output from the Gulf of Mexico.

Production averaged almost 9 million barrels per day (bpd) in the four weeks to Feb. 24, according to the latest weekly estimates published by the Energy Information Administration.

Production has been on an upward trend since hitting a cyclical low of 8.5 million bpd in September (“Weekly Petroleum Status Report”, EIA, March 1).

Javier Blas chimes in:

“North American oil companies are going to increase their spending by 25 percent in 2017 compared to last year,” said Daniel Yergin, the oil historian-cum-consultant who hosts the CERAWeek. “The increase reflects the magnetism of U.S. shale.”

U.S. benchmark West Texas Intermediate traded at $52.79 a barrel on Friday. Futures bounced between $51.22 and $54.94 in February.

So far this year, U.S. energy companies have raised $10.5 billion in fresh equity, with shale and oil service groups drawing the most investment, the best start of the year since at least 1999 and equal to a third of what the sector raised in the whole of 2015. [A clear indication that “debt overhang” is not constraining the ability to access capital to fund drilling programs, which would have been the only way the Saudi strategy had a prayer of working.]

In Midland, the Texas city at the center of the Permian basin, the activity rush is palpable, as is the threat of higher costs for shale companies. The county’s active-rig total ranks second in the U.S., behind only Reeves County further to the west.

“You could see the town’s energy is back,” said Alan Means, founder of Cambrian Management Ltd., a Midland-based firm that operates more than 200 oil wells in the Permian across Texas and New Mexico. “The rigs are up again, the fracking crews are busier and the highway traffic is increasing.”

As activity rises, the man-camps in the town outskirts are flush again, with workers arriving from the Bakken in Montana and North Dakota, and from as far away as Canada. The 1,000-bed Permian Lodging camp is now 100 percent full, up from 65 percent in July, according to camp owner Ralph McIngvale. [See how quickly labor resources can return?]

Shale firms have also become more efficient.

In sum, the predatory strategy hasn’t made shale go away. Now, the longer the Saudis and the rest of OPEC (and the non-OPEC countries that have joined in) hold down output, the larger the fraction of that output loss will be redeemed by resurgent shale production in the US.

In other words, shale makes the demand for OPEC (and non-OPEC cooperators’) oil pretty elastic. This raises serious questions about the rationality of the output cuts from the perspective of the cutters, especially the big countries like Saudi Arabia (which has cut substantially–more than it promised) and Russia (whose cooperation is more equivocal). This, in turn, makes the durability of the cuts problematic.
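The "elastic fringe" point can be made precise with the standard dominant-firm formula: with a price-taking fringe, the demand facing the cartel is market demand minus fringe supply, so its elasticity (in magnitude) is (ε + η_f·s_f)/s_c, where ε is market demand elasticity, η_f is fringe supply elasticity, and s_c, s_f are the cartel and fringe output shares. A minimal sketch, with all parameter values assumed purely for illustration:

```python
def residual_demand_elasticity(eps_market, eta_fringe, cartel_share):
    """Elasticity (magnitude) of the demand facing a cartel that
    competes against a price-taking fringe.

    eps_market:   market demand elasticity (magnitude)
    eta_fringe:   fringe supply elasticity
    cartel_share: cartel's share of total output (fringe share = 1 - cartel_share)
    """
    fringe_share = 1.0 - cartel_share
    return (eps_market + eta_fringe * fringe_share) / cartel_share

# Assumed numbers: quite inelastic market demand, elastic shale supply,
# a cartel producing ~40 percent of world output.
print(round(residual_demand_elasticity(0.3, 1.5, 0.4), 3))  # -> 3.0
```

Even if market demand elasticity is only 0.3, a 40-percent cartel facing a fringe with supply elasticity 1.5 faces a residual demand elasticity of about 3: cutting output mostly hands revenue to the fringe.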

The quick turnaround in US shale provides a new data point for the Saudis, Russians, et al. Their dreams that they could make rocks disappear–or that they could make it permanently unattractive to extract oil from their rocks–have proved chimerical. Persisting in output cuts will become progressively less profitable, and indeed, is likely to be downright unprofitable soon. What’s the over-under on how long until they figure that rocks will outlast them, and give up the output cut game?

Teaser: I am currently slogging through oil well data (tens of thousands of wells in all the major basins) in a study of the sources of productivity gains in shale production. Hopefully I will be able to report some results soon. Initial results are particularly ominous for OPEC. I am finding evidence of learning-by-doing in both oil and gas. That is, drilling wells today generates knowledge that enhances future productivity and lowers future costs. This means that the increased shale output resulting from OPEC’s current attempt to prop up prices will increase the US shale industry’s future productivity, making it even harder for OPEC to keep prices high months or years from now.


March 1, 2017

Ivan Glasenberg: Mistaking Luck for Genius?

Filed under: China,Commodities,Economics,Energy — The Professor @ 8:58 pm

Glencore is back from the brink, posting a $1.4 billion profit for 2016. When I first read about the 2016 results, I wondered aloud to a friend whether Ivan Glasenberg would have learned something from the company’s near death experience, or instead would consider the fall someone else’s fault, and the resurrection the result of his genius. I should have known it would be the latter.

Glasenberg has been gloating about the 2016 results, and flaunting them as some sort of vindication. He is openly musing about paying a $20 billion dividend to the company’s “long suffering shareholders,” and is looking for acquisitions, including in North American grain trading.

The fact is that Glencore and Ivan Glasenberg were (and are) just along for a ride on the commodity price roller coaster, which is located at a Chinese amusement park. When the roller coaster plunged as the Chinese economy shuddered in 2015, Glencore plunged along with it. Now, in large part due to Chinese policy moves that have caused the prices of coal and other raw materials to climb again, Glencore has rebounded. Management genius had nothing to do with it.

Well, that’s not completely true. Glasenberg made the conscious choice to transform Glencore from a trading firm that was basically flat price neutral to a mining firm with a big exposure to the flat prices of coal and copper in particular. So the big drop and the rebound are the result of his choice.

When Glencore was in peril in 2015, I said that its fate was dependent on commodity prices, and hence on Chinese policy, rather than any decision that management can make. I said that Glencore was along for the ride. That turned out to be true. It remains true going forward. That was the fundamental strategic choice that has shaped and will continue to shape its performance. Management can at best optimize performance over the cycle, but the cycle will dominate.

Prior to 2015, Glencore management did not optimize. The firm was over-leveraged: it continued to operate with trading-firm like leverage levels even though it faced bigger commodity price risks. Glasenberg/Glencore have cut down on debt in the past year, and this reduces the likelihood of a repeat of 2015–if they stick to a lower leverage policy going forward. But the fact is that the biggest driver of Glencore’s fate is not decisions made in Baar, but the whims of policymakers in Beijing.

It is interesting to compare Glasenberg’s crowing to the more muted tones of other mining firms which have also profited from the rebound. The managements of these other firms apparently realize that what the cycle giveth, the cycle can taketh away. Is Peabody Coal’s management preening over the company’s rebound? No. They are silently grateful that factors outside of their control have turned their way. Similarly, Noble eked out a profit, but its management isn’t breaking its arms patting itself on the back.

Traders typically make deals of relatively short duration, and it is possible to evaluate trading decisions and trading acumen based on P/L. But by transforming Glencore into a mining company with a supersized trading arm, Glasenberg purposefully made a very long term trade with a duration of years (decades, even): quarterly or even annual fluctuations in P/L tell you little about the wisdom of such a trade. It is therefore rather disturbing to watch Glasenberg gloat on the basis of a profitable year driven by a cyclical turn with which he had exactly nothing to do.

And let’s put this in perspective. Glencore lost $5 billion in 2015. 2016 made up less than 30 percent of that loss. There is still a long way to go to determine whether the big, multi-year trade that Glencore made a few years ago was a smart play or not.
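For the record, the arithmetic behind "less than 30 percent" (figures as given in the post, in $ billions):

```python
loss_2015 = 5.0      # Glencore's 2015 loss, $bn
profit_2016 = 1.4    # its 2016 profit, $bn
recovered = profit_2016 / loss_2015
print(f"{recovered:.0%} of the 2015 loss recovered")  # 28% of the 2015 loss recovered
```

At that rate, it takes several more 2016s just to get back to flat on the multi-year trade, before counting any return on the capital deployed.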

Perhaps Glasenberg still has a trader’s mindset, and a trader’s time horizon, suited for a transaction cycle measured in weeks or months, not years or decades. If so, the company might be in for a big future fall, because its guiding light is apt to mistake luck for skill.


February 26, 2017

If You Want Blood, You Got It–Tesla Redux

Filed under: Climate Change,Economics — The Professor @ 3:24 pm

When Musk announced his plans to merge Tesla and Solar City, I remarked that “Tesla bleeds cash like a Game of Thrones battle scene.” Elon (who long ago blocked me on Twitter, BTW) apparently recognized this. In an August 29, 2016 email to Tesla employees Musk emphasized how important it was for the company to report a positive cash flow for 3Q16:

I thought it was important to write you a note directly to let you know how critical this quarter is. The third quarter will be our last chance to show investors that Tesla can be at least slightly positive cash flow and profitable before the Model 3 reaches full production. Once we get to Q4, Model 3 capital expenditures force us into a negative position until Model 3 reaches full production. That won’t be until late next year.

. . . .

Even more important, we will need to raise additional cash in Q4 to complete the Model 3 vehicle factory and the Gigafactory. The simple reality of it is that we will be in a far better position to convince potential investors to bet on us if the headline is not “Tesla Loses Money Again”, but rather “Tesla Defies All Expectations and Achieves Profitability”. That would be amazing!

Were you amazed(!) that Tesla eked out a positive cash flow in the third quarter? If so, do you feel like a fool now that the 4Q16 results are out, showing that the blood is gushing again? For in the quarter, Tesla set a record (and not the good kind!) for free cash flow: a cool $1 billion to the negative, with -$447 million in operating cash flow and $522 million in capex. The operating number reflected lower sales of vehicle emissions credits, illustrating the company’s dependence on this source of revenue.
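As a sanity check on those figures (free cash flow = operating cash flow minus capex, in $ millions):

```python
operating_cf = -447   # 4Q16 operating cash flow, $m (as reported in the post)
capex = 522           # 4Q16 capital expenditure, $m
free_cash_flow = operating_cf - capex
print(free_cash_flow)  # -969, i.e. roughly -$1 billion
```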

So what?, you say. Elon said that “Once we get to Q4, Model 3 capital expenditures” will make results look bad. But it appears that Tesla actually held back on capex. In the vaunted 3Q guidance, the company implied that it would spend $1 billion on capex in 4Q16: it barely spent half of that. This does not bode well for delivering the Model 3 on time, and demonstrates the dilemma that Musk faces.

Given Musk’s emphasis on delivering a positive cash flow number in the third quarter, it appears that his accountants rose to the task. This raises serious questions about the legitimacy of the third quarter number, which was obviously a one-off. Elon said that it was vital to “convince potential investors to bet on us” by “defying expectations.” Was it necessary to lie to defy?

Any such suspicions should be strengthened by the, well, suspicious resignation of Tesla’s CFO on the day its 8-K was filed, to be effective when its 10-K is filed.  The reason given is rather odd: Wheeler plans to “pursue opportunities in public policy.” Well, I guess it’s better than “I want to spend more time with my family.”

The resignation of a CFO is never a good sign, especially when it coincides with the release of an ugly earnings report that follows an earnings report that appeared to be too good to be true at the time–and which looks even more too good to be true in retrospect.

Even Elon appears a little anxious. He said that the company’s cash position is “very close to the edge.” So get ready to have your stock watered again, boys and girls: “So we’re considering a number of options but I think it probably makes sense to raise capital to reduce risk.”

Or, to mix metaphors: another transfusion for the bleeder. In the vein, out the artery. Investors and Wall Street have been very forgiving. For years. How long can that continue?


February 25, 2017

Should Social Media Be Regulated as Common Carriers?

Filed under: Economics,Politics,Regulation — The Professor @ 6:43 pm

Major social media, notably Twitter and Facebook, are gradually moving to censor what is communicated on them. In Twitter’s case, the primary stated rationale is to “protect its users from abuse and harassment.” It has also taken it upon itself to “[identify] and [collapse] potentially abusive and low-quality replies so the most relevant conversations are brought forward.” There are widespread reports that Twitter engages in “shadowbanning”, i.e., hiding the Tweets of those users it identifies as objectionable, and making these Tweets inaccessible in searches.

Further, there are suspicions that there is a political and ideological component to the filters that Twitter applies, with conservative (and especially alt-right) content and users being more likely to fall afoul of these restrictions: the relentlessly leftist tilt of CEO Jack Dorsey (and most of its employees) gives considerable credence to these suspicions.

For its part, Facebook is pursuing ways to constrain users from posting what it deems as “misinformation” (aka “fake news”). This includes various measures such as cooperating with “third party fact-checking organizations“. Given the clear leftist tilt of Mark Zuckerberg and Facebook’s workforce, and the almost laughably leftist slant of the “fact-checkers”, there is also considerable reason for concern that the restrictions will not be imposed in a politically neutral way.

The off-the-top classical liberal/libertarian response to this is likely to be “well, this is unfortunate, but these are private corporations, and they can do what they want with their property.” But however superficially plausible this position appears to be, there is in fact a principled classical liberal/libertarian response that arrives at a very different conclusion. In particular, as arch-libertarian Richard Epstein (who styles himself as The Libertarian in his Hoover Institution podcast) has consistently pointed out, even during the heyday of small, classically liberal government and law, the common law recognized that restrictions on the autonomy of certain entities were not only justifiable, but desirable. In particular, natural monopolies and near-monopolies were deemed to be “common carriers” upon whom the law imposed a duty of providing access on a non-discriminatory basis. The (classically liberal) common law of that era recognized that such entities could exercise market power, or engage in discriminatory conduct, without fear of competitive check. Thus, it imposed the obligation to serve all on a non-discriminatory basis in order to constrain the exercise of market power, or invidious discrimination based on the preferences of the common carrier’s owner.

Major social media (and Google as well–perhaps most of all) clearly have market power, and the ability to discriminate without fear of losing business to competitors. The network nature of social media (and search engines) leads to the dominance of a small number of platforms, or even one platform. Yes, there are competitors to Facebook, Twitter, and Google, but these companies are clearly dominant in their spaces, and network effects make them largely immune to competitive entry. Imposition of a common carrier-inspired obligation to provide non-discriminatory access is therefore quite reasonable, and has a substantial economic and legal foundation. Thus, libertarians and classical liberals and conservatives and even fringe voices should not resign themselves to being second or third class citizens on social media, merely because these are private entities, rather than government ones. (Indeed, the analogy should go the other direction. A major reason for limiting the ability of the government to control speech is because of its monopoly of legal violence. It is monopoly power, regardless of whether in a market or political setting, that needs to be constrained through things like rights to free speech, or non-discriminatory access to common carriers.)

Further, insofar as leftists (including the managements of the major social media companies) are concerned, it is utterly incoherent for them to assert that as private entities they are perfectly free to restrict access according to their whims, given that leftists also adamantly (indeed, obnoxiously) insist that anti-discrimination laws should be imposed on small entities operating in highly competitive environments. Specifically, leftists believe that bakers or caterers or pizzerias with zero market power should be required to serve all, even if they have religious (or other) objections to doing so. But a baker refusing to sell a wedding cake to a gay couple does not meaningfully deprive said couple of the opportunity to get a cake: there are many other bakeries, and given the trivial costs of entry, even if most incumbent bakers don’t want to serve gays, this only provides a commercial opportunity for entrant bakers to cater to the excluded clientele. Thus, discrimination by Baker A does not impose large costs on those s/he would prefer not to serve (even though forcing A to serve them might impose high costs on A, due to his/her sincere religious beliefs).

The same cannot be said of Twitter or Facebook. Given the nature of networks, social and otherwise, entrants or existing competitors are very poor substitutes for the dominant firms, which gives them the power to exclude, and which makes their exercise of this power extremely costly to the excluded.  In other words, if one believes that firms in highly competitive markets should be obligated to provide service/access to all on a non-discriminatory basis, one must concede that the Twitters, Facebooks, and Googles of the world should be similarly obligated, and that given their market power their conduct should be subject to a substantially higher degree of scrutiny than a small firm in a competitive market.

Of course, it is one thing to impose de jure an obligation on Twitter et al to provide equal access and equal treatment to all, regardless of political beliefs, and quite another to enforce it de facto. Of course Jack and Mark or Sergey don’t say “we discriminate against those holding contrary political opinions.” No, they couch their actions in terms of “protecting against abusive behavior and hate speech” or “stamping out disinformation.” But they retain the discretion to interpret what is abusive, hateful, and false–and it is clear that they consider much mainstream non-leftist belief as beyond the pale. Hence, enforcement of an open non-discriminatory access obligation would be difficult, and would inevitably involve estimation of discriminatory outcomes using statistical measures, a fraught exercise (as employment discrimination law demonstrates). Given the very deep pockets that these firms have, moreover, prevailing in a legal battle would be very difficult.

But this is a practical obstacle to treating social media like common carriers with a duty to provide non-discriminatory access. It is not a reason for classical liberals and libertarians to concede to dominant social network operators that they have an unrestricted right to restrict access as a matter of principle. In fact, the classical liberal/libertarian principle cuts quite the other way. And at the very least, imposing a common carrier-like obligation would substantially raise the cost that social network operators would pay to indulge in discrimination based on politics, beliefs, or ideology, and this could go a long way to make these places safe for the expression of political opinions that drive Jack, Mark, et al, nuts.

 


February 20, 2017

Trolling Brent

Filed under: Commodities,Derivatives,Economics,Energy,Regulation — The Professor @ 10:14 am

Platts has announced the first major change in the Brent crude assessment process in a decade, adding Troll crude to the “Brent” stream:

A decline in supply from North Sea fields has led to concerns that physical volumes could become too thin and hence at times could be accumulated in the hands of just a few players, making the benchmark vulnerable to manipulation.

Platts said on Monday it would add Norway’s Troll crude to the four British and Norwegian crudes it already uses to assess dated Brent from Jan. 1, 2018. Troll will join Brent, Forties, Oseberg and Ekofisk, the grades collectively known as BFOE.

This is likely a stopgap measure, and Platts is considering more radical moves in the future:

It is also investigating a more radical plan to account for a possible larger drop-off in North Sea output over the next decade that would allow oil delivered from as far afield as west Africa and Central Asia to contribute to setting North Sea prices.

But the move is controversial, as this from the FT article shows:

If this is not addressed first, one source at a big North Sea trader said, the introduction of another grade to BFOE could make “an assessment that is unhedgeable, hence not fit for purpose”. “We don’t see any urgency to add grades today,” he added.

Changes to Brent shift the balance of power in North Sea trading. The addition of Troll makes Statoil the biggest contributor of supplies to the grades supporting Brent, overtaking Shell. Some big North Sea traders had expressed concern Statoil would have an advantage in understanding the balance of supply and demand in the region as it sends a large amount of Troll crude to its Mongstad refinery, Norway’s largest.

The statement about “an assessment that is unhedgeable, hence not fit for purpose” is BS, and exactly the kind of thing one always hears when contracts are redesigned. The fact is that contract redesigns have distributive effects, even if they improve a contract’s functioning, and the losers always whinge. Part of the distributive effect relates to issues like giving a company like Statoil an edge . . . that previously Shell and the other big North Sea producers had. But part of the distributive effect is that a contract with inadequate deliverable supply is a playground for big traders, who can more easily corner, squeeze, and hug such a contract.

Insofar as hedging is concerned, the main issue is how well the Brent contract performs as a hedge (and a pricing benchmark) for out-of-position (i.e., non-North Sea) crude, which represents the main use of Brent paper trades. Reducing deliverable supply constraints which contribute to pricing anomalies (and notably, anomalous moves in the basis) unambiguously improves the functioning of the contract for out-of-position players. Yeah, those hedging BFOE get slightly worse hedging performance, but that is a trivial consideration given that the very reason for changing the benchmark is the decline in BFOE production–which now represents less than 1 percent of world output. Why should the hair on the end of the tail wag the dog?

Insofar as the competition with WTI is concerned, the combination of larger US supplies, the construction of pipelines to move supplies from the Midcon (PADD II) to the Gulf (PADD III), and the lifting of the export ban have restored and in fact strengthened the connection of WTI prices to seaborne crude prices. US barrels are now going to both Europe and Asia, and US crude has effectively become the marginal barrel in most major markets, meaning that it is determining price and that WTI is an effective hedge (especially for the lighter grades). And by the way, the WTI delivery mechanism is much more robust and transparent than the baroque (and at times broken) Brent pricing mechanism.

As if to add an exclamation point to the story, Bloomberg reports that in recent months Shell has been bigfooting–or would that be trolling?–the market with big trades that have arguably distorted spreads. It got to the point that even firms like Vitol (which are notoriously loath to call foul, lest someone point fingers at them) raised the issue with Shell:

While none of those interviewed said Shell did anything illegal, they said the company violated the unspoken rules governing the market, which is lightly regulated. Executives of several trading rivals, including Vitol Group BV, the world’s top independent oil merchant, raised objections with counterparts at Shell last year, according to market participants.

What are the odds that Mr. Fit for Purpose is a Shell trader?

All of this is as I predicted, almost six years ago, when everyone was shoveling dirt on WTI and declaring Brent the Benchmark of the Forever Future:

Which means that those who are crowing about Brent today, and heaping scorn on WTI, will be begging for WTI’s problems in a few years.  For by then, WTI’s issues will be fixed, and it will be sitting astride a robust flow of oil tightly interconnected with the nexus of world oil trading.  But the Brent contract will be an inverted paper pyramid, resting on a thinner and thinner point of crude production.  There will be gains from trade–large ones–from redesigning the contract, but the difficulties of negotiating an agreement among numerous big players will prove nigh on to impossible to surmount.  Moreover, there will be no single regulator in a single jurisdiction that can bang heads together (for yes, that is needed sometimes) and cajole the parties toward agreement.

So Brent boosters, enjoy your laugh while it lasts.  It won’t last long, and remember, he who laughs last laughs best.

That’s exactly how things have worked out, even down to the point about the difficulties of getting the big boys to play together (a lesson gained through extensive personal experience, some of which is detailed in the post). Just call me Craignac the Magnificent. At least when it comes to commodity contract design 😉
