Streetwise Professor

April 24, 2021

Why Is Proof of Efficacy Required for Pharmaceutical Interventions, But NOT Non-Pharmaceutical Ones?

Filed under: China,CoronaCrisis,Economics,Politics,Regulation — cpirrong @ 11:43 am

Under Federal law, a pharmaceutical intervention must be proven safe and effective before it is marketed to the public. If after introduction it proves unsafe or ineffective, the Food and Drug Administration can rescind its approval.

Note the burden of proof: the manufacturer must prove safety and efficacy. Safety and efficacy are not rebuttable presumptions.

Would that the same were true of non-pharmaceutical interventions (NPIs). This neologism (neo-acronym?) is used to describe the policies that have been imposed during the Covid Era–most particularly, lockdowns and masks.

Neither had been proven safe or effective prior to their wholesale–and I daresay, indiscriminate–use. Lockdowns in particular had never been subjected to any clinical experiment or trial. Indeed, the idea had been evaluated by epidemiologists and others, and soundly rejected. But a policy first introduced in a police state–China–spread just as rapidly as the virus to supposedly non-police states despite it never having been proven efficacious or safe.

A year’s experience has produced the evidence. Greetings, fellow lab rats!

And the evidence shows decisively that lockdowns are NOT effective at improving any medically meaningful Covid metric. This American Institute of Economic Research piece provides an overview of the evidence through December; subsequent studies have provided additional evidence.

Furthermore, lockdowns have been proven to be unsafe. Unsafe to incomes, especially for those whose jobs do not permit working from home. Unsafe for physical health, in the form of inter alia deferred cancer diagnoses and treatment for heart attacks and strokes and greater substance abuse (with higher incidence of overdoses), as well as delayed “elective” surgeries that improve life quality. Unsafe for mental health. Unsafe for children, in particular, who have experienced debilitating social isolation and profound disruption in their educations. (Although given the trajectory of American public education, especially post-George Floyd/Derek Chauvin, feral children might be better off than those subjected to the tortures of a CRT-infused curriculum and CRTKoolAid drinking “educators.”)

Masks are not as devastating as lockdowns, but they have also been shown to be ineffective and also unsafe, especially for those who must wear them for extended stretches–which includes in particular children at school.

(Remember “For the children”? Ah, good times. Good times.)

Drug regulation was one of the first major initiatives of the Progressive Era, and the 1962 FDA Amendments that imposed the efficacy requirement were also driven by progressives. My assessment of the economic evidence (especially the literature spawned by my thesis advisor, the great Sam Peltzman) is that the efficacy requirement in particular has been harmful, on net, because it delayed and in some cases prevented the introduction of beneficial therapies.

But even if–especially if–you accept the progressive-inspired conventional wisdom regarding pharmaceutical intervention regulation, you should be dismayed and even furious that the same logic has NOT been applied to NPIs. The underlying principle of drug regulation has been “show me”: show me something works. The underlying principle of Covid Era ukases has been: “Evidence? Evidence? I don’t have to show any stinkin’ evidence.” Indeed, it’s been worse than that: those who demand evidence, or even politely point out the lack of evidence, are branded as heretics by the very same “progressives” who believe religiously that requiring proof of efficacy of drugs is a good thing.

How to square this circle? How to explain this seeming contradiction?

I think it is as plain as the nose on your face. Power. In particular, power exercised by progressive technocratic elites. The FDA acts empower a progressive technocratic elite. Lockdowns and mask mandates empower a progressive technocratic elite–far beyond the wildest dreams of the most zealous FDA bureaucrat. (They also empower idiot politicians who imagine themselves to be part of some elite.) They are both premised on the belief that individuals are incompetent to choose wisely, and must be coerced into making the right choice. Coerced by credentialed elites who are better than you proles.

So an apparent logical inconsistency–proof of efficacy for thee, but not for me–is in fact no inconsistency at all. They are both who, whom. A soi-disant elite (ha!) always pushes the alternative that gives them the most power, and deprives you of the most choice. Who (the progressives): Whom (you).


April 5, 2021

Justice Thomas Echoes SWP, But Alas Our Proposals Regarding Tech Companies Are Futile In Today’s Corporatist State

Filed under: Economics,Politics,Regulation — cpirrong @ 7:17 pm

Over four years ago, to address social media platforms’ exclusion on the basis of viewpoint (i.e., censorship) I advocated treating them as common carriers subject to a non-discrimination requirement. The thrust of my argument was that these platforms have substantial market power and are subject to weak competitive discipline due to network effects and other technological factors.

In concurring with a Supreme Court decision to deny cert in a case that found Donald Trump violated First Amendment rights by blocking users on Twitter, Justice Clarence Thomas came out strongly in favor of the common carrier approach to regulating Twitter, Facebook, and Google.

Justice Thomas’ reasoning follows mine quite closely:

It changes nothing that these platforms are not the sole means for distributing speech or information. A person always could choose to avoid the toll bridge or train and instead swim the Charles River or hike the Oregon Trail. But in assessing whether a company exercises substantial market power, what matters is whether the alternatives are comparable. For many of today’s digital platforms, nothing is.

Justice Thomas also notes, as I did, that limiting common carriers’ right to exclude is a longstanding element of the American and British legal systems: “our legal system and its British predecessor have long subjected certain businesses, known as common carriers, to special regulations, including a general requirement to serve all comers.” To this Justice Thomas adds, somewhat more perfunctorily, more modern public accommodation laws as a restriction on business’ ability to exclude. Common carriage is a narrower conception because it generally requires some market power on the part of the company, and for this reason I find it a superior basis for regulating social media companies. But regardless, this is hardly a radical proposal, and is in fact deeply embedded in law dating from a classical liberal–i.e., laissez faire–period.

Thomas notes that imposing such a restriction is up to the legislature. Alas, that’s not likely, especially given the influence the social media and tech companies have on the legislature, and more ominously, the clearly expressed interest of the party in power to use the social media and tech companies to exclude and censor speech by their political opponents–whom I daresay they consider political enemies, and indeed, beyond the pale and deserving of banishment from the public sphere.

The leftist party in power cannot restrict speech directly–that would violate the First Amendment. And this is where Twitter, Facebook, Google, Amazon etc. can be quite useful to the leftist party in power. As private entities, their exclusion of speech from their platforms does not facially violate 1A. So note with care the pressure that leftist legislators are putting on these companies to police speech even more than they do already. These members of the party in power are outsourcing censorship to ostensibly private entities as a way of circumventing the Constitution.

As their previous behavior indicates, moreover, these companies do not necessarily need much prompting. They are ideologically aligned with the party in power, and are implementing much politically-slanted censorship of their own volition.

This symbiosis between the private businesses and the governing party is the essence of the political-economic model of fascism. At times, the relationship looks like Escher’s Drawing Hands, in which each hand draws the other.

Which hand is the Democratic Party, and which one is Twitter et al? That is, is the Democratic Party driving social media companies, or are social media companies pulling the strings of the Democratic Party?

The answer is both–as in the Escher. And that is the essence of the political-economic model of fascism. Corporations are acting as political actors, and politicians and those in government are using corporations to advance their political agenda. This is true in any political system, but the symbiosis is far, far stronger in fascist ones, and the antagonisms far weaker, than in more liberal polities.

And as we’ve seen in recent months, it’s not just social media and tech companies that are involved. Corporate America generally has adopted a leftist political agenda, is advancing this agenda, and is attempting to pressure governments–especially state governments–to do so as well.

The injection of companies like the major airlines–all of them–and Coca Cola into the Georgia (and now Texas) voting law controversies is the most recent example. But entertainment companies–including professional sports as well as Hollywood, music businesses, etc.–are also exerting substantial political muscle.

Corporatism–a strong symbiotic relationship between government and powerful economic entities, especially corporations–is the essence of fascist economic systems. That is exactly what “capitalism” in the United States is today.

In such a system, the public-private dichotomy does not exist, and libertarians/classical liberals who act as if it does are useful idiots for the corporatists.

This model is also a good characterization of the Chinese system, which, although ostensibly communist, has become clearly corporatist/fascist in the post-Deng era. Interestingly, the main struggle today in China is between the state/Party and large corporations that Xi and his minions believe are too powerful and hence too independent of the state. Even in symbiotic relationships, there is a struggle for power–and for control over the rents.

So while I applaud Justice Thomas for advocating legislation to impose common carrier status on tech behemoths, it must be acknowledged that this proposal is naive in the current environment. The mutual interest between the current party in power and corporate interests in advancing political agendas generally, and suppressing speech in particular (in part because it also helps advance those agendas), is so great that such legislation cannot come to pass today. It is doubtful that it would have come to pass even had Trump won reelection. The slide into corporatism/economic fascism has progressed too far to hold out much hope that it can be reversed, absent some social convulsion.


March 15, 2021

Deliver Me From Evil: Platts’ Brent Travails

Filed under: Commodities,Derivatives,Economics,Exchanges,Politics,Regulation — cpirrong @ 6:41 pm

In its decision to speedily change the Dated “Brent” crude oil assessment–adding US crude and moving to a CIF basis–Platts hit a hornets’ nest with a stick and now is running away from the angry hive.

Platts’ attempt to change the contract makes sense. Dated “Brent” is an increasingly, well, dated benchmark due to the inexorable decline in North Sea production volumes, something I’ve written about periodically for the last 10 years or so. At present, only about one cargo per day is eligible, and this is insufficient to prevent squeezes (some of which have apparently occurred in recent months). The only real solution is to add more supply. But what supply?

Two realistic alternatives were on offer: to add oil from Norway’s Johan Sverdrup field, or to add non-North Sea oil (such as West African or US). Each presents difficulties. The Sverdrup field’s production is in the North Sea, but it is heavier and more sour than other oil currently in the eligible basket. West African or US oil is comparable in quality to the current Brent basket, but it is far from the North Sea.

Since derivatives prices converge to the cheapest-to-deliver, just adding either Sverdrup or US oil on a free on board basis to the basket would effectively turn Dated Brent into Dated Sverdrup or Dated US: Sverdrup oil would be cheaper than other Brent-eligible production because of its lower quality, and US oil would be cheaper due to its greater distance from consumption locations. So to avoid creating a US oil or Sverdrup oil contract masquerading as a Brent contract, Platts needs to establish pricing differentials to put these on an even footing with legacy North Sea grades.

In the event, Platts decided to add US oil. In order to address the price differential issue, it decided to move the pricing basis from free on board (FOB) North Sea, to a cost, insurance, and freight (CIF) Rotterdam basis. It also announced that it would continue to assess Brent FOB, but this would be done on a netback basis by subtracting shipping costs from the CIF Rotterdam price.
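The pricing logic can be sketched with toy numbers (every grade name, price, and freight rate below is hypothetical, not a Platts assessment): on a CIF basis each grade’s delivered cost is FOB plus freight, the benchmark converges to the cheapest delivered grade, and the continuing FOB assessment is backed out by subtracting freight.

```python
# Sketch of cheapest-to-deliver and netback logic.
# All prices, grades, and freight rates are hypothetical, not Platts assessments.

grades = {
    # grade: (FOB price $/bbl, freight to Rotterdam $/bbl)
    "Forties (North Sea)": (64.00, 1.00),
    "WTI Midland (US Gulf Coast)": (63.50, 2.20),  # farther away => higher freight
}

# On a plain FOB basis the distant grade looks cheapest and would take over the
# benchmark; on a CIF Rotterdam basis, freight puts grades on an even footing.
cif = {g: fob + frt for g, (fob, frt) in grades.items()}

# Derivatives prices converge to the cheapest-to-deliver grade.
benchmark_grade = min(cif, key=cif.get)
benchmark_cif = cif[benchmark_grade]

# The continuing FOB assessment is a netback: benchmark CIF minus freight.
fob_netback = {g: benchmark_cif - frt for g, (_, frt) in grades.items()}

print(benchmark_grade, benchmark_cif)  # Forties (North Sea) 65.0
print(fob_netback)
```

With these made-up numbers the North Sea grade remains cheapest-to-deliver on a delivered basis even though the US grade is cheaper FOB, which is exactly the distortion the CIF move is meant to neutralize.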

The proposal makes good economic sense. And I surmise that’s exactly why it is so controversial.

This cynical assessment is based on a near decade of experience (from 1989 to 1997) in redesigning legacy futures contracts. From ’89-’91, in the aftermath of the Ferruzzi soybean corner, I researched and authored a report (published here–cheap! only one left in stock!) commissioned by the CBOT that recommended adding St. Louis as a corn and soybean delivery point at a premium to Chicago; in ’95-’96, in the aftermath of a corner of canola, I advised the Winnipeg Commodity Exchange about a redesign of its contract; in ’97, I was on the Grain Delivery Task Force at the CBOT which radically redesigned the corn and beans contracts–a design that remains in use today.

What did I learn from these experiences? Well, a WCE board member put it best: “Why would I want a more efficient contract? I make lots of money exploiting the inefficiencies in the contract we have.”

In more academic terms: rent seeking generates opposition to changes that make contracts more efficient, and in particular, more resistant to market power (squeezes, corners and the like).

Some anecdotes. In the first experience, many members of the committee assigned to consider contract changes–including the chairman (I can name names, but I won’t!)–were not pleased with my proposal to expand the “economic par” delivery playground beyond Chicago. During the meeting where I presented my results, the committee chairman and I literally almost came to blows–the reps from Cargill and ADM bodily removed the chairman from the room. (True!)

The GDTF was formed only because a previous committee formed to address the continued decline of the Chicago market was deadlocked on a solution. The CBOT had followed the tried-and-true method of getting all the big players into the room, but their interests were so opposed that they could not come to agreement. Eventually the committee proposed some Frankenstein’s monster that attempted to stitch together pieces from all of the proposals of the members, which nobody liked. (It was the classic example of a camel being a horse designed by committee.) It was not approved by the CBOT, and when the last Chicago delivery elevator closed shortly thereafter, the CFTC ordered the exchange to change the contract design, or risk losing its contract market designation.

Faced with this dire prospect, CBOT chairman Pat Arbor (a colorful figure!) decided to form a committee that included none of the major players like Cargill or ADM. Instead, it consisted of Bill Evans from Iowa Grain, Neal Kottke of Kottke Associates (an independent FCM), independent grain trader Tom Neal, and some outsider named Craig Pirrong. (They were clearly desperate.)

In relatively short order we hashed out a proposal for delivery on the Illinois River, at price differentials reflecting transportation costs, and a shipping certificate (as opposed to warehouse receipt) delivery instrument. After a few changes demanded by the CFTC (namely extending soybean delivery all the way down the River to St. Louis, rather than stopping at Peoria–or was it Pekin?), the design was approved by the CBOT membership and went into effect in 1998.

One thing that we did that caused a lot of problems–including in Congress, where the representative from Toledo (Marcy Kaptur) raised hell–was to drop Toledo as a delivery point. This made economic sense, but it did not go over well with certain entities on the shores of Lake Erie. Again–the distributive effects raised their ugly heads.

The change in the WCE contract–which was also eminently sensible (of course, since it was largely my idea!)–also generated a lot of heat within the exchange, and politically within Alberta, Manitoba, and Saskatchewan.

So what did I learn? In exchange politics, as in politics politics, efficiency takes a back seat to distributive considerations. This insight inspired and informed a couple of academic papers.

I would bet dimes to donuts that’s exactly what is going on with Platts and Brent. Platts’ proposal for a more efficient pricing mechanism gores some very powerful interests’ oxen.

Indeed, the rents at stake in Brent are far larger than those even in CBOT corn and beans, let alone tiny canola. The Brent market is vastly bigger. The players are bigger–Shell or BP or Glencore make even 1997 era Cargill look like a piker. Crucially, open interest in Brent-based instruments extends out until 2029: open interest in the ags went out only a couple of years.

My surmise is that the addition of a big new source of deliverable supply (the US) would undercut the potential for delivery games exploiting “technical factors” as they are sometimes euphemistically called in the North Sea. This would tend to reduce the rents of those who have a comparative advantage in playing these games.

Moreover, adding more deliverable supply than people had anticipated would be available when they entered into contracts last year or the year before or the year before . . . and which extend out for years would tend to cause the prices for these longer dated contracts to fall. This would transfer wealth from the longs to the shorts, and there is no compensation mechanism. There would be big winners and losers from this.

It is these things that stirred up the hornets, I am almost sure. I don’t envy Platts, because Dated Brent clearly needs to be fixed, and fast (which no doubt is why Platts acted so precipitously). But any alternative that fixes the problems will redistribute rents and stir up the hornets again.

In 1997 the CBOT got off its keister because the CFTC ordered it to do so, and had the cudgel (revoking contract designation) to back up its demand. There’s no comparable agency with respect to Brent, and in any event, any such agency would be pitted against international behemoths, making it doubtful it could prevail.

As a result, I expect this to be an extended saga. Big incumbent players lose too much from a meaningful change, so change will be slow in coming, if it comes at all.


February 22, 2021

GameStop: Round Up the Usual Suspects

Filed under: Clearing,Derivatives,Economics,Politics,Regulation — cpirrong @ 7:52 pm

Shuttling between FUBARs, it’s back to GameStop!

Last week there were House hearings regarding the GameStop saga. As is usual with these things, they were more a melange of rampant narcissism and political posing and outright stupidity than a source of information. Everyone had an opportunity to identify and then flog their favorite villains and push their favorite “solutions.” All in all, very few constructive observations or remedies came out of the exercise. I’m sure you’re shocked.

Here are a few of the main issues that came up.

Shortening the securities settlement cycle. The proximate cause of Robinhood’s distress was a huge margin call. Market participants post margins to mitigate the credit risk inherent in a two-day settlement cycle. Therefore, to reduce margins and big margin calls, let’s reduce the settlement cycle! Problem solved!

No, problem moved. Going to T+0 settlement would require buyers to stump up the cash and sellers to secure the stock on the same day of the transaction. Almost certainly, this wouldn’t result in a reduction of credit in the system, but just cause buyers to borrow money to meet their payment obligations. Presumably the lenders would not extend credit on an unsecured basis, but would require collateral with haircuts, where the haircuts will vary with risk: bigger haircuts would require the buyers to put up more of their own cash.

I would predict that to a first approximation the amount of credit risk and the amount of cash buyers would have to stump up would be pretty much the same as in the current system. That is, market participants would try to replicate the economic substance of the way the market works now, but use different contracting arrangements to obtain this result.

I note that when payments systems moved to real time gross settlement–reducing the credit risk that participants bore under deferred net settlement with netting–central banks stepped in to offer intraday credit to keep the system working.

It’s also interesting to note that what DTCC did with GameStop is essentially move to T+0 settlement by requiring buyers to post margin equal to the purchase price:

Robinhood made “optimistic assumptions,” Admati said, and on Jan. 28, Tenev woke up at 3:30 a.m. and faced a public crisis. With a demand from a clearinghouse to deposit money as a safety measure hedging against risky trades, he had to get $1 billion from investors. Normally, Robinhood only has to put up $2 for every $100 to vouch for their clients, but now, the whole $100 was required. Thus, trading had to be slowed down until the money could be collected.

That is, T+0 settlement is more liquidity/cash intensive. As a result, a movement to such a system would lead to different credit arrangements to provide the liquidity.
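Using the percentages from the quote above (roughly 2% of purchase value normally, the full 100% under the stressed requirement), the cash impact is easy to sketch; the order-flow figure below is made up for illustration, not Robinhood’s actual book.

```python
# Sketch of the clearing-margin arithmetic. Rates follow the quote above
# (2% normally, 100% under the stressed requirement); the volume is hypothetical.

def margin_required(net_buy_value, margin_rate):
    """Cash a broker posts to the clearinghouse against unsettled customer buys."""
    return net_buy_value * margin_rate

net_buys = 1_000_000_000  # hypothetical $1bn of unsettled customer purchases

normal = margin_required(net_buys, 0.02)    # $20m under the usual 2% rate
stressed = margin_required(net_buys, 1.00)  # $1bn: the full purchase price,
                                            # i.e. T+0 in cash terms

print(f"normal: ${normal:,.0f}, stressed: ${stressed:,.0f}")
```

The fifty-fold jump in required cash, not any change in the underlying credit risk, is what forced the scramble for $1 billion overnight.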

As always, you have to look at how market participants will respond to proposed changes. If you require them to pay cash sooner by changing the settlement cycle, you have to ask: where is the cash going to come from? The likely answer: the credit extended through the clearing system will be replaced with some other form of credit. And this form is not necessarily preferable to the current form.

Payment for order flow (“PFOF”). There is widespread suspicion of payment for order flow. Since Robinhood is a major seller of order flow, and since Citadel is a major buyer, there have been allegations that this practice is implicated in the fiasco:

Reddit users questioned whether Citadel used its power as the largest market maker in the U.S. equities market to pressure Robinhood to limit trading for the benefit of other hedge funds. The theory, which both Robinhood and Citadel criticized as a conspiracy, is that Citadel Securities gave deference to short sellers over retail investors to help short sellers stop the bleeding. The market maker also drew scrutiny because Citadel, the hedge fund, together with its partners, invested $2 billion into Melvin Capital Management, which had taken a short position in GameStop.

To summarize the argument, Citadel buys order flow from Robinhood, Citadel wanted to help out its hedge fund bros, something, something, something, so PFOF is to blame. Association masquerading as causation at its worst.

PFOF exists because when some types of customers are cheaper to service than others, competitive forces will lead to the design of contracting and pricing mechanisms under which the low cost customers pay lower prices than the high cost customers.

In stock trading, uninformed traders (and going out on a limb here, but I’m guessing many Robinhood clients are uninformed!) are cheaper to intermediate than better informed traders. Specifically, market makers incur lower adverse selection costs in dealing with the uninformed. PFOF effectively charges lower spreads for executing uninformed orders.

This makes order flow on lit exchange markets more “toxic” (i.e., it has a higher proportion of informed order flow because some of the uninformed flow has been siphoned off), so spreads on those markets go up.
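The mechanism can be illustrated with a toy breakeven-spread calculation (a stylized Glosten-Milgrom-type sketch with made-up parameter values, not a calibrated model):

```python
# Toy adverse-selection model: a fraction alpha of order flow is informed and
# costs the market maker `loss` per share; uninformed flow pays the half-spread.
# Breakeven: (1 - alpha) * s = alpha * loss  =>  s = alpha * loss / (1 - alpha)

def breakeven_half_spread(alpha_informed, loss_per_share):
    return alpha_informed * loss_per_share / (1 - alpha_informed)

loss = 0.10  # hypothetical loss per share against informed traders

# Before internalization: 20% informed, 80% uninformed on the lit market.
before = breakeven_half_spread(0.20, loss)

# PFOF siphons off half of the uninformed flow (0.40 of the original total),
# so the informed share of what remains rises to 0.20 / (0.20 + 0.40) = 1/3.
after = breakeven_half_spread(0.20 / (0.20 + 0.40), loss)

print(round(before, 4), round(after, 4))  # 0.025 0.05
```

In this sketch, diverting half the uninformed flow doubles the breakeven half-spread on the lit market: the flow left behind is more toxic, so quoted spreads widen.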

And I think this is what really drives the hostility to PFOF. The smarter order flow that has to trade on lit markets doesn’t like the two tiered pricing structure. They would prefer order flow be forced onto lit markets (by restricting PFOF). This would cause the uninformed order flow to cross subsidize the more informed order flow.

The segmentation of order flow may make prices on lit markets less informative. Although the default response among finance academics is to argue that more informative is better, this is not generally correct. The social benefits of more accurate prices (e.g., whether they lead to better investment decisions) have not been quantified. Moreover, informed trading (except perhaps, ironically, for true insider trading) involves the use of real resources (on research, and the like). Much of the profit of informed trading is a transfer from the uninformed, and to the extent it is, it is a form of rent seeking. So the social ills of less informative prices arising from the segmentation of order flow are not clearcut: less investment in information may actually be a social benefit.

There is a question of how much of the benefit of PFOF gets passed on to retail traders, and how much the broker pockets. Given the competitiveness of the brokerage market–especially due to the entry of the likes of Robinhood–it is likely a large portion gets passed on to the ultimate customer.

In sum, don’t pose as a defender of the little guy when attacking PFOF. They are the beneficiaries. Those attacking PFOF are actually doing the bidding of large sophisticated and likely better informed investors.

HFT. This one I really don’t get. There is HFT in the stock market. Something bad happened in the stock market. Therefore, HFT caused the bad thing to happen.

The Underpants Gnomes would be proud. I have not seen a remotely plausible causal chain linking HFT to Robinhood’s travails, or the sequence of events that led up to them.

But politicians gonna politician, so we can’t expect high order logical thinking. The disturbing thing is that the high order illogical thinking might actually result in policy changes.


February 21, 2021

Touching the Third Rail: The Dangers of Electricity Market Design

In the aftermath of the Texas Freeze-ageddon much ink and many pixels have been spilled about its causes. Much–most?–of the blame focuses on Texas’s allegedly laissez faire electricity market design.

I have been intensely involved (primarily in a litigation context) in the forensic analysis of previous extreme electricity market shocks, including the first major one (the Midwest price spike of June 1998) and the California crisis. As an academic I have also written extensively about electricity pricing and electricity market design. Based on decades of study and close observation, I can say that electricity market design is one of the most complex subjects in economics, and that one should step extremely gingerly when speaking about the topic, especially as it relates to an event for which many facts remain to be established.

Why is electricity market design so difficult? Primarily because it requires structuring incentives that affect behavior over both very long horizons (many decades, because investments in generation and transmission are very long lived) and extremely short horizons (literally seconds, because the grid must balance at every instant in time). Moreover, there is an intimate connection between these extremely disparate horizons: the mechanisms designed to handle the real time operation of the system affect the incentives to invest for the long run, and the long run investments affect the operation of the system in real time.

Around the world many market designs have been implemented in the approximately 25 year history of electricity liberalization. All have been found wanting, in one way or another. They are like Tolstoy’s unhappy families: all are unhappy in their own way. This unhappiness is a reflection of the complexity of the problem.

Some were predictably wretched: California’s “reforms” in the 1990s being the best example. Some were reasonably designed, but had their flaws revealed in trying conditions that inevitably arise in complex systems that are always–always–subject to “normal accidents.”

From a 30,000 foot perspective, all liberalized market designs attempt to replace centralization of resource allocation decisions (as occurs in the traditional integrated regulated utility model) with allocation by price. The various systems differ primarily in what they leave to the price system, and what they do not.

As I wrote in a chapter in Andrew Kleit’s Energy Choices (published in 2006) the necessity of coordinating the operation of a network in real time almost certainly requires a “visible hand” at some level: transactions costs preclude the coordination via contract and prices of hundreds of disparate actors across an interconnected grid in real time under certain conditions, and such coordination is required to ensure the stability of that grid. Hence, a system operator–like ERCOT, or MISO, or PJM–must have residual rights of control to avoid failure of the grid. ERCOT exercised those residual rights by imposing blackouts. As bad as that was, the alternative would have been worse.

Beyond this core level of non-price allocation, however, the myriad of services (generation, transmission, consumption) and the myriad of potential conditions create a myriad of possible combinations of price and non-price allocation mechanisms. Look around the world, and you will see just how diverse those choices can be. And those actual choices are just a small subset of the possible choices.

As always with price driven allocation mechanisms, the key thing is getting the prices right. And due to the nature of electricity, this involves getting prices right at very high frequency (e.g., the next five minutes, the next hour, the next day) and at very low frequency (over years and decades). This is not easy. That is why electricity market design is devilish hard.

One crucial thing to recognize is that constraints on prices in some time frames can interfere with decisions made over other horizons. For example, most of the United States (outside the Southeast) operates under some system in which prices day ahead or real time are the primary mechanism for scheduling and dispatching generation over short horizons, but restrictions on these prices (e.g., price caps) mean that they do not always reflect the scarcity value of generating or transmission capacity. (Much of the rest of the world does this too.) As a result, these prices provide too little incentive to invest in capacity, and the right kinds of capacity. The kludge solution to this is to create a new market, a capacity market, in which regulators decide how much capacity of what type is needed, and mandate that load-serving entities acquire the rights to such capacity through capacity auctions. The revenues from these auctions provide an additional incentive for generators to invest in the capacity they supply.

The alternative is a pure energy market, in which prices are allowed to reflect scarcity value–and in electricity markets, due to extremely inelastic demand and periodic extreme inelasticity of supply in the short run, that scarcity value can sometimes reach thousands of dollars per MWh.

Texas opted for the energy market model. However, other factors intervened to prevent prices from being right. In particular, heavy subsidies for renewables have systematically depressed prices, thereby undercutting the incentives to invest in thermal generation, and the right kind of thermal generation. This can lead to much bigger price spikes than would have occurred otherwise–especially when intermittent renewables output plunges.

Thus, a systematic downward price distortion can greatly exacerbate upward price spikes in a pure energy model. That, in a nutshell, is the reason for Texas’s recent (extreme) unhappiness.

As more information becomes available, it is clear that the initiator of the chain of events that left almost half the state in the dark for hours was a plunge in wind generation due to the freezing of wind turbines. Initially, combined cycle gas generation ramped up output dramatically to replace the lost wind output. But these resources could not sustain this effort because the cold-related disruptions in gas production, transmission, and distribution turned the gas generators into fuel limited resources. The generators hadn’t broken down, but couldn’t obtain the fuel necessary to operate.

It is certainly arguable that Texas should have recognized that the distortion in prices that arose from subsidization of wind (primarily at the federal level) that bore no relationship whatsoever to the social cost of carbon made it necessary to implement the capacity market kludge, or some other counterbalance to the subsidy-driven wrong prices. It didn’t, and that will be the subject of intense debate for months and years to come.

It is essential to recognize however, that the underlying reason why a kludge may be necessary is that the price wasn’t right due to government intervention. When deciding how to change the system going forward, those interventions–and their elimination–should be front and center in the analysis and debate, rather than treated as sacrosanct.

There is also the issue of state contingent capacity. That is, the availability of certain kinds of capacity in certain states of the world. In electricity, the states of the world that matter are disproportionately weather-related. Usually in Texas you think of hot weather as being the state that matters, but obviously cold weather matters too.

It appears that the weatherization of power plants per se was less of an issue last week than the weatherization of fuel supplies upstream from the power plants. It is an interesting question whether the authority of ERCOT–the operator of the Texas grid–extends to mandating the technology utilized by gas producers. My (superficial) understanding is that it is unlikely to, and that any attempt to do so would lead to a regulatory turf battle (with the Texas Railroad Commission, which regulates gas and oil wells in Texas, and maybe FERC).

There is also the question of whether in an energy only market generators would have the right incentive to secure fuel supplies from sources that are more immune to temperature shocks than Texas’s proved to be last week. Since such immunity does not come for free, generator contracts with fuel suppliers would require a price premium to obtain less weather-vulnerable supplies, and presumably a liability mechanism to penalize non-performance. The price premium is likely to be non-trivial. I have seen estimates that weatherizing Texas wells would cost on the order of $6-$9 million per well–which would at least double the cost of a well. Further, it would be necessary to incur additional costs to protect pipelines and gas processing facilities.

In an energy only market, the ability to sell at high prices during supply shortfalls would provide the incentive to secure supplies that allow producing during extreme weather events. The question then becomes whether this benefit times the probability of an extreme event is larger or smaller than the (non-trivial) cost of weatherizing fuel supply.
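The expected-value comparison described above can be sketched in a few lines of Python. Every number here is an illustrative assumption (the $6-9M weatherization range comes from the estimate cited earlier; the benefit, probability, and horizon are placeholders), and discounting is ignored:

```python
# Hypothetical back-of-the-envelope: does weatherizing a gas well pay?
# All inputs are illustrative assumptions, not estimates.

def expected_net_benefit(weatherize_cost, event_benefit, event_prob, horizon_years):
    """Expected benefit of weatherizing over a horizon, minus the upfront cost.

    event_benefit: revenue preserved if an extreme freeze hits and the
                   weatherized asset keeps producing.
    event_prob:    annual probability of such a freeze.
    """
    expected_benefit = event_benefit * event_prob * horizon_years
    return expected_benefit - weatherize_cost

# Illustrative well: $7.5M to weatherize (midpoint of the $6-9M range cited),
# $20M of sales preserved in a freeze, evaluated over 20 years.
for p in (0.01, 0.05, 0.10):
    nb = expected_net_benefit(7.5e6, 20e6, p, 20)
    print(f"annual freeze prob {p:.0%}: expected net benefit ${nb/1e6:+.1f}M")
```

The point of the sketch is that the answer flips sign entirely on the probability input–precisely the input that is hardest to estimate, which is why who estimates it (generators or regulators) matters so much.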

We have a pretty good idea, based on last week’s events, of what the benefit is. We have a pretty good idea of the cost of hardening fuel supplies and generators. The most imprecise input to the calculation is the probability of such an extreme event.

Then the question of market design–and specifically, whether weatherization should be mandated by regulation or law, and what form that mandate should take–becomes whether generation operators or regulators can estimate that probability more accurately.

In full awareness of the knowledge problem, my priors are that multiple actors responding to profit incentives will do a better job than a single actor (a regulator) operating under low-powered incentives, and subject to political pressure (exerted by not just generators, but those producing, processing, and transporting gas, industrial consumers, consumer lobbyists, etc., etc., etc., as well). Put differently, as Hayek noted almost 75 years ago, the competitive process and the price system are a way of generating information and using it productively, and have proved far more effective in most circumstances than centralized planning.

I understand that this opinion will be met with considerable skepticism. But note a few things. For one, a regulator’s mistakes have systematic effects. Conversely, some private parties may overestimate the risk and others underestimate it: the composite signal is likely to be more accurate, and less vulnerable to the miscalculation of a single entity. For another, skeptics excoriate a regulator for its failures on the one hand, but confidently predict that some future regulator will get it right on the other. I’m the skeptic on that.

Recent events also raise another issue that could undermine reliance on the price system. Many very unfortunate people entered into contracts in which their electricity bills were tied to wholesale prices. As a result, they are facing bills for a few days of electricity running into the many thousands of dollars because wholesale prices spiked. This is indeed tragic for these people.

That spike, by the way, is up to $9,000/MWh–$9/KWh. Orders of magnitude bigger than what you usually pay.
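To see how a wholesale-indexed bill explodes, here is a back-of-the-envelope sketch in Python. The household usage and duration figures are assumptions; the $9,000/MWh figure is ERCOT’s systemwide offer cap:

```python
# Hypothetical bill on a wholesale-indexed retail plan during the freeze.
# Usage and duration are illustrative assumptions, not actual data.

CAP_PRICE = 9_000 / 1_000    # $/kWh when wholesale hits the $9,000/MWh cap
NORMAL_PRICE = 30 / 1_000    # $/kWh at a more typical $30/MWh wholesale price

daily_kwh = 80               # electrically heated home in a cold snap (assumed)
days_at_cap = 4

bill_at_cap = daily_kwh * days_at_cap * CAP_PRICE
bill_normal = daily_kwh * days_at_cap * NORMAL_PRICE
print(f"{days_at_cap} days at the cap: ${bill_at_cap:,.0f} "
      f"vs ${bill_normal:,.2f} at normal prices")
# 80 kWh/day for 4 days: $2,880 at the cap vs under $10 normally -- a 300x
# multiple, before delivery charges, and more still for heavier users.
```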

It is clear that the individuals who entered these contracts did not understand the risks. And this is totally understandable: if you are going to argue that regulators or generators underplayed the risks, you can’t believe that the typical consumer appreciated them either. I am sure there will be lawsuits relating in particular to the adequacy of disclosure by the energy retailers who sold these contracts. But even if the fine print in the contracts disclosed the risks, many consumers may not have understood them even if they read it.

One of the difficulties with getting prices right in electricity markets–one that has long plagued market design–is getting consumers to see the price signals so that they can limit use when supply is scarce. But this will periodically involve paying stratospheric prices.

From a risk bearing perspective this is clearly inefficient. The risk should be transferred to the broader financial markets (through hedging mechanisms, for instance) because the risk can be diversified and pooled in those markets. But this is at odds with the efficient consumption perspective. This is not a circle that anyone has been able to square heretofore.

Moreover, the likely regulatory response to the extreme misfortune experienced by some consumers will be to restrict wholesale prices so that they do not reflect scarcity value. That is, an energy only market has a serious time consistency problem: regulators cannot credibly commit to allow prices to reflect scarcity value, come what may. This means that an energy only market may not be politically sustainable, regardless of its economic merits. I strongly suspect that this will happen in Texas.

In sum, as the title of the book I mentioned earlier indicates, electricity market design is about choices. Moreover, those choices are often of the pick-your-poison variety. This means that avoiding one kind of problem–like what Texas experienced–just opens the door to other problems. Evaluation of electricity market design should not over-focus on the most recent catastrophe while being blind to the potential catastrophes lurking in alternative designs. But I realize that’s not the way politics work, and this will be an intensely political process going forward. So we are likely to learn the wrong lessons, or grasp at “solutions” that pose their own dangers.

As a starting point, I would undo the most clearcut cause of wrong prices in Texas–subsidization of wind and other renewables. Alas, even if stopped tomorrow, the baleful effect of those subsidies will persist long into the future, because they have affected investment decisions on the long horizon I mentioned earlier. But other measures–such as mandated reserve margins and capacity markets, or hardening fuel supplies–will also only have effects over long horizons. For better or worse, and mainly worse, Texas will operate under the shadow of political decisions made long ago. And made primarily in DC, rather than Austin.


February 18, 2021

How Low Can Prices Go, and Why?

Filed under: Climate Change,Economics,Energy,Politics,Regulation — cpirrong @ 6:04 pm

A quick follow up to the previous post.

I noted that negative prices have been a thing in Texas for years. Indeed they are in every market with substantial renewables penetration.

This is particularly true in the US, where the Production Tax Credit pays qualifying renewables facilities $23/MWh to produce, regardless of prices. Meaning that a recipient of the PTC will continue to produce even if prices are -$22.99/MWh.
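The production decision under the PTC reduces to a one-line condition: a subsidized unit’s effective marginal revenue is price plus credit, so (ignoring wind’s near-zero variable cost) it produces whenever the price exceeds minus the credit. A minimal sketch:

```python
# A PTC recipient's effective marginal revenue is price + credit, so it
# keeps producing whenever price > -credit. Credit value per the post: $23/MWh.
PTC = 23.0  # $/MWh

def produces(price, credit=PTC):
    """True if a subsidized unit still earns positive margin on its output
    (its own variable cost assumed near zero, as for wind)."""
    return price + credit > 0

print(produces(-22.99))  # True: still profitable to produce
print(produces(-23.01))  # False: only now does it pay to stop
```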

So why do prices go negative? In particular, why do other producers who do not get the credit continue to produce even when there are negative prices? Why don’t enough of them cut output to make sure that prices don’t fall below variable cost?

The answer in a word is: indivisibilities. Or, if you prefer, non-convexities.

Specifically, many thermal generators incur costs to shut down or start up. These are basically fixed costs, of the avoidable variety. A unit currently operating can avoid shutdown costs by continuing to operate. A unit currently idle can avoid startup costs by remaining idle. Minimum run times and ramping constraints are other examples of non-convexities.

So, for example, when demand is low and wind turbines continue to blend birds (and generate electricity), prices can go negative but a gas or coal or nuke plant may continue to operate (and incur fuel costs as well as incremental O&M) because it is cheaper to PAY to sell output (and pay variable costs as well) than it is to shut down.

If the cost of adjusting output of a plant to or from zero was zero, whenever prices fall below marginal operating cost the plant would shut down. This would put a floor on prices equal to marginal cost. However, if there is a fixed cost of adjusting output to or from zero, it can make sense to continue to operate even when prices do not cover variable costs–and when prices are negative–in order to avoid paying this cost of shutting down (and/or the cost of starting back up again when prices are higher).
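This shutdown-vs-operate logic can be sketched directly. The costs below are assumed numbers for illustration, not data for any actual plant:

```python
# Sketch of the non-convexity: with a fixed shutdown/startup cost, a thermal
# unit may rationally run at a loss through negative prices. Numbers assumed.

def run_through_trough(price, marginal_cost, hours, cycle_cost):
    """True if operating through a low-price spell beats shutting down.

    Loss from running:  (marginal_cost - price) * hours, per MW.
    Loss from stopping: the fixed cost of the shutdown/startup cycle.
    """
    loss_running = (marginal_cost - price) * hours
    return loss_running < cycle_cost

# A baseload unit with $20/MWh marginal cost and a $500/MW cycling cost
# facing -$10/MWh for 6 hours: loses $180/MW by running, $500/MW by cycling,
# so it keeps spinning.
print(run_through_trough(price=-10, marginal_cost=20, hours=6, cycle_cost=500))
# With a convex technology (cycle_cost == 0) it would shut down the moment
# price dipped below marginal cost.
```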

Generation technology is such that efficient baseload plants (i.e., units with lower per MWh variable costs) tend to have higher shutdown and startup costs, and more acute operating constraints that give rise to other forms of non-convexity. As in all things in life, there tends to be a trade-off: low variable costs must be traded off against higher avoidable costs/less flexibility to adjust output. Thus, negative prices hit such units especially hard. They are faced with the bleak choice between paying to sell what they produce, or paying a cost to avoid producing. Obviously this choice is bleaker, the costlier it is to avoid producing. For many, the cost of shutting down is big enough that they continue to spin even when prices are negative.

Economists have long known that non-convexities can interfere with the operation of a price system. If you look at classic Arrow-Debreu proofs of the welfare theorems (i.e., that competitive prices call for the efficient level of production and consumption), you’ll see that they assume that production technologies are convex. That is, they assume away things like shutdown and startup costs. When production technologies are characterized by non-convexities (e.g., fixed avoidable costs), the proofs don’t go through.

Indeed, an equilibrium in prices and output may not exist if indivisibility problems are sufficiently severe: in my earliest academic life, my work on applying core theory focused on this issue. If an equilibrium does exist, it may be inefficient.

Put simply, the invisible hand can get really shaky if indivisibility problems are severe.

Liberalized electricity markets (e.g., PJM and other ISOs) have devised various means of addressing these indivisibilities. The results are not first best, but the mechanisms allow an energy market with prices approximately equal to marginal cost to survive.

The subsidization of wind, especially through the PTC, greatly exacerbates indivisibility/non-convexity problems because its effects fall with particular force on generating units with more pronounced indivisibilities. These tend to be the most efficient, and also the ones most essential for maintaining reliable system operation.

This means that although renewables subsidies punish investment in thermal generation generally, they punish investment in units that operate nearly continuously at low cost with particular severity. Having these units available nearly 24/7/365 is vital for keeping electricity prices low, and for ensuring a highly reliable power system.

So the distortions caused by renewables subsidies, particularly of the pay-to-produce variety, are more severe than “we have too much renewables capacity and too little thermal capacity.” Yes, that’s a problem, but the distorted price signals also distort the types of generation invested in. In particular, they are especially punitive to generation with more acute indivisibilities. Since these also tend to be low operating cost, high reliability technologies, that is a very costly distortion indeed.


Who Is To Blame for SWP’s (and Texas’s) Forced Outage?

Filed under: Climate Change,Economics,Energy,Politics,Regulation — cpirrong @ 12:44 pm

I am back following a forced outage, due to forced outages of Texas electricity generators caused by the cold snap–brutal by Texas standards, routine compared to what I experienced in my 40+ years up north–that is just relaxing its grip. Having some foresight, I had laid in some firewood, and that kept things from getting unbearable. Other than the power outage, water pressure was an issue: I thought my faucets needed a prostate check, but as of about 10AM the flow is back.

So, with fingers crossed, I have the opportunity to comment on what happened. As with so many things–everything?–today, the commentary has been highly partisan, and largely wrong. Blame wind power (or the lack thereof)! Uh-uh! Blame fossil fuel generation!

The facts are fairly straightforward. In the face of record demand (reflected in a crazy spike in heating degree days), supply crashed. Supply from all sources. Wind, but also thermal (gas, nuclear, and coal). About 25GW of thermal capacity was offline, due to a variety of weather-related factors. These included most notably steep declines in natural gas production due to well freeze-offs and temperature-related outages of gas processing plants, which combined to turn gas powered units into energy limited, rather than capacity limited, resources. They also included frozen instrumentation, water issues, and so on.

Wind was down too. Wind defenders have been saying that wind did great! because it sucked less than ERCOT (the Electric Reliability Council of Texas) had forecast; that is, wind generation was somewhat higher than the low levels that ERCOT had predicted. The defenders were spinning, even if the turbines were not.

However, wind performance was objectively worse than thermal. In the weeks prior to the Big Freeze, wind was operating at ~50-65 percent of installed capacity, and supplying ~40-60 percent of Texas load. When the freeze hit on Monday (and I was throwing another log on the fire), due to turbines freezing, capacity utilization fell to ~5-10 percent, and wind was generating ~3-10 percent of ERCOT load. Meaning that the relative performance of wind vs. thermal was worse during the cold wave, even as bad as thermal performance was. Further meaning that if wind had represented a larger fraction of Texas generating capacity, the situation would have been even grimmer.

The last few days wind defenders have been saying that the problem wasn’t with wind per se, but the failure to adequately winterize wind generators in Texas. After all, there are windmills in Antarctica! (Not to mention Sweden, etc.)

This brings to mind what Adam Smith wrote in the Wealth of Nations:

By means of glasses, hotbeds, and hotwalls, very good grapes can be raised in Scotland, and very good wine too can be made of them at about thirty times the expense for which at least equally good can be brought from foreign countries.

That is, you need to consider cost. Yes, winterizing windmills to withstand the conditions observed in Texas this week is inside the production possibilities frontier, but winterizing is not free. It is a question of whether the benefits exceed the cost.

The same thing is true with regards to thermal generation (and natural gas production). After all, power plants in far colder climes (it was below zero in Missouri, for example) hummed along in even more frigid conditions. Similarly, gas continues to flow every year in winter conditions in Canada and Siberia. But achieving these results is not free. It is a question of cost vs. benefit.

The cost of not winterizing power plants that shut down due to temperature-related outages (rather than limitations on fuel supply) was certainly material. Power prices spiked to around $9,000/MWh, and were routinely over $1,000/MWh. For a 500MW plant, losing an hour at a $9,000 price means $4.5 million in revenue forgone. Even at $1,000, that’s $500K up the flue. (That’s the gross loss. The net loss is harder to calculate, given that natural gas prices also spiked.)
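The forgone-revenue arithmetic above, checked in a couple of lines:

```python
# Gross revenue forgone by an offline plant: capacity x price x hours.
def revenue_forgone(capacity_mw, price_per_mwh, hours=1):
    return capacity_mw * price_per_mwh * hours

print(revenue_forgone(500, 9_000))  # $4,500,000 for one hour at the cap
print(revenue_forgone(500, 1_000))  # $500,000 for one hour at $1,000/MWh
```

Note this is the gross figure; as the parenthetical above says, the net loss would deduct the (spiking) cost of fuel the plant did not burn.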

That’s a lot of money, but whether it would have been worthwhile to incur the cost to ensure operation under the conditions we observed also depends on the probability of the event. Given the extremes observed, the probability is pretty small. Meaning that it might have been rational for generators to forgo the expense: a zero failure rate is never optimal. This is in contrast to a generator in, say, Minnesota, for which such conditions are the norm.

I would imagine that there will be a pretty intense review of utilities’ decisions regarding winterizing their plants. The cost should be fairly easy to estimate. By applying market prices or the value of lost load (VOLL), it should be similarly straightforward to estimate the cost of such weather-induced outages. The probability, however, will be much harder. It is inherently difficult to estimate the probability of extreme events, especially when they are seasonal in nature.

Similar considerations hold for gas processing plants and gas wells. The opportunity cost, and the cost of upgrades, are fairly straightforward to quantify. The probability that the upgrades will actually pay off (by avoiding shutdowns) is far more amorphous.

The events of this week also bring to the fore longstanding debates regarding the appropriate generation mix in Texas. Yes, thermal experienced unprecedented outages, but as noted above, it performed both absolutely (measured by capacity utilization) and relatively (measured by decline in utilization) better than wind. Texas would have been better off with less wind and more thermal. Maybe not enough to avoid blackouts altogether, but enough to mitigate substantially their severity.

Texas has had longstanding concerns about reserve margins. The main drivers have been the retirement of substantial amounts of coal generating capacity, and relatively low rates of increase in natural gas generation (a measly 3.5 percent over the past 4 years) at the same time wind capacity has more than doubled and solar capacity has increased by 2000 percent.

The problems here are twofold. First, wind and solar availability and output are often negatively correlated with demand. (Solar wasn’t doing much at 10PM on Monday, now was it?) Second, and more insidiously, wind and solar generation depress prices–often to below zero–at other times, which undermines the economics of thermal generation. Hence, the low rate of investment in gas, and the actual disinvestment in coal.

As I said, this is a longstanding problem. I remember hosting a roundtable on this issue at UH in 2005 or 2006. Generators were already raising alarms that negative prices were a powerful disincentive to investment.

Things have only worsened since, and perverse policy is to blame. It is unarguable that wind and solar capacity have increased to extremely inefficient levels due to lavish subsidies, especially at the federal level. As a result, Texas has a grotesquely inefficient resource mix.

And with the new administration, the outlook is even worse. It has embraced increasing demand for electricity (electrify everything!–echoing the malign and evil Bill Gates) and subsidizing the production of electricity using unreliable renewables.

Texas’s travails raise questions about the viability of ERCOT’s “energy only” market design, in which generator revenues are solely from the sale of energy (or ancillary services). In this model, price spikes are intended to incentivize investment in generation (and upgrades to enhance availability rates). But price signals distorted by excessive renewables are a strong disincentive to investment.

The standard kludge in these circumstances is capacity requirements plus a capacity market. This was mooted in my roundtable so many years ago. If price signals are allowed to work, a capacity market is unnecessary and inefficient. But prices have been so distorted that it will receive serious attention going forward.

This is unfortunate, in the extreme, as the better approach would be to destroy the price distortions at their source–subsidies for renewables. Alas, in the current political environment it is likely that the nation will move strongly in the opposite direction, making the problem worse not better. Perhaps Texas could find ways of counteracting national policies–e.g., by imposing a state “reliability tax” on renewables–but this is likely to be politically impossible (although it would be a nice illustration of the theory of the second best!) Meaning that in the end, we will kludge our way to increasing reserve margins.

Not a cheery picture, but what is these days?


February 1, 2021

Battle of the Borgs

Filed under: Clearing,Commodities,Derivatives,Economics,Exchanges,Regulation — cpirrong @ 6:39 pm

One metaphor that might shed some light on how seemingly small events can have cascading–and destructive–effects in financial markets is to think of the financial system as consisting of borgs programmed to ensure their survival at all costs.

One type of borg is the clearinghouses/CCP borg. The threat to them is the default of their counterparties. They use margins to protect against these defaults (thereby creating a loser pays/no credit system). When volatility increases, or gap risk increases, or counterparty concentration risk increases–or all three increase–the CCP Borg responds to this greater risk of credit loss by raising margins–sometimes by a lot–in order to protect itself.

This puts other borgs (e.g., Hedge Fund Borgs) under threat. They try to borrow money to pay the CCP Borg’s margin demands. Or they sell liquid assets to raise the cash.

These actions can move prices more–including the prices of things that are totally different from what caused the CCP Borg to raise margins in the first place. This can cause increases in volatility that trigger reactions by other Managed Money Borgs. For example, these Borgs may utilize a Value-at-Risk system to detect threats, one programmed to cause the MM Borg to reduce positions (i.e., try to buy and sell stuff) in order to reduce VaR, which can move prices further, triggering more volatility. Moreover, the simultaneous buying and selling of a lot of various things by myriad parties can affect correlations between the prices of these various things. And correlation is an input into the borgs’ models, so this can lead to more borg buying and selling.

All of these price changes and volatility changes can impact other borgs. For example, increases in volatilities and correlations in many assets that result from Managed Money Borgs’ buying and selling will feed back to the CCP Borgs, whose self-defense models are likely to require them to increase their margins on many more instruments than they increased margins on in the first place.

This is how seemingly random, isolated shocks like retail trader bros piling into heavily shorted, but seemingly trivial, stocks can spill over into the broader financial system. Borgs programmed to survive, acting in self-defense, take actions that benefit themselves but have detrimental effects on other borgs, who act in self-defense, which can have detrimental effects on other borgs, and . . . you get the picture.

This is a quintessential example of “normal accidents” in a complex system with tightly coupled components. Other examples include reactor failures and plane crashes.

I note–again, reprising a theme of the Frankendodd Years of this blog–that clearing and margins are a major reason for tight coupling, and hence greater risk of normal accidents.

I note further that it is precisely the self-preservation instincts of the borgs that make it utterly foolish and clueless to say that creating stronger borgs with more powerful tools of self-preservation, and which interact with other borgs, will reduce systemic risk. This is profoundly unsystemic thinking precisely because it views the borgs in isolation and ignores how the borgs all interact in a tightly coupled system. Making borgs stronger can actually make things worse when their self-preservation programs kick in, and the self-preservation of one borg causes it to attack other borgs.

Why do teenagers in slasher flicks always go down into the dark basement after five of their friends have been horribly mutilated? Well, that makes about as much sense as a lot of financial regulators have in the past decades. Despite literally centuries of bad historical experiences, they have continued to try to make stronger, mutually interacting, borgs. Like Becky’s trip down the dark basement stairs, it never ends up well.


January 29, 2021

GameStop-ped Up Robinhood’s Plumbing

The vertigo-inducing story of GameStop ramped it up to 11 yesterday, with a furore over Robinhood’s restriction of trading in GME to liquidation only, and the news that it had sold out of its customers’ positions without the customers’ permission. These actions are widely perceived as an anti-populist capitulation to Big Finance.

Well, they are in a way–but NOT the way that is being widely portrayed. What is going on is an illustration of the old adage that clearing and settlement in securities markets (like the derivatives markets) is like the plumbing–you take it for granted until the toilet backs up.

You can piece together that Robinhood was dealing with a plumbing problem from a couple of stories. Most notably, it drew down on credit lines and tapped some of its big executing firms (e.g., Citadel) for cash. Why would it need cash? Because it needs to post margin to the Depository Trust & Clearing Corporation (DTCC) on its open positions. Other firms are in similar situations, and directly or indirectly GME positions give rise to margin obligations to the DTCC.

The rise in price alone increased margin requirements because given volatility, the higher the price of a stock, the larger the dollar amount of potential loss (e.g., the VaR) that can occur prior to settlement. This alone jacks up margins. Moreover, the increase in GME volatility, and various adders to margin requirements–most notably for gap risk and portfolio concentration–ramp up margins even more. So the action in GME has led to a big increase in margin requirements, and a commensurate need for cash. Robinhood, as the primary venue for GME buyers, had/has a particularly severe position concentration/gap problem. Hence Robinhood’s scramble for liquidity.
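The margin mechanics described above can be caricatured in a few lines. This is emphatically NOT DTCC’s actual methodology–just a stylized model, with all parameters assumed, showing why a higher price, higher volatility, and gap/concentration adders compound on one another:

```python
# Stylized clearing margin model -- an assumption-laden sketch, not DTCC's
# real methodology: margin scales with price and volatility, with add-ons
# (as fractions of price) for gap risk and position concentration.

def margin_per_share(price, daily_vol, gap_addon=0.0, conc_addon=0.0):
    """Base margin covers a ~3-sigma one-day move; adders are illustrative."""
    base = 3 * daily_vol * price
    return base + (gap_addon + conc_addon) * price

# GME in calm times: $20 stock, 3% daily vol, no adders.
calm = margin_per_share(20, 0.03)
# GME in the squeeze: $350 stock, 25% daily vol, gap and concentration adders.
squeeze = margin_per_share(350, 0.25, gap_addon=0.10, conc_addon=0.15)
print(f"calm: ${calm:.2f}/share, squeeze: ${squeeze:.2f}/share, "
      f"ratio {squeeze/calm:.0f}x")
```

Even with made-up numbers, the qualitative point survives: margin per share rises far faster than the price itself, which is why a broker concentrated in GME buyers faced a sudden, enormous cash call.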

Given these circumstances, liquidity was obviously a constraint for Robinhood. Given this constraint, it could not handle additional positions, especially in GME or other names that create particularly acute margin/liquidity demands. It was already hitting a hard constraint. The only practical way that Robinhood (and perhaps other retail brokers, like TDAmeritrade) could respond in the short run was trading for liquidation only, i.e., allow customers to sell their existing GME positions, and not add to them.

By the way, trading for liquidation only is a tool in the emergency action toolbox that futures exchanges have used from time to time to deal with similar situations.

To extend the plumbing analogy, Robinhood couldn’t add any new houses to its development because the sewer system couldn’t handle the load.

I remember some guy saying that clearing turns credit risk into liquidity risk. (Who was that guy? Pretty observant!) For that’s exactly what we are seeing here. In times of market dislocation in particular, clearing, which is intended to mitigate credit risk, creates big increases in demand for liquidity. Those increases can cause numerous knock on effects, including dislocations in markets totally unrelated to the original source of the dislocation, and financial distress at intermediaries. We are seeing both today.

It is particularly rich to see the outrage at Robinhood and other intermediaries expressed today by those who were ardent advocates of clearing as the key to restoring and preserving financial stability in the aftermath of the Financial Crisis. Er, I hate to say I told you so, but I told you so. It’s baked into the way clearing works, and in particular the way that clearing works in stressed market conditions. It doesn’t eliminate those stresses, but transfers them elsewhere in the financial system. Surprise!

The sick irony is that clearing was advocated as a means to tame big financial institutions, the banks in particular, and reduce the risks that they can impose on the financial system. So yes, in a very real sense in the GME drama we are seeing the system operate to protect Big Finance–but it’s doing so in exactly the way many of those screaming loudest today demanded 10 years ago. Exactly.

Another illustration of one of my adages to live by: be very careful what you ask for.

Margins are almost certainly behind Robinhood’s liquidating some customer accounts. If those accounts become undermargined, Robinhood (and indeed any broker) has the right to liquidate positions. It’s not even in the fine print. It’s on the website:

If you get a margin call, you need to bring your portfolio value (minus any cryptocurrency positions) back up to your minimum margin maintenance requirement, or you risk Robinhood having to liquidate your position(s) to bring your portfolio value (minus any cryptocurrency positions) back above your margin maintenance requirement.

Another Upside Down World aspect of the outrage we are seeing is the stirring defenses of speculation (some kinds of speculation by some people, anyways) by those in politics and on opinion pages who usually decry speculation as a great evil. Those who once bewailed bubbles now cheer for them. It’s also interesting to see the demonization of short sellers–whom those with average memories will recall were lionized (e.g., “The Big Short”) for blowing the whistle on the housing boom and the bank-created and -marketed derivative products it spawned.

There are a lot of economic issues to sort through in the midst of the GME frenzy. There will be in the aftermath. Unfortunately, and perhaps not surprisingly given the times, virtually everything in the debate has been framed in political terms. Politics is all about distributive effects–helping my friends and hurting my enemies. It’s hard, but as an economist I try to focus on the efficiency effects first, and lay out the distributive consequences of various actions that improve efficiency.

What are the costs and benefits of short selling? Should the legal and regulatory system take a totally hands-off approach even when prices are manifestly distorted? What are the costs and benefits of various responses to such manifest price distortions? What are the potential unintended consequences of various policy responses (clearing being a great example)? These are hard questions to answer, and answering them is even harder in the midst of a white-hot us vs. them political debate. And I can say with metaphysical certainty that 99 percent of the opinions I have seen expressed about these issues in recent days are steeped in ignorance and fueled by emotion.

There are definitely major problems–efficiency problems–with Big Finance and the regulation thereof. Ironically, many of these efficiency problems are the result of previous attempts to “solve” perceived problems. But that does not imply that every action taken to épater les banquiers (or frapper les financiers) will result in efficiency gains, or even benefit those (often with justification) aggrieved at the bankers. I thus fear that the policy response to GameStop will make things worse, not better.

It’s not as if this is new territory. I am reminded of 19th century farmers’ discontent with banks, railroads, and futures trading. There was a lot of merit in some of these criticisms, but all too often the proposed policies were directed at chimerical wrongs, and missed altogether the real problems. The post-1929 Crash/Great Depression regulatory surge was similarly flawed.

And alas, I think that we are doomed to repeat this learning the wrong lessons in the aftermath of GameStop and the attendant plumbing problems. Virtually everything I see in the public debate today reinforces that conviction.


January 28, 2021

Let the GameStop Games Begin!

Filed under: Economics,Exchanges,History,Politics,Regulation — cpirrong @ 9:33 am

Short sellers have been hate objects since the earliest days of the U.S. stock market–witness the checkered lives of the likes of Daniel Drew or Jacob Little. It is therefore no surprise that the travails of their latter day descendants–hedge funds like Melvin Capital–that have resulted from the huge runups in the prices of stocks like GameStop ($GME) have been the source of considerable schadenfreude. I would suggest, however, that this will end in tears not just for the hedgies, but for those who contributed to their massive losses.

Long story short (no pun intended). Small investors pile into a stock (GME, and some others like Blackberry), driving up its price. Hedge funds think the stock is overpriced, so they go short. A group of small investors sees this as an opportunity to punish the short sellers (there is a lot of mutual disdain/hate here), so via the reddit group WallStreetBets they coordinate to buy more, driving up the price further. This imposes big losses on the shorts, who buy to cover, driving up the price further, imposing more losses on the remaining shorts, driving them to cover, etc., etc.
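That feedback loop can be caricatured in a few lines of code. This is a toy model–the price-impact and capitulation parameters are made up for illustration, not a claim about actual market microstructure:

```python
# Toy short-squeeze cascade. Price impact and pain thresholds are
# made-up parameters, chosen only to illustrate the feedback loop.

def squeeze(price, shorts, impact=0.10, pain=0.25):
    """shorts: list of (entry_price, blocks) open short positions.

    A short capitulates once the price is `pain` (25%) above its entry;
    each block it buys to cover lifts the price by `impact` (10%),
    which can trigger the next tranche of covering. Returns the price path.
    """
    path = [price]
    open_shorts = list(shorts)
    while True:
        covering = [s for s in open_shorts if path[-1] > s[0] * (1 + pain)]
        if not covering:
            return path
        open_shorts = [s for s in open_shorts if s not in covering]
        blocks = sum(b for _, b in covering)
        path.append(path[-1] * (1 + impact) ** blocks)

# Retail buying lifts the price to 35; shorts entered at 20, 30, and 40
# then cover in successive, self-reinforcing waves.
print(squeeze(35.0, [(20.0, 2), (30.0, 2), (40.0, 2)]))
```

Each tranche of covering is itself the buying that forces out the next tranche–no new fundamental information required.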

It brings to mind an old doggerel poem from the Chicago Board of Trade in the 19th century:

He who buys what isn’t his’n, Must buy it back or go to prison.

In the case of GameStop, the price action went hyperbolic:

That chart ends at yesterday’s close. Things have been even crazier overnight, with the price hitting $500/share. There have been gyrations caused by the shutdown of the chatrooms and some retail platforms stopping trading in this and other heavily shorted stocks. But the fundamental dynamic in play now–shorts slitting their own throats in panicked buying to cover–means that attempts to constrain the long herd will not have a lasting impact.

The short interest that had to (and has to) be covered is huge–short interest in GME was 140 percent of outstanding shares–and a larger share of the float. (How can there be more shorts than shares? The same share can be borrowed and lent multiple times!) The effects of the short covering are seen not only in the price, but in the stratospheric cost of borrowing shares. Earlier this week it was about 30 percent–juice loan territory. Now it is at 100 percent.
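The re-lending arithmetic behind a 140 percent short interest is simple (toy numbers):

```python
# Toy arithmetic for short interest above 100%: the same shares can be
# lent, sold short, bought by a new holder, and lent out again.

shares_outstanding = 100

# Round 1: holder A lends 100 shares; the short sells them to B.
# Round 2: B lends 40 of those same shares; a second short sells to C.
shares_sold_short = [100, 40]

short_interest = sum(shares_sold_short)
print(short_interest / shares_outstanding)  # 1.4 -> 140% of outstanding
```

Every round of re-lending adds to short interest without adding a single new share, which is also why the borrow fee explodes when shorts all try to exit at once.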

In many respects, this is reminiscent of some of the more storied episodes in Wall Street history, or more recently the 2008 VW corner which punished shorts severely. But there is a major difference. In some of the earlier episodes (including major corners of shorts in railroad stocks in the 19th century, or battles between shorts and stock pools in the 1920s, or the VW case), there was a single dominant long squeezing the overextended shorts. Here, it seems that the driving force is a relatively large group of small longs, acting with a common purpose.

How will it end? Well, the stock is obviously overvalued, and driven by “technical factors” (as is sometimes said euphemistically). It will crash to earth. When? Well, when the shorts get out. Who will lose? Well, the shorts are likely a big portion of the purchasers at these nosebleed levels, so they will be the biggest losers. But there will be some latecomers and trend followers who will have followed the Pied Piper of rising price, and will lose in the inevitable crash.

Should we really care? There is some possibility that the disruption in GME and other heavily shorted stocks could have knock-on effects. Hedge funds suffering large losses may have to dump other positions, causing those prices to decline. (The events surrounding the Northern Pacific corner, for example, sparked the Panic of 1901.)

One fascinating aspect of this is how it demonstrates the deep populist discontent that is abroad in the land. The hedge fund laments have been met with a barrage of scorn and ridicule, with a major theme being “you a$$h0les got bailed out in 2008 while the little guy got hammered–how you likin’ it now?” Completely understandable. Revenge of the nerds, as it were.

But, alas, I do not think the visceral satisfaction will last. Things like this inevitably result in litigation. The WallStreetBets lot are in for major lawsuits filed by the losing hedge funds, and perhaps others (e.g., investors who had sold call options).

Following the trend and herd trading is not manipulation–as long as the herd doesn’t explicitly coordinate with the intent to move the price to uneconomic levels. However, many on WallStreetBets expressed an intent to drive up the price in order to impose losses on their bêtes noires, and apparently coordinated their buying activity to achieve this result. Intent and cooperation make the manipulation. Note that the explicit communication and coordination could also transform this into a Section 1 Sherman Act claim–with the attendant treble damages.

Now the hedge funds will never collect even a fraction of their losses. But for them, the process will be the punishment inflicted on their foes. Pour encourager les autres.

The SEC is not committing to any action right now. It merely says it is “monitoring” the situation. The DOJ has also been silent.

However, they will be under tremendous pressure to act. Ultimately, the decision will be political–precisely because of the political nature of the populist resentment. The hedge funds and Wall Street generally will be howling for the government to file cases. But if the government does so, there will be widespread popular outrage that the government is taking the side of the Wall Street elite. Again.

This will be the first thing on Gary Gensler’s plate at the SEC. He is in a no-win situation. (Breaks me all up.)

In sum, the events of the past days have been fascinating from both an economic and a political perspective. They represent a back-to-the-future moment of colossal battles between longs and shorts, but with a major twist: whereas the historical battles tended to be between colossi, this one pits an army of Davids against a few colossal hedge funds. This in turn gives rise to a political narrative, which again has historic echoes–the little guy vs. Financial Capital. It’s like the 19th century, all over again.

The battle will play out for some time. For a few days or weeks in the markets, and in the courts for years after that.

