Streetwise Professor

November 30, 2009

A Deficit of Trust

Filed under: Military,Politics,Russia — The Professor @ 9:36 pm

The derailment of the Nevsky Express is a terrible event, especially if it is proven that it was an act of terrorism.  Whatever the cause, it was made even more terrible by the inept response and lack of preparation for such a disaster.  My condolences to the victims and their families.

The government has indicated that the most likely culprits are Chechen separatists, and this is a reasonable conjecture.  But, as this article from WindowOnEurasia about some commentary in the Russian press indicates, one also cannot reject out of hand more disturbing possibilities:

Tomorrow, December 1, is the 75th anniversary of the murder of Sergey Kirov, an action that Russian commentators continue to refer to as “the Stalinist version of [Hitler’s] Reichstag fire” because it opened the way to the purges and the great terror of the following years.

But what is even more disturbing now three-quarters of a century later is that, as one Moscow observer put it today, in Russia “the Reichstags burn and burn” because neither in the case of Kirov nor in that of so many other tragedies in that country has there been a full and honest reporting by the government or by authoritative people about what happened.

And because of the lack of such an honest accounting of events, Aleksandr Ryklin writes in today’s “Yezhednevny zhurnal,” thoughtful Russians would need to be presented with “convincing evidence” that the special services did not blow up the “Nevsky Express” this week in the service of the powers that be (www.ej.ru/?a=note&id=9673).

“For me personally,” Ryklin says, “the most terrible result of the tragedy with the ‘Nevsky Express’ (after the death of people, of course) is the absence of any hope in the foreseeable future to find out the TRUTH about what happened. Because I will never believe THEM. And not one sober and thoughtful person in Russia will ever believe them.”

As he continues, “hundreds of times we have caught THEM in a monstrous lie” – about Nord Ost, about Beslan, about the Kursk, to name but the first three that come to mind. “THEY are cynical, unprincipled and pitiless,” and they assume that Russians will swallow “any version” of events they choose to dish out.

But tragically, given Russian history, “if we don’t get it from THEM, then what means do we have to find out the truth about what has taken place? Is there any political organization in Russia which would be bold enough to say: ‘We will create an independent experts commission which with time will present its conclusions to the court of public opinion’?”

“In 1999,” Ryklin recalls, [he] “could not imagine that the special services were involved in the blowing up of the apartment buildings in Moscow. And even the case of the hexogen in Ryazan [where a television crew filmed what appeared to be a bomb planted by the authorities] did not shake his conviction. I then rejected that version with anger and disgust.”

“But the years have passed, and my naiveté,” Ryklin acknowledges, “gradually has dissipated … so that now one would have to apply definite efforts in order to convince me that the Russian special services did not blow up a peaceful train. And show me evidence of their non-involvement … Convincing evidence.”

That thoughts of Ryazan come to mind when reading of terrorism in Russia reflects the weight of history, stretching back well beyond Kirov.  A state built on suspicion feeds suspicion of it.  A state that operates in secrecy in things large and small creates the atmosphere in which such suspicion thrives.  No state can be fully trusted, but a state that is separate and above its citizens, and which is not subject to any real accountability, is especially untrustworthy.  And as Ryklin writes, from the distant past to the recent actions in the age of Putin, the trust deficit in Russia is a yawning one.

I am not saying that security service involvement is the most likely explanation.  I am not saying that it is likely, period.  I am just saying that it is far more likely in Russia than just about anywhere else, and that it is impossible to dismiss this possibility out of hand.

Providing Incentives to Collect Data

Filed under: Climate Change,Uncategorized — The Professor @ 8:00 pm

Commenter Scott raises the issue of the incentive to collect data, particularly large data sets, if it is imperative to make said data public.  This is a very important issue that I addressed a bit in my first post on the CRU fiasco, but it’s worth a couple of additional thoughts.

Here are some ways to provide incentives:

  • Subsidies, in the form of grants or prizes.  This is already done to some degree.  Receipt of such funding to support data collection–i.e., the subsidization of a public good–should be contingent on making this data freely available.
  • The provision of complementary services for a fee, such as consulting on the use of the data, or creation of customized data sets.  This is the source of revenues for many providers of open source software.
  • Licensing or sale of the data for a reasonable fee.  This has been quite effective for many databases widely used by finance and economics academics (and industry), notably the CRSP stock and bond data (but others as well).
  • Conditioning access to data on formal recognition in all working papers and published papers, perhaps including the creation of a “data authorship” category or something of the sort.

In brief, a variety of mechanisms can provide incentives to create information/data.  A variety of mechanisms are utilized in markets for other information goods.  Those just listed are some of the best known.

Intellectual property rights (e.g., granting rights of exclusive use, trade secrets) are other means by which creators can capture a stream of benefits, thereby giving them an incentive to produce information goods.  These may be the efficient arrangement in some settings, but I am dubious that that is the case in the sciences and social sciences.  Science depends on replication, which is incompatible with “I’d show you but I’d have to kill you” secrecy.

I also had an additional thought regarding how to evaluate scholarship for hiring, promotion, tenure, and salary decisions in a research university in an open source, non-journal dominated system.  I mentioned the idea of using citations as a main metric, and evaluating the “quality” of citations based on characteristics of the works in which a citation occurs–such as the number of times the citing author’s work is itself cited.  This is essentially a links-based metric.  Major search engines, notably Google, utilize such metrics in ranking web sites to determine display order.  The concepts underlying search engine ranking algorithms could perhaps be adapted to rank scholarly impact as well.

The creation of a data set that is utilized by other scholars could be an input to the ranking algorithm, providing another incentive to invest in the creation of such data sets.
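
To make the search-engine analogy concrete, here is a minimal sketch of a PageRank-style iteration run over a toy citation graph. The graph, the damping factor, and the function name are all illustrative assumptions of mine, not a description of Google’s algorithm or of any existing scholarly ranking system.

```python
# Minimal sketch: a PageRank-style ranking computed over a toy citation graph.
# The graph, the damping factor, and the tolerance are illustrative assumptions,
# not a description of any production ranking system.

def citation_rank(cites, damping=0.85, tol=1e-9, max_iter=200):
    """cites maps each paper to the list of papers it cites."""
    papers = set(cites) | {p for refs in cites.values() for p in refs}
    graph = {p: list(cites.get(p, [])) for p in papers}
    n = len(papers)
    rank = {p: 1.0 / n for p in papers}
    for _ in range(max_iter):
        new_rank = {p: (1.0 - damping) / n for p in papers}
        for citing, refs in graph.items():
            targets = refs if refs else list(papers)   # papers citing nothing spread rank evenly
            share = damping * rank[citing] / len(targets)
            for cited in targets:
                new_rank[cited] += share
        if sum(abs(new_rank[p] - rank[p]) for p in papers) < tol:
            return new_rank
        rank = new_rank
    return rank

# Toy graph: paper D (think of it as a widely used data set) is cited by all the
# others, so it ends up with the highest score.
toy_graph = {"A": ["D"], "B": ["A", "D"], "C": ["B", "D"], "D": []}
for paper, score in sorted(citation_rank(toy_graph).items(), key=lambda kv: -kv[1]):
    print(paper, round(score, 3))
```

In such a scheme a heavily used data set, like a heavily cited paper, accumulates rank from everything built on top of it, which is exactly the incentive effect discussed above.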

Presumably the CRU employees associated with the creation of a climate data set would have achieved a stratospheric impact factor/ranking under such a system because myriad other scholars would have used the data.  But of course, this would have come at a (private) cost: they couldn’t have controlled the conclusions of the work done with their data (which the emails suggest was an important consideration to them).  But that private cost is swamped by the benefit of permitting open access to other researchers, so the proper response is: tough luck to you.

November 29, 2009

Let (Peer Reviewed) Publishing Perish?

Filed under: Climate Change,Economics — The Professor @ 10:59 pm

The whole CRU disaster has got me thinking about whether, given modern information technology, peer review is an anachronism that impedes, rather than advances, scientific knowledge (including social scientific knowledge).  It is quite entertaining–in a perverse, watching a car crash kind of way–to observe the defenders of the climate change consensus repeat “peer review” like a magic spell that will somehow ward off evil (skeptic) spirits.  But if anything, the whole fiasco calls into question the reliability of peer review.

Indeed, the whole display brings to mind the comments of my former colleague, the late Roger Kormendi.  Somebody once mentioned to Roger that he should pay particular attention to a certain piece of research because it had been peer reviewed.  To which Roger replied: “Oh, that means it’s completely arbitrary.”

But as with any institution, it’s not sufficient to point out the flaws in one to justify its replacement with another.  It is necessary to make a comparative analysis with realistic alternatives.  What would be the alternatives to peer review?

In the modern era, the ability to disseminate papers nearly universally and instantaneously, and to make people aware of their existence through things like SSRN makes it possible for myriad scholars to access and evaluate papers, rather than one or two reviewers.  Moreover, the same information technology makes it feasible to provide access to data and code to facilitate replication, examinations of robustness, and the testing of alternative specifications and models on a given set of data.  In this way, it is possible to harness the knowledge of myriad, dispersed individuals with specialized expertise, rather than one or two or even three individuals.  Moreover, the open entry model mitigates the incentive problems associated with peer review, where individuals with weak and often perverse incentives exert incredible influence over what gets published and what doesn’t.

Furthermore, wiki-type mechanisms can be employed to collect and aggregate commentary and critiques about particular papers or groups of papers.  This is another way of harnessing the dispersed knowledge of crowds.  In particular, it would facilitate the exploitation of comparative advantage, allowing, for instance, statisticians to comment on statistical methodologies, computational experts to critique numerical techniques, and so on.

Just think of how things might have evolved differently if climate data and paleoclimate reconstruction had been done under this model, rather than via the peer review mechanism.  The “hockey stick” reconstructions would have been subjected to the critique of expert statisticians who would have uncovered Mann et al’s misuse of principal component methods.  Making raw climate data available would have made it possible to evaluate the sensitivity of results to data selection and methods used to “clean” (quotes definitely needed) data.*  In sum, we wouldn’t be where we are now, not knowing with any confidence just what the climate record tells us, or even what the climate record actually is.
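
For readers who want to see the mechanics, here is a toy sketch of the decentering issue that McIntyre and McKitrick identified: principal components computed after subtracting the mean of only a late “calibration” window preferentially load on series whose late values happen to wander away from their long-run mean, manufacturing a hockey-stick-shaped leading PC even from pure red noise. Everything below (the number of series, the AR(1) coefficient, the window) is an invented illustration, not the MBH98 code itself.

```python
import numpy as np

def make_red_noise(n_series=70, length=600, phi=0.9, seed=0):
    """Synthetic AR(1) 'proxies' containing no climate signal at all."""
    rng = np.random.default_rng(seed)
    x = np.zeros((n_series, length))
    for i in range(n_series):
        shocks = rng.standard_normal(length)
        for t in range(1, length):
            x[i, t] = phi * x[i, t - 1] + shocks[t]
    return x

def leading_pc(series, center_window=None):
    """First principal component of the proxy matrix.  center_window=(a, b)
    mimics 'short centering': only the mean of a late sub-period is removed."""
    if center_window is None:
        centered = series - series.mean(axis=1, keepdims=True)
    else:
        a, b = center_window
        centered = series - series[:, a:b].mean(axis=1, keepdims=True)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[0]

def hockey_stick_index(pc, split=520):
    """Gap between the late-window mean and the earlier mean, in units of overall std."""
    return abs(pc[split:].mean() - pc[:split].mean()) / pc.std()

proxies = make_red_noise()
pc_full = leading_pc(proxies)                              # conventional centering
pc_short = leading_pc(proxies, center_window=(520, 600))   # short centering

print("full centering :", round(hockey_stick_index(pc_full), 2))
print("short centering:", round(hockey_stick_index(pc_short), 2))
# With short centering the index tends to be much larger: the leading PC of
# signal-free noise acquires a pronounced excursion in the calibration window.
```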

The open kimono approach with code and data would also provide an extremely strong deterrent to fraud.  Moreover, reducing reliance on journals would reduce resources devoted to rent seeking activities (e.g., influencing journal editors, gaming submissions, torpedoing competitors, spending time devising submission strategies).  It would also enhance competition, and reduce the rents that incumbent “gatekeepers” can extract.  Reduced reliance on journals would also mitigate the file drawer effect because journals inevitably condition acceptance on measures of statistical significance.  This leads scholars to abandon research that does not generate such results and encourages specification searches and other “econometric cons,” meaning that published results are likely to present a biased picture of the true state of the evidence.  A more open model would likely reduce these (statistical) size distorting incentives.
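
A toy simulation makes the file drawer point concrete. Suppose the true effect is exactly zero, and only positive, “significant” results get written up; the published record then looks like solid evidence for an effect that does not exist. The sample sizes, cutoff, and counts below are invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n_studies, n_obs = 1000, 30
crit = 2.045                      # two-sided 5% critical value for a t with 29 df

published = []
for _ in range(n_studies):
    sample = rng.normal(0.0, 1.0, n_obs)    # the true effect is exactly zero
    t_stat = sample.mean() / (sample.std(ddof=1) / np.sqrt(n_obs))
    if t_stat > crit:             # only positive, "significant" findings get written up
        published.append(sample.mean())

print(f"{len(published)} of {n_studies} studies 'published'")
print(f"mean published effect: {np.mean(published):.2f} (true effect: 0.0)")
```

Roughly two or three percent of the studies clear the hurdle by luck, and every one of them reports a sizable positive effect; a reader of the journals alone would never know the other 97 percent existed.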

One challenge posed by this alternative model relates to the fact that the hiring, tenure, and promotion mechanisms at modern research universities are adapted to the journal publishing mechanism.  Since citations are probably a better metric of quality than whether or not something is published in Journal A or Journal B, or published at all, perhaps a citation-based mechanism could suffice.  (Though if journals faded in importance, this would raise the questions: Cited in what? and How do you compare the quality of citations?  Perhaps some metric where the number of citations of the paper in which a particular work is cited could be used to determine citation quality.)  Also, since participation in wikis, etc., contributes to knowledge, it would be desirable to provide incentives for that kind of activity–which would inevitably require some (inevitably noisy) measurement technology.

These hiring and P&T issues are not immaterial, but to me the first-order issues are reducing the costs of producing knowledge, discovering error, and deterring fraud.  The open source, wisdom of swarms, collaborative, Wiki-based model seems to offer many advantages over the received, hierarchical, journal-based model.  Open access, open source, “swarm,” and wiki models are threatening other information dissemination mechanisms–notably journalism.  So why not journals too?  Why not have reviews by tens or hundreds or thousands of peers who bring to bear comparative advantages (e.g., statisticians critiquing work done by non-statisticians but employing statistical techniques), and who are self-selected for their interest, rather than reviews by less than a handful of inevitably distracted, sometimes conscripted, and often conflicted peers?

The whole journal-based, peer review process is arguably well adapted to a particular technology for producing and disseminating information.  Given the radical changes in information technology, it is at least worth considering whether this received mechanism is still optimal.  I, for one, have serious doubts.

* As a (relevant) aside, one of the most outrageous admissions to come from the Hadley CRU fiasco is that (a) original source data was allegedly destroyed some time ago, and (b) the University of East Anglia/Hadley have the audacity to claim that only “value added,” processed data was retained.

The arrogance of this claim is beyond belief.  We are supposed to accept that CRU’s methods maximized “value added” for all possible uses of the data?  That every one of the myriad choices that CRU made when processing, filtering, and adjusting the data was the right one for every possible use of the data, and beyond question, let alone reproach?  We should just take this on faith?

How can we test this remarkable assertion?

Oh, we can’t–because they destroyed what would be necessary to do so.

Just think of the hundreds of possible ways of transforming the raw data to deal with problems such as missing observations, or aggregating individual station data to characterize climate over wide areas.  Hadley made a set of choices, and due to their destruction of the original data, we have to live with that, perhaps forever.
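
A toy sketch with two invented stations shows how two perfectly defensible processing choices yield different regional series from the same raw numbers. This is an illustration of the general problem, not a reconstruction of anything CRU actually did; every number below is made up.

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1950, 2010)

# Two hypothetical stations warming at the same underlying rate:
# a warm valley site and a cold mountain site.
valley = 15.0 + 0.01 * (years - 1950) + rng.normal(0, 0.3, years.size)
mountain = 5.0 + 0.01 * (years - 1950) + rng.normal(0, 0.3, years.size)

# Suppose the mountain station only reports after 1980 (earlier data missing).
mountain_reported = np.where(years >= 1980, mountain, np.nan)

# Choice 1: average whatever stations report each year ("absolute" averaging).
absolute_avg = np.nanmean(np.vstack([valley, mountain_reported]), axis=0)

# Choice 2: average anomalies relative to each station's own 1980-2009 mean.
base = years >= 1980
anoms = np.vstack([valley - valley[base].mean(),
                   mountain_reported - np.nanmean(mountain_reported[base])])
anomaly_avg = np.nanmean(anoms, axis=0)

# The absolute average lurches downward in 1980 when the cold station enters,
# even though nothing about the climate changed; the anomaly average does not.
print("absolute 1979->1980 step:", round(absolute_avg[30] - absolute_avg[29], 2))
print("anomaly  1979->1980 step:", round(anomaly_avg[30] - anomaly_avg[29], 2))
```

Multiply that single choice by hundreds of stations, infilling rules, and adjustment steps, and the importance of being able to rerun the whole chain from the raw data becomes obvious.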

Who the hell died and made them the last word?

With open source data and open source code, we would not have to live with the systemic risk inherent in relying on a single set of choices.  Maybe the choices were right–but if they are not, our ability to adapt is severely constrained.  And maybe they were right at a particular time, but we are now saddled with choices made in light of the techniques available when they were made; it is impossible to bring new techniques to bear on the old data.

Value added my foot.  Who the hell is Hadley to make that assertion?  Just a supercilious, self-serving effort at CYA.

As #1 SWP daughter said in a discussion of these issues: “PAY NO ATTENTION TO THAT LITTLE MAN BEHIND THE CURTAIN!”  That analogy is spot on.

Code Breaking–or, Broken Code

Filed under: Climate Change — The Professor @ 9:41 am

In a couple of excellent posts, Shannon Love at Chicago Boyz notes that one of the most disturbing revelations resulting from the ripping open of the Hadley CRU’s kimono is the shockingly bad, ad hoc, sloppy, and (fill in own pejoratives here) nature of the computer code underlying the quantitative work that is such an important prop for the entire climate change policy edifice.  Love points out that software is not peer reviewed, and that scientists are for the most part self-taught programmers who do not follow the strict protocols associated with commercial software development.  For an endeavor like that undertaken at Hadley, incremental changes are made on the fly with little documentation, and soon the code resembles a rat’s nest, or an overgrown, weed-choked garden.

The code of the Hadley folks and their confreres (or should it be co-conspirators?) is mainly related to data preparation and analysis.  Many of the tasks it performs are relatively pedestrian in concept; the difficulties arise from dealing with the messiness of the underlying data (and, arguably, the perceived necessity of fitting the data to the theory).

But it does raise questions in my mind about the other major prop of the climate change policy edifice: dynamic climate change models.  These are huge and complex.  I know from much personal experience on simpler but related problems in finance that the kinds of equations they are intended to solve are extremely touchy.  Solution techniques can be very brittle.  Errors can be subtle and hard to catch.
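
A toy example conveys the flavor of that brittleness. The stiff equation below, y' = -1000(y - cos t), is about as simple as such problems get, yet an explicit Euler scheme with a slightly-too-coarse step blows up completely while an implicit scheme tracks the solution; in a large climate model, an analogous error would be far harder to spot. All of the numbers here are arbitrary illustrative choices of mine.

```python
# Toy sketch of numerical brittleness: the same stiff equation,
# y' = -1000 (y - cos t), solved with explicit and implicit Euler.
import math

def explicit_euler(h, steps, k=1000.0):
    y, t = 1.0, 0.0
    for _ in range(steps):
        y += h * (-k * (y - math.cos(t)))   # naive forward step
        t += h
    return y

def implicit_euler(h, steps, k=1000.0):
    y, t = 1.0, 0.0
    for _ in range(steps):
        t += h
        y = (y + h * k * math.cos(t)) / (1.0 + h * k)   # solve the implicit step exactly
    return y

h = 0.005                 # just too coarse: explicit stability needs h < 2/k = 0.002
steps = int(1.0 / h)
print("explicit Euler:", explicit_euler(h, steps))   # blows up to an absurd number
print("implicit Euler:", implicit_euler(h, steps))   # tracks cos(1), roughly 0.54
```

The disquieting part is that nothing in the explicit code is “wrong” in an obvious way; the failure comes from an interaction between the equation and the step size, which is exactly the kind of subtle error that undocumented, untested code invites.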

It is my understanding that this code, like the Hadley programs, is written by scientists.  So, my questions: what is the quality of the climate model code?  Is it documented properly?  Has it been tested?  Has it been audited?  By whom?  What confidence can we have in its reliability? (Reliability in the relatively simple sense that it is bug-free, and properly performs the calculations implied by the underlying theories it is intended to implement.  The reliability and completeness of the underlying theories–relying, as they do, on “fudge factor” parameterizations and incomplete characterizations of potentially first order phenomena like clouds–are other issues altogether.)

November 28, 2009

An argument is a connected series of statements intended to establish a proposition.

Filed under: Commodities,Derivatives,Economics,Exchanges,Financial crisis,Politics — The Professor @ 6:20 pm

So sayeth Michael Palin, in the priceless Monty Python Argument Clinic Sketch.  In his recent testimony before the Senate Ag Committee, CFTC Chairman Gary Gensler makes an unconnected series of assertions (many categorically untrue) intended to establish a proposition.  Here’s the SWP annotated version:

First, standard OTC transactions should be required to be cleared by robustly regulated central counterparties. [That’s the proposition to be “proved.”] By guaranteeing the performance of contracts submitted for clearing, the clearing process significantly reduces systemic risks. [Unsupported assertion.  Also not necessarily true.] Through the discipline of a daily mark-to-market process, the settling of gains and losses and the imposition of independently calculated margin requirements, regulated clearinghouses ensure that the failure of one party to OTC derivatives contracts will not result in losses to its counterparties. [Categorically untrue for multiple reasons.  OTC derivatives are typically marked-to-market; indeed, the AIG debacle that Chairman Gensler trots out repeatedly to scare people into supporting him came to a head due to mark-to-market.  Moreover, mark-to-market provides no guarantee that third parties, including clearing member customers and other members of the clearinghouse, will not incur losses.] Right now, however, trades mostly remain on the books of large complex financial institutions.  These institutions engage in many other businesses, such as lending, underwriting, asset management, securities, proprietary trading and deposit-taking. [So?  What are the logical consequences of this?  Mightn’t there be economies of scope in these activities?  If there aren’t, why has the market evolved in this way?  Could it be the case that, by offering a broad array of financial services to its customers, a financial institution generates information that lowers the costs of intermediation?] Clearinghouses, on the other hand, are solely in the business of clearing trades. [So? again.  This is a completely, utterly unsupported assertion related to the economies of scope, or lack thereof, in financial intermediation.  Again begs the question of why, if this is the efficient arrangement, it didn’t prevail in the competition of the marketplace.] To reduce systemic risk, it is critical that we move trades off of the books of large financial institutions and into well-regulated clearinghouses. [Conclusion completely unsupported by previous statements.]

I believe that all clearable transactions should be required to be brought to a  clearinghouse, regardless of what type of entity is on either side of the trade.   This  would remove the greatest amount of interconnectedness from the large financial  institutions. [Again, no support for this assertion.  Moreover, no explanation as to how a clearinghouse that by its very nature connects large financial intermediaries and their customers “remove[s] the greatest amount of interconnectedness”–WTF that means.]
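
To see why daily mark-to-market is no guarantee against losses to others, consider a toy variation-margin example (every number below is invented): a price gap larger than the posted margin leaves a shortfall that the clearinghouse’s default fund, i.e., the surviving members, must absorb.

```python
# Toy sketch of daily variation margin on a cleared short position, and why
# mark-to-market alone does not eliminate losses to third parties.
initial_margin = 5.0                  # per contract, collected up front
position = 100                        # short 100 contracts
prices = [50.0, 50.8, 51.2, 58.0]     # the last move is a gap on the default day

collateral = initial_margin * position
for prev, curr in zip(prices, prices[1:]):
    variation = (curr - prev) * position   # loss to the short as prices rise
    collateral -= variation                # daily settlement of gains and losses
    print(f"price {prev} -> {curr}: collateral now {collateral:.0f}")

if collateral < 0:
    print(f"member defaults with a shortfall of {-collateral:.0f}; "
          "that loss falls on the default fund, i.e., other clearing members")
```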

In Energy MetroDesk John Sodergreen reports that a lobbyist says that Gensler is a rock star in Congress because “no member has a clue what they’re actually proposing in the financial reform area–and Gensler does.”

I judge the first part of this statement (i.e., no Congressional clue) to be true, with near metaphysical certainty.  As the foregoing deconstruction of Gensler’s testimony shows, the post-dash statement is not.  Indeed, how could people who don’t have a clue know whether what somebody is telling them is BS or not?  The endorsement of the self-admittedly ignorant (see my earlier post on the NPR guys who interviewed 13 House Financial Services Committee members) somehow proves Gensler’s wisdom?  Huh?

[I give John S. kudos.  He managed the very difficult task of saying nice things about me and Gensler in the same edition of the same publication. 🙂  I am billed as “one of the CFTC’s most ardent critics.”  One of?  I guess I have work to do.]

It is a sad day when the main question I have is whether the best metaphor for the current financial regulation reform is the aforementioned Argument Clinic Sketch, or the “Burn the Witch” scene from Holy Grail.  That’s a very hard call.

Wallison: Timmy!’s Nose is Growing

Filed under: Derivatives,Economics,Exchanges,Financial crisis,Politics — The Professor @ 5:49 pm

In an op-ed in today’s WSJ, Peter Wallison notes a fundamental inconsistency in the government’s defense of the AIG bailout:

Since last September, the government’s case for bailing out AIG has rested on the notion that the company was too big to fail. If AIG hadn’t been rescued, the argument goes, its credit default swap (CDS) obligations would have caused huge losses to its counterparties—and thus provoked a financial collapse.

Last week’s news that this was not in fact the motive for AIG’s rescue has implications that go well beyond the Obama administration’s efforts to regulate CDSs and other derivatives. It’s one more example that the administration may be using the financial crisis as a pretext to extend Washington’s control of the financial sector.

The truth about the credit default swaps came out last week in a report by TARP Special Inspector General Neil Barofsky. It says that Treasury Secretary Tim Geithner, then president of the New York Federal Reserve Bank, did not believe that the financial condition of AIG’s credit default swap counterparties was “a relevant factor” in the decision to bail out the company. This contradicts the conventional assumption, never denied by the Federal Reserve or the Treasury, that AIG’s failure would have had a devastating effect.

Wallison is exactly right: Geithner cannot run around Washington, using arguendo ad AIG to justify (a) the bailouts, and (b) the need for mandatory clearing, more draconian capital requirements, and restrictions on derivatives, while at the same time saying that the financial condition of counterparties was an irrelevancy in the decision to assume AIG’s obligations.  These claims are impossible to reconcile logically.  If one is true, the other is not.  Period.

Wallison then traces out the implications of this inconsistency:

The broader question is whether the entire regulatory regime proposed by the administration, and now being pushed through Congress by Rep. Barney Frank and Sen. Chris Dodd, is based on a faulty premise. The administration has consistently used the term “large, complex and interconnected” to describe the nonbank financial institutions it wants to regulate. The prospect that the failure of one of these firms might pose a systemic risk is the foundation of the administration’s comprehensive regulatory regime for the financial industry.

Up to now, very few pundits or reporters have questioned this logic. They have apparently been satisfied with the explanation that the “interconnectedness” created by those mysterious credit default swaps was the culprit.

But the New York Fed is the regulatory body most familiar with the CDS market. If that agency did not believe AIG’s failure would have actually brought down its counterparties—and ultimately the financial system itself—it raises serious questions about the administration’s credibility, and about the need for its regulatory proposals. If “interconnections” among financial institutions are indeed the source of the financial crisis, the administration should be far more forthcoming than it has been about exactly what these interconnections are, and how exactly a broad new system of regulation and resolution would eliminate or reduce them.

Right again.  If arguendo ad AIG is, as the old phrase goes, no longer operative, then what is the logical basis for this metastasizing regulatory framework?

And make no mistake about it, AIG has been the alpha and the omega of the case for mandatory clearing, exchange trading of derivatives, and more onerous capital requirements.  It’s not just Geithner.  If anything, CFTC head Gensler has been even more of a repeat offender, subject to the three strikes rule multiple times, as evidenced by his recent testimony before the Senate Ag Committee.  In that testimony, Gensler refers to AIG as “Exhibit A” for the case against OTC derivatives.

So . . . if even “Exhibit A” AIG’s financial condition posed no threat to its counterparties (as would be necessary to make this “not a relevant factor”), what’s the need for a draconian restructuring of the financial markets?  And if the Fed and Treasury were truly concerned about the fallout from an AIG collapse, why insist to an official investigation that this was not the case?  Is this an attempt to maintain the fiction that Goldman and the other banks were totally safe and sound even during the maelstrom of the crisis, and didn’t need the money?  If not, give me a better explanation.

Wallison is quite polite when he calls Geithner’s conflicting stories about AIG “a lack  of candor about credit default swaps.”  The Treasury Secretary has lied about AIG with probability one.  The only question is which of his statements is a lie.  I would wager that the “not a factor” statement is the dishonest one.

Hopefully Wallison is right in the second half of this judgment:

The administration seems to be using the specter of another financial crisis to bring more and more of the economy under Washington’s control.

With the help of large Democratic majorities in Congress, this train has had considerable momentum. But perhaps—with the disclosure about credit default swaps and the AIG crisis—the wheels are finally coming off.

Wallison’s call for a thoroughgoing explanation of “how exactly a broad new system of regulation and resolution would eliminate or reduce” interconnections is spot on.  In particular, as I have been saying over and over, and over and over, and . . . just how does creating a central counterparty among major financial institutions eliminate interconnections?  Indeed, by concentrating failure in a single institution connected to all other major financial institutions, and which arguably operates at an information and incentive deficit relative to the dealers in an OTC market, mandated clearing arguably increases systemic risk.

To put it bluntly: we can’t have such an epochal change in the structure and regulation of such immense financial markets based on lies plus superficial and specious analysis.  But it is abundantly clear that that is exactly what is happening, as Wallison’s piece succinctly demonstrates.

November 27, 2009

Sharp Knives, Big Stakes

Filed under: Climate Change,Politics — The Professor @ 10:30 am

Henry Kissinger once quipped “in academia, knives are so sharp because the stakes are so small.”*  This insight is largely correct.  In the absence of large pecuniary rewards, academics tend to receive remuneration disproportionately in the form of prestige and reputation.  On the positive side, these can be achieved by hard work and brilliance.  On the negative side, by manipulation, conspiracy, and backstabbing.

In the modern academy, peer review plays a major role in the establishment of reputation.  On the positive side, it can serve as an important tool in identifying errors, poor reasoning, and weak exposition.  On the negative side, it can serve as a mechanism for enforcing consensus even around dubious concepts, punishing dissent, and protecting incumbent interests.  As is the case with many social mechanisms, it tends to be conservative in nature, and hostile to true innovation.  It tends to filter out the different, with mixed consequences.  Bad work is often different, but not all that is different is bad.  Moreover, since in its operation it tends to empower the avatars of prevailing views, it can be used and manipulated to punish or otherwise quash those that challenge said views.

The Hadley CRU emails provide a window onto the often shabby operation of this process.  The Wegman report characterized the historical climate reconstruction community centered on Michael Mann as a clique.  The members work with one another and review each others’ work.  As a result of the insularity of this community, and its hostility to outsiders, the published peer reviewed work in the area all tended to reinforce the views of its members.  There were many different published papers, but because they were written or reviewed by the same people, often using the same data, these papers were not independent, in the mathematical/statistical sense.  A mere count of the number of papers gave a misleading view of the actual number of independent data points.  In statistical parlance, they gave an exaggerated view of the “size” of the tests in favor of the AGW hypothesis.
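
A toy simulation illustrates why a count of “confirming” papers overstates the evidence when the papers share data and authors. Below, five hypothetical studies test a null (zero) effect; when they reuse most of the same data, the chance that all five “confirm” it by luck is orders of magnitude higher than if they were genuinely independent. All parameters are invented for illustration.

```python
# Toy sketch: "five papers agree" means much less when the papers share data.
import numpy as np

rng = np.random.default_rng(7)
n_trials, n_obs, n_papers = 5000, 50, 5
crit = 2.01                      # roughly the two-sided 5% t critical value, 49 df

def all_significant(shared_fraction):
    """Simulate n_papers studies of a zero effect; do all of them 'confirm' it?"""
    common = rng.normal(0.0, 1.0, n_obs)          # data reused across papers
    hits = 0
    for _ in range(n_papers):
        own = rng.normal(0.0, 1.0, n_obs)
        sample = shared_fraction * common + (1 - shared_fraction) * own
        t = sample.mean() / (sample.std(ddof=1) / np.sqrt(n_obs))
        hits += t > crit
    return hits == n_papers

for frac in (0.0, 0.9):
    rate = np.mean([all_significant(frac) for _ in range(n_trials)])
    print(f"shared data fraction {frac}: P(all {n_papers} papers 'confirm') = {rate:.4f}")
```

With independent data the joint false positive is vanishingly rare; with heavily overlapping data it happens at a rate not far below that of a single study, which is the sense in which a shelf of interlocking papers can overstate the strength of the evidence.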

Moreover, by exerting undue influence on the peer review process, the clique suppressed dissenting views and the publication of truly independent work.  It then added insult to injury by condescendingly dismissing contrary work as not being peer reviewed, or being published in the leading journals (over which they exerted their influence).  It was a scientific Catch-22.

The Hadley CRU emails show just how right the Wegman report was.  Indeed, they demonstrate that if anything, “clique” is an understatement.  “Cabal” or “mafia” is more appropriate.  The enforcers of consensus resorted to threats, slander, and crude insult to protect their precious reputations and squelch opponents.  They reveled in the death of one gadfly and expressed the desire to inflict violence on others who had the temerity to present papers that dared suggest the Emperors had no clothes.

In brief, the climate reconstruction mafia behaved exactly as Kissinger described, wielding their sharp knives.

The problem is that, unlike the run-of-the-mill academic disputes described by Kissinger, in this matter the stakes are anything but small.  Indeed, they are staggeringly large.  In such matters, typical academic failings are intolerable.  There needs to be a different mechanism for evaluating and appraising such policy-relevant research than reliance on a self-selected, self-reinforcing group of interested individuals.

This will present a great challenge.  When the political consequences of science are so great, it will be very difficult to keep politics out of science.  Where to begin?  The recommendations laid out in the Wegman report (emphasized in a comment by Mike Giberson) are a good place to start, but likely additional steps are required.

Follow up (noon, 11/27/09).  The normally sensible Megan McArdle states that “Sexing up a graph is at best a misdemeanor.”  I think she means “at worst,” but regardless, in context it is clear that she means to minimize the importance of the data and methodological abuses employed to create the hockey stick and related graphics.  I think this is incredibly wrongheaded.  There is NO excuse for lying with statistics in this way.  And, as the post points out, the stakes in this issue are immense.  The “sexed up” graphs have been used repeatedly in order to advance an agenda that will have enormous economic, and perhaps environmental, consequences.  (The economic costs are indisputable.  The environmental consequences are far less certain, because (a) as the debate shows, the scientific basis for AGW is on shakier ground now, and (b) even if AGW is correct, the effects of the measures currently contemplated may be very small indeed.)  Given these consequences, the expected cost of any distortion of the evidence is extremely large.  What’s more, the emails provide strong evidence that the sins go far beyond tarting up an inconsequential graphic.  These sins plausibly include, based on the code extracts and comments particularly, wholesale fraud.

* I have used this line from time to time in the past.  For instance, I used it in response to a lawyer’s line of questioning about what people have said about my academic qualifications during what was at the time the largest fraudulent conveyance claim trial in US bankruptcy law history.  The judge chuckled, and said something like “that’s true of judges’ councils too” and suggested that the lawyer try another line of questioning.  So, you see, I’m environmentally responsible: I recycle!

November 25, 2009

True Patriots

Filed under: Politics,Russia — The Professor @ 5:14 pm

My friend Sergei Guriev is an excellent economist and scholar.  He is also a very affable fellow–a true gentleman.  He is also a very brave man, as evidenced by his scathing “J’accuse” editorial (co-authored with  Aleh Tsyvinsky of Yale) condemning the Russian government for torture in the death of Sergei Magnitsky:

After President Dmitry Medvedev’s state-of-the-nation address, we were planning to write about whether it is possible to carry out modernization in Russia without political liberalization. But last week’s tragic death of lawyer Sergei Magnitsky in a pretrial detention facility pushed all other issues aside.

We know that readers have been given exhaustive information about this incident in recent days, but we cannot refrain from writing about it because it would be absolutely pointless to discuss any other aspect of modernization without first addressing Magnitsky’s death. What difference does it make if the stock market is up or down or what is happening with interest rates and exchange rates if no value is placed on human life? Does it make sense to speak about honoring contractual agreements if one side can take the other side’s lawyer hostage? Why discuss ownership rights when owners are denied the right to life?

. . . .

You can talk all you want about trying to halt Russia’s brain drain or how to convince Russian specialists working abroad to return home. But the fate suffered by Magnitsky and former Yukos vice president Vasily Aleksanyan, a Harvard Law School graduate, sends a clear signal to all professionals: Russia can be a very hostile, if not dangerous, place to work. Will current or future Russian students enrolled in top foreign university and graduate programs want to return to this country after graduating? And yet this is the talent base that Medvedev wants to tap to carry out his modernization program. The parallels with the 1930s are uncanny: The Soviet Union also invited foreign specialists to help modernize its industrial base, and many of them later became victims of Stalinist repression.

Magnitsky was a citizen of Russia. We don’t know whether he voted for Medvedev in 2008, but only a few days before Magnitsky died, the president said during his state-of-the-nation address: “In the 21st century, our country once again needs to undergo comprehensive modernization. This will be our first ever experience of modernization based on democratic values and institutions.”

The president probably wanted to highlight the difference between current plans for modernization and Josef Stalin’s ruthless industrialization program. But after learning about Magnitsky’s death, it is difficult to avoid the fact that today’s Russia evokes disturbing memories of 1937.

Quite stinging, and quite true.  And a devastating call out of Dmitry Medvedev.  This article sheds a very harsh light on the yawning void between Medvedev’s crypto-liberal words and the Chekist actions of the government that he putatively leads.

I have a question for the self-styled Russophiles on this site, who constantly moan about Russophobia–including mine: do you consider Sergei and Aleh Russophobes?  Traitors to narod and nation?  On what basis?  And if they are not Russophobes, why do you consider me one?  Accidents of birth?

Or, do you, like me, consider them true patriots who are pained and outraged at what is done on a daily basis to ordinary Russians in their name, for the alleged glory of the state?

I consider it an honor to know Sergei, and wish him Godspeed.

Retro Post: The Hockey Stick is Dead

Filed under: Climate Change,Politics — The Professor @ 11:30 am

Note: I wrote the following over three years ago, but never posted it.   It seems particularly apposite to do so in light of the disclosure of the now infamous CRU document drop (or heist).   The documents now available provide even more evidence of the McKitrick and McIntyre accusation of scientific misconduct, with individuals other than Mann being complicit in said misconduct.

Even absent any outright manipulation of results–which the documents (especially the comments in the code) also strongly suggest occurred–the failure to disclose data and methods is unpardonable.   This piece includes an analysis of intellectual property rights issues as they relate to science generally, and climate science in particular.

I will probably write more later (holiday schedule permitting) on an aspect of the Wegman report that I didn’t discuss in the original post: the climate science “clique” (Wegman’s word).   The emails suggest that clique is far too polite a word.   “Cabal” or “mafia” seems closer to the mark.   But, hopefully, more on that later.

Here’s what I wrote (in China, as a matter of fact) in August 06:

The NAS recently wrote a report which made it abundantly clear that the Mann “Hockey Stick” used to bolster the case for anthropogenic global warming was fundamentally flawed, but out of politeness–or cowardice–failed to pronounce the verdict in so many words. Indeed, the summary of the NAS report, and the subsequent press coverage, seemed determined to avoid pointing out that Emperor Mann had no clothes.

Hence, it was left to the Wegman Report to proclaim that Mann is indeed as naked as a jaybird. Wegman and his co-authors–very reputable statisticians all–make it abundantly clear that the statistical foundations of Mann’s work are extremely defective. Indeed, Wegman et al sustain all of the McIntyre-McKitrick criticisms of Mann’s work.

From this date hence, anyone who uses the Hockey Stick to bolster any argument that the current increase in global surface temperatures is unprecedented, and is attributable to anthropogenic influences, is displaying either willful ignorance or shocking intellectual dishonesty.

A couple of points.

First, I have read a good deal of the literature on global warming, and have always been quite unimpressed with much of the statistical work in this field. Wegman recommends that future work in this area include contributions from mainstream statisticians, which seems right.

Second, and perhaps more important, even the Wegman report (by design, it appears) tippy-toes around the issue of scientific conduct–or misconduct. At the very least, McIntyre and McKitrick raised serious questions about sloppiness by Mann and his co-authors, and indeed made a plausible affirmative case for scientific misconduct. Moreover, the consistent failure of Mann–and others–to disclose fully methods, data, and code is deplorable.

It is interesting to contrast the lack of transparency in climate research–at least the proxy record research that M-M focused on–with how scientific and statistical evidence is handled in litigation. I do a good deal of work as an expert witness. In such cases, if you make a mistake, it will be discovered. Every line of code, every piece of data, every method is discoverable. The opposing side’s experts will comb over every piece of information you disclose–and you must disclose everything. Opposing counsel will grill you for seven and one-half hours (in a federal case), asking you about every aspect of your methodology and results. You better check, double check, and check again. You can’t get away with claiming your code is proprietary (as Mann has done) or playing bait-and-switch with your data (as McIntyre and McKitrick claim that Mann also did). It’s the Full Monty.

Although many academics express disdain for expert witness work because it is supposedly tainted by the fact that it is bought and paid for by interested litigants, the adversarial process makes it very difficult for seriously flawed or biased work to survive to influence the outcome of the case. This is not to say that experts will not disagree. They will. Different experts can examine data in an honest and intellectually defensible way and arrive at different conclusions. However, the kinds of stunts Mann has pulled would never fly in litigation. The opposing counsel or the judge would make short work of that.

The stakes in the cases I have worked on have been in the tens of millions, the hundreds of millions, and in one instance, the billions of dollars. Cases of such magnitude are not exceptional. Because of the high economic stakes, each side expends considerable resources to present its own affirmative economic and statistical analysis, and to rebut that of its opponent.

As high as these stakes are, those in the climate change debate are substantially higher. Billions are spent on climate research. Moreover, policies to address climate change likely have direct and indirect costs running into the trillions of dollars. Scientific work like paleoclimatology or climate modeling is routinely used to justify the need for such policies. Given these stakes, “discovery” procedures like those employed in civil litigation seem more than justified. Specifically, full disclosure of data, methods, and computer code to other researchers is warranted. This will impose some costs on researchers, but knowing the burdens in advance they can take the appropriate measures to mitigate these costs. Moreover, it is essential to note that these procedures reduce costs as well–researchers seeking to replicate or evaluate others’ studies will incur a far smaller expense if methods, data, and code are made more transparent. Moreover, greater transparency will reduce the likelihood that scientific errors will go uncorrected. When research is policy relevant, as much climatological and some economic research is, such errors can be extremely costly. We have already paid a very high price to protect Michael Mann’s vanity, and the interests of those with a stake (intellectual, reputational, financial, or political) in the hypothesis of anthropogenic global warming. The point is to get the right answer, not the politically correct answer–and to get it in a timely fashion. Transparency is essential to achieve this objective.

An analogy with intellectual property law is apposite here. The key issue is whether disclosure will reduce the amount of original research by more than it will increase the “follow on” research. My intuition suggests the answer is no inasmuch as most of this research is not commercial in nature. Under these circumstances, protecting the original research through a form of “trade secrets” barrier is inappropriate. An analysis along the lines presented in the Landes-Posner book The Economic Structure of Intellectual Property Law is useful here. Landes and Posner note the tension inherent in protecting intellectual property. On the one hand, protection generates a stream of rents that induce production of intellectual property; absent such protection, copy-cats could merely reproduce the property at virtually zero cost without incurring the expense and effort of creating it in the first place, thereby freeriding off the efforts of the inventor and undermining his incentive to innovate. On the other hand, increasing IP protections raises the costs that others incur to produce new works. That is, one can view existing work as an input to new, follow on work; high IP protections raise the cost of this input, leading to less work building on the old.

Inasmuch as basic science–including paleo-climatological research–is not funded by the sale of the intellectual property, but through grants and other mechanisms, the benefits of IP protection are small. One could argue that scientists are motivated by non-pecuniary factors, such as the prestige of being the first to publish a result. At most, this justifies the protection of IP to the point of publication. Once a result has been produced and published, the benefits of protecting the IP are minimal, while the costs are appreciable. Given the cumulative nature of scientific discovery and the centrality of the replication process, past research is an essential input to future research, and raising the cost of utilizing past research imposes a burden without a corresponding benefit. I am very hard pressed indeed to provide any economic justification for Mann and others to refuse to disclose their computer code, or to make it burdensome to access their data, or to impede replication of their work by publishing only obscure–and arguably misleading–descriptions of their methodology.

Identity Theft From Hell

Filed under: Politics,Russia — The Professor @ 12:03 am

Or Russia, take your pick:

Gorbushka Market, just outside central Moscow, does a thriving trade in any electronics good you could want: mobile phones, plasma television sets, the latest DVDs, and, if you ask to see them, software peddlers will show potential clients a list of “databases”.

These consist of CDs with names such as “Ministry of Interior – Federal Road Safety Service”, “Tax Service” and “Federal Anti-Narcotics Service” and cost about $100 apiece. Each contains confidential information gathered by Russian law enforcement or government agencies: anything from arrest records, personal addresses, passport numbers, phone records or address books to bank account details, known associates, tax data and flight records are on offer.

. . . .

“American journalists must be envious of how open we are in Russia,” joked Sergei Kanev, an investigative reporter at Novaya Gazeta, an opposition newspaper, who has written articles about the information trade and occasionally uses the databases in his reporting. He says the main users of the information black market are criminals. He has covered cases of blackmail where extortionists used records of reported rapes or prostitution convictions to blackmail women.

Booting up a database of narcotics offenders, Mr Kanev says: “Look here, you have photos, addresses, phone numbers, what kind of drugs they use. It’s very common for con artists to take this database, call up a family member and tell them their son or daughter has been arrested for drugs. They pretend to be policemen and for 10,000 roubles they offer to let the kid go. The parents know their kid is a user, they think it’s true, and so they pay up. It’s all a trick”.

But don’t worry!  The REALLY important stuff is kept strictly secret:

And while it is possible to find many secrets using Russia’s information black market, there is still evidently a tight grip on the most sensitive information of all – foreign bank accounts of top officials, their ownership of assets and those of their relatives. Those are not to be found at the Gorbushka market.

That, in fact, is the main vulnerability of the siloviki.  Will anybody exploit it?  That would be a reset worthy of the name.
