Separating cheap talk from truly held beliefs

In his book, Plight of the Fortune Tellers, Riccardo Rebonato describes how an invitation to bet can be used to separate cheap talk from truly held beliefs (and, in the process, ruin an otherwise engaging dinner conversation).

In the early to mid 1990s in the United Kingdom and in other European countries a widespread fear developed that a variant form of CJD might spread to humans. CJD is a fatal illness—also known as “mad cow disease”—that is well-known to affect bovines. The variant form was thought to have contaminated human beings via the ingestion of beef from cattle affected by the disease. … When the first human cases appeared scientists did not know whether they were observing the tip of an iceberg or whether the relatively few observed cases, tragic as they were, constituted a rather limited and circumscribed occurrence. “Expert scientists” were soon willing to go on record with statements to the effect that “it could not be excluded” that a catastrophe was unfolding. The nonscientific press was all too eager to jump on the bandwagon, and extravagant claims were soon presented, such as that hundreds of thousands, or perhaps even millions, of lives could be lost over the next decade. Specific probabilities were not stated, but the prominence of the reporting only made sense if the possibility of this catastrophic event was nonnegligible: the newspapers, at least judging by the inches of column space devoted to the topic, were not talking about a risk as remote as being hit by a meteorite.

As the months went by … the number of cases did not significantly increase…. Looking at the data available at the time with a statistical eye, I was becoming increasingly convinced that the magnitude of the potential effect was being greatly exaggerated. At just the same time, a well-educated, but nonscientist, friend of mine (a university lecturer) was visiting London and we decided to meet for dinner. As the conversation moved from one topic to another, he expressed a strong belief, formed by reading the nonscientific press, that the spread of CJD would be a major catastrophe for the U.K. population in the next five to ten years. He was convinced, he claimed, that “hundreds of thousands of people” would succumb to the disease. … I challenged him to enter a bet, to be settled in ten years’ time, that the number of occurrences would not be consistent with a major epidemic. My friend refused to take me up on my offer, despite my very attractive odds (attractive, that is, given his stated subjective probabilities). He claimed that “one does not bet on these things”; that he found my proposal distasteful; that, anyhow, he was not a betting man; and so on. I explained that I was not trying to gain material advantage from a possible human disaster, but I was simply probing the strength of his convictions on the matter. Ultimately, the bet was not entered, and the evening was rather spoiled by my proposal.
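A back-of-the-envelope way to see why refusing such odds is revealing: if the friend really assigned a high probability to a major epidemic, a bet priced well below that probability would have strongly positive expected value for him. The sketch below works through the arithmetic; the odds and the 90% figure are purely illustrative, since Rebonato does not report the exact terms he offered.

```python
# Illustrative expected-value check of a bet against a stated belief.
# All numbers here are hypothetical, not Rebonato's actual offer.

def expected_profit(belief_prob, stake, payout_if_right):
    """Expected profit for the bettor, evaluated at the bettor's own probability."""
    return belief_prob * payout_if_right - (1 - belief_prob) * stake

# Suppose the friend's talk of "hundreds of thousands of people" implies
# he thinks a major epidemic is, say, 90% likely.
friend_belief = 0.90

# Suppose the sceptic offers 5-to-1 odds: stake 100 to win 500 if the epidemic occurs.
stake, payout = 100, 500

print(expected_profit(friend_belief, stake, payout))  # 440.0: strongly positive
# Turning down a bet this favourable suggests the stated belief is cheap talk,
# not a probability the speaker would actually act on.
```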

Julian Simon’s bet with Paul Ehrlich is perhaps the most famous example of the use of a bet to test the strength of convictions. Robin Hanson has done a substantial amount of work on the foundations of such “Idea Futures” mechanisms. A similar concept underlies Long Bets and the Simon Exchange.

At Long Bets they say, “Long Bets is about taking personal responsibility for ideas and opinions.” That is the basic idea I had in mind when I suggested that “it would be a real public service to run well-conceived prediction markets based on the grandiose political pronouncements of the ‘chattering classes’.” It is all about an author taking personal responsibility for the opinions he publishes by, in effect, using the prediction market to offer to fund countering opinions on well-defined claims if and only if those countering opinions turn out to be true.

(See also Chris Masse’s post. I’m not claiming any originality on my part here; I’m just trying to nudge the idea closer to common practice by suggesting a potentially interesting and fruitful area of application.)

Naomi Klein? Ann Coulter? Pat Buchanan? Michael Moore? Maybe they believe what they write, and would be willing to subsidize a prediction market out of their book royalties to demonstrate the strength of their convictions. Or how about the books from the current crop of U.S. presidential candidates—I wonder if these books contain any claims that are specific and substantive enough to be either true or false.

If such punditry-based prediction markets were common, mistaken-but-honest demagogues (those pundits who actually believe what they write, and are willing to stand behind it) would end up subsidizing more thoughtful analysts participating in the markets; correct honest demagogues would end up taking home larger financial rewards; and dishonest demagogues would dissemble, seek to avoid being pinned down on specific claims, and when pressed for actionable claims they would run and hide.
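To make the settlement mechanics concrete, here is a deliberately simplified sketch of one way such a pundit subsidy could work: the pundit stakes royalties on the claim being true, sceptical analysts stake against it, and at resolution the losing side's pool is split among the winners. The class, names, and amounts are all hypothetical, and real designs (Hanson's market scoring rules, for instance) are considerably more refined.

```python
# Minimal sketch of a pundit-subsidized market on a binary claim.
# Participants, amounts, and the pooling rule are all illustrative assumptions.

class ClaimMarket:
    def __init__(self, claim):
        self.claim = claim
        self.stakes = {"YES": {}, "NO": {}}  # side -> {participant: amount staked}

    def stake(self, participant, side, amount):
        self.stakes[side][participant] = self.stakes[side].get(participant, 0) + amount

    def settle(self, outcome):
        """Winners recover their stake plus a pro-rata share of the losing pool."""
        losing_side = "NO" if outcome == "YES" else "YES"
        winners = self.stakes[outcome]
        losing_pool = sum(self.stakes[losing_side].values())
        winning_total = sum(winners.values()) or 1
        return {p: amt + losing_pool * amt / winning_total for p, amt in winners.items()}

# A pundit backs a grandiose claim with book royalties; analysts take the other side.
market = ClaimMarket("Hundreds of thousands of UK deaths from vCJD within ten years")
market.stake("pundit", "YES", 10000)
market.stake("analyst_a", "NO", 3000)
market.stake("analyst_b", "NO", 1000)

# If the claim turns out false, the pundit's stake subsidizes the analysts.
print(market.settle("NO"))  # {'analyst_a': 10500.0, 'analyst_b': 3500.0}
```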

[Cross posted at Knowledge Problem.]