U.S. Supreme Court Prediction Market – [PAPER]

Recently posted to SSRN: FantasySCOTUS: Crowdsourcing a Prediction Market for the Supreme Court, a draft paper by Josh Blackman, Adam Aft, and Corey Carpenter assessing the accuracy of the Harlan Institute’s U.S. Supreme Court prediction market, FantasySCOTUS.org. The paper compares the accuracy of FantasySCOTUS, which relied on a “wisdom of the crowd” approach, with that of the Supreme Court Forecasting Project, which relied on a computer model of Supreme Court decision making. From the paper’s abstract:

During the October 2009 Supreme Court term, the 5,000 members made over 11,000 predictions for all 81 cases decided. Based on this data, FantasySCOTUS accurately predicted a majority of the cases, and the top-ranked experts predicted over 75% of the cases correctly. With this combined knowledge, we now have a method to determine with a degree of certainty how the Justices will decide cases before they do. . . . During the October 2002 Term, the [Forecasting] Project’s model predicted 75% of the cases correctly, which was more accurate than the [Supreme Court] Forecasting Project’s experts, who only predicted 59.1% of the cases correctly. The FantasySCOTUS experts predicted 64.7% of the cases correctly, surpassing the Forecasting Project’s experts, though the difference was not statistically significant. The Gold, Silver, and Bronze medalists in FantasySCOTUS scored staggering accuracy rates of 80%, 75%, and 72%, respectively (an average of 75.7%). The FantasySCOTUS top three experts not only outperformed the Forecasting Project’s experts, but they also slightly outperformed the Project’s model – 75.7% compared with 75%.

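The accuracy comparison in the abstract is easy to sanity-check. Below is a minimal Python sketch of the kind of two-proportion z-test that could underlie the “not statistically significant” finding; it is not the paper’s actual methodology, and the correct-case counts and the 68-case total assumed for the Forecasting Project are illustrative assumptions, not figures taken from the paper.

```python
from statistics import NormalDist

def two_proportion_z_test(successes_a, n_a, successes_b, n_b):
    """Return the z statistic and two-sided p-value comparing two proportions."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    p_pool = (successes_a + successes_b) / (n_a + n_b)
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Illustrative inputs only:
# FantasySCOTUS crowd: 64.7% of the 81 decided OT2009 cases, rounded to 52 correct.
# Forecasting Project experts: 59.1% of an assumed 68 OT2002 cases, rounded to 40 correct.
z, p = two_proportion_z_test(52, 81, 40, 68)
print(f"z = {z:.2f}, p = {p:.2f}")  # p comes out well above 0.05, consistent with "not significant"

# The medalists' average cited in the abstract: (80 + 75 + 72) / 3 is roughly 75.7%.
print(sum([80, 75, 72]) / 3)
```
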
You can download a copy of the draft paper here.

[Crossposted at Agoraphilia, Midas Oracle, and MoneyLaw.]
