The Secret Ingredients of ‘Superforecasting’
Ordinary people, properly selected and supported, can out-predict professional intelligence analysts. That was the inescapable conclusion of the Good Judgment Project (GJP), a forecasting tournament launched by Wharton professors Philip Tetlock and Barbara Mellers. From 2011 to 2015, the US government-funded online initiative pitted the predictive powers of ordinary people against Washington, DC intelligence analysts on the most significant geopolitical questions of the day. Over successive rounds, Tetlock and Mellers identified the very best prognosticators in the 25,000-strong participant pool and shunted them into elite teams. Even though the Beltway experts had access to classified data and intelligence reports, the GJP's superforecaster squads beat them in predictive accuracy by about 30 percent.
But there was more to the GJP’s success than merely identifying and grouping superforecasters. Along the way, Tetlock and Mellers developed three interventions – training, teaming and tracking – that improved prediction quality for superforecasters and average folks alike. This feature of the GJP may be the most appealing for companies, as even a modest increase in the overall accuracy of a firm’s predictions could unlock tremendous value.
Training referred specifically to tutorials in probabilistic reasoning, which taught tools and techniques for testing assumptions, spotting relevant patterns in past data, and avoiding common errors in judgment. Teaming, as you might expect, involved grouping individuals together so they could share information and challenge one another before making a prediction. Tracking was the practice, mentioned above, of separating the highest performers into elite squads of superforecasters.
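The article doesn't spell out how performance was measured, but the GJP's published work scored forecasts with the Brier score: the squared error between a probability forecast and the eventual 0/1 outcome, where lower is better. Here is a minimal Python sketch of how tracking might work under that scoring rule; the forecaster names, forecast records, and promotion cutoff are all hypothetical illustrations, not GJP data.

```python
# A minimal sketch of "tracking": score each probability forecast with
# the Brier score (lower is better) and promote the lowest-scoring
# forecasters to an elite team. All data below is hypothetical.

from statistics import mean

def brier_score(probability: float, outcome: int) -> float:
    """Squared error between a probability forecast and the 0/1 outcome."""
    return (probability - outcome) ** 2

# Hypothetical records: (forecaster, predicted probability, actual outcome).
forecasts = [
    ("ana",   0.90, 1),
    ("ana",   0.20, 0),
    ("ben",   0.60, 1),
    ("ben",   0.70, 0),
    ("carol", 0.95, 1),
    ("carol", 0.10, 0),
]

# Collect each forecaster's scores, then rank by average Brier score.
scores: dict[str, list[float]] = {}
for name, probability, outcome in forecasts:
    scores.setdefault(name, []).append(brier_score(probability, outcome))
ranking = sorted((mean(values), name) for name, values in scores.items())

# "Tracking": the lowest-scoring forecasters form the elite squad.
TOP_N = 2  # hypothetical promotion cutoff
elite = [name for _, name in ranking[:TOP_N]]
print(elite)  # -> ['carol', 'ana']
```

Because the Brier score rewards well-calibrated confidence, consistently low averages over many questions are hard to achieve by luck, which is what makes it a plausible basis for picking out superforecasters rather than one-off winners.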