What futurists can learn from superforecasters

Superforecasting: The Art and Science of Prediction by Philip Tetlock and Dan Gardner is definitely the number one must-read for futurists this winter, not least because the topic of prediction generates plenty of discussion. Despite the public perception of the futures industry as a crystal-ball business, futurists are generally reluctant to make predictions. Instead, they help their clients think about the future in a systematic way and nurture their curiosity about it in order to identify business opportunities. Nevertheless, futurists are always eager to learn a thing or two, and the art of superforecasting could be a powerful tool in the futurist toolkit.

Lack of rigor in the forecasting domain

Predictions about the future often come from media pundits and dressed-to-impress keynote speakers out to pump up an audience, or from people with a political agenda, scientists and activists alike, who want to galvanize action. These predictions tend to be imprecise, and hardly anyone ever measures their success. Tetlock points to the scientific approach, which is all about healthy scepticism, caution and nuance: scientific statements should always be read as hypotheses to be tested in the search for a better understanding of a specific issue. Tetlock’s Good Judgment Project (GJP) aims to bring this rigor to the forecasting domain. Briefly summarized, thousands of amateur forecasters received basic training and a steady stream of forecasting questions. Their scores beat the wisdom of the crowd by considerable margins. The best forecasters are selected as superforecasters; some of them operate in teams and score even better. The GJP took part in a forecasting tournament run by IARPA, in which teams of top forecasters and intelligence professionals competed against each other. IARPA held the competition to learn more about good forecasting in the service of the intelligence community. After two years, the GJP had outperformed all the other teams, so IARPA dropped them and focused on learning from the GJP.
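
One ingredient of that rigor is keeping score. Tetlock’s tournaments grade every forecast with the Brier score, essentially the squared distance between the stated probability and what actually happened. A minimal Python sketch, using the simplified binary form with invented forecasts (the two-category formulation used in the book simply doubles these numbers):

```python
# Minimal sketch: scoring probability forecasts with the Brier score.
# Simplified binary form: (forecast - outcome)^2, where outcome is 1 if the
# event happened and 0 if it did not. Lower is better; always guessing 50/50
# scores 0.25 under this convention.

def brier_score(forecast: float, outcome: int) -> float:
    return (forecast - outcome) ** 2

# Hypothetical track record: (stated probability, what actually happened)
forecasts = [(0.85, 1), (0.30, 0), (0.60, 0), (0.95, 1)]

scores = [brier_score(p, o) for p, o in forecasts]
print(f"mean Brier score: {sum(scores) / len(scores):.3f}")
```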

“It’s not about who they are, but what they do”

Throughout the book, Tetlock gives the reader a personal view of who these superforecasters are and what they do. Overall, they are intelligent people from all sorts of backgrounds who like numbers, show curiosity and enjoy digging into a subject. Most importantly, though, they work with a particular method. They take a cautious and investigative approach, continuously weighing new information, including counter-evidence, as they look for alternative points of view that could prove their initial hypothesis wrong. Unlike the forecasting pundits, they are humble about their abilities. They are conscious of how their minds can trick them, the so-called cognitive biases that futurists also fall prey to. They balance the outside and the inside view, guard against both under- and overconfidence, and try to find the errors behind their mistakes.

Dragonfly forecasting

The large multifaceted compound eyes of the dragonfly can process colour even better than the human eye. For Tetlock, the vision of these beautiful creatures represents the mentally demanding weighing of different perspectives needed to come up with an ever-so-subtle hypothesis and a precise prediction. Working together in a group can enhance this dragonfly-eye work and elevate the wisdom of the crowd. While team decision making runs the risk of groupthink, superforecasters, with their cautious and reflective mindset, are able to counteract this threat and even perform significantly better as a group.
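
What elevating the wisdom of the crowd can look like in numbers: the GJP found that averaging a group’s probabilities and then “extremizing” the result, pushing it away from fifty-fifty because each member holds only part of the available information, produced more accurate aggregate forecasts. A toy Python sketch; the team’s probabilities and the extremizing exponent below are invented for illustration, not the GJP’s actual parameters:

```python
# Toy sketch of pooling a team's probability forecasts: average the individual
# probabilities, then "extremize" the pooled value by raising the odds to a
# power a > 1, which pushes it away from 0.5. The inputs and the exponent are
# illustrative assumptions only.

def extremize(p: float, a: float = 2.0) -> float:
    odds = p / (1 - p)            # convert probability to odds
    extremized_odds = odds ** a   # push the odds toward the extreme
    return extremized_odds / (1 + extremized_odds)

team = [0.65, 0.70, 0.80, 0.60]   # individual probability estimates
pooled = sum(team) / len(team)    # simple average: 0.6875
print(f"pooled: {pooled:.2f}, extremized: {extremize(pooled):.2f}")
```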

Lessons for futurists

The working style of superforecasters could be valuable for many kinds of professions, and especially for those who are involved with the future. Like superforecasters, futurists deal with complexity and uncertainty, and they are just as eager to feed on new information, new insights and new perspectives. How can superforecasting as an approach enrich strategic foresight, trend research and all the other futurist branches?

  • Rigor in the forecasting domain provides clarity about the status of a prediction and the confidence attached to it. Forecasting tournaments yield thousands of predictions that could be used in horizon scanning projects and other foresight endeavors.
  • The art of superforecasting helps to structure our thinking about future events. It helps us identify questions for which forecasting pays off and to reformulate those questions to improve the quality of our predictions. Rethinking a question also reveals the smaller questions that give insight into the bigger one.
  • It helps to structure our thinking about the likelihood of second-, third- and fourth-order effects of policy decisions. While this is key in strategic foresight, leaders in public policy and the private sector alike also benefit greatly from this thinking style.
  • Forecasting tournaments and other crowd forecasting projects make something like “evidence-based forecasting” possible. Clients of forecasters, from the intelligence community to any organization that hires them, can now set the bar and ask for forecasting track records; a sketch of what such a track record could look like follows below.
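
A forecasting track record could be as simple as a calibration check: bin a forecaster’s past predictions by stated probability and compare each bin with how often those events actually occurred. A minimal Python sketch with an invented history:

```python
# Hypothetical calibration check for a forecaster's track record: group past
# forecasts by stated probability and compare with the observed frequency of
# the events. A well-calibrated forecaster's "70%" events should happen
# roughly 70% of the time. All numbers below are invented for illustration.
from collections import defaultdict

history = [  # (stated probability, event happened?)
    (0.9, 1), (0.9, 1), (0.9, 0), (0.7, 1), (0.7, 1),
    (0.7, 0), (0.3, 0), (0.3, 0), (0.3, 1), (0.1, 0),
]

bins = defaultdict(list)
for prob, outcome in history:
    bins[prob].append(outcome)

for prob in sorted(bins):
    outcomes = bins[prob]
    observed = sum(outcomes) / len(outcomes)
    print(f"said {prob:.0%}: happened {observed:.0%} of {len(outcomes)} times")
```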

The Good Judgment Project is truly unique. Its results are useful beyond the intelligence community that Tetlock has worked closely with. It allows thousands of people to join the crowd and become forecasters themselves. It brings rigor to the forecasting industry. And it provides thousands of forecasts that can help people think about the future and that could be used in strategic foresight projects. But most importantly, it has revealed the thinking style and behavioural patterns that characterise a good researcher of the future.