Tell us how you may be wrong

By Adam Siegel on May 17, 2022

The workhorse rationale box that asks our forecasters "Why are you forecasting this way?" has been a staple - and unsung hero - of our platform for years. More than anything else, consumers of our forecasts want to understand why the numbers are what they are. Why is the consensus going up? Why did it go down? Why is this person such a contrarian? What sources are people citing for their assessments?

As consumers learn to use these forecasts in their own analysis and decision-making, we've been thinking about how to make sure they see a complete picture - not just the consensus tracked by the graph visualization, but also why that consensus may be wrong. We want consumers to always question their assumptions and question the consensus.

A couple of weeks ago we launched a new sorting mechanism for forecasts: the "contrarian sort," which brings the forecasts and rationales most at odds with the current consensus to the forefront.
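For a binary question, a contrarian sort like the one described above can be sketched as ranking forecasts by their distance from the consensus probability. This is a minimal illustration only - the data shape, field names, and distance metric are assumptions, not our actual implementation:

```python
def contrarian_sort(forecasts, consensus):
    """Return forecasts ordered most-contrarian first.

    forecasts: list of dicts with a 'probability' key (0.0-1.0)
    consensus: the current consensus probability (0.0-1.0)
    """
    # The farther a forecast sits from the consensus, the earlier it ranks.
    return sorted(forecasts,
                  key=lambda f: abs(f["probability"] - consensus),
                  reverse=True)

# Hypothetical example data for illustration.
forecasts = [
    {"user": "alice", "probability": 0.62},
    {"user": "bob",   "probability": 0.10},
    {"user": "carol", "probability": 0.55},
]
consensus = 0.60

ranked = contrarian_sort(forecasts, consensus)
print([f["user"] for f in ranked])  # bob first: farthest from consensus
```

A production version would also need to handle multi-option questions and tie-breaking, but the core idea is simply "sort by disagreement."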

Soon we'll be launching an enhancement to the rationale input itself: the "pre-mortem" input. This is an optional text area where forecasters can explain why they think their forecast could be wrong.

Testing it internally over the past couple of weeks has already forced me to think differently about my forecasts. The simple act of openly articulating why my current assessment could be wrong has led me to moderate assessments that would otherwise have been more extreme. Then, as time goes on, I can revisit my older pre-mortems to see whether any of those risks have come to pass and update my forecast accordingly.

It's also helpful, of course, as a collaborative exercise: everyone can see each other's pre-mortems in the activity feeds.

We don't think this burdens forecasters too much; instead, it will both give consumers an entirely new source of insight and serve as a check on what may otherwise be an overeager forecaster!
