Cultivate Labs Support Articles

How to approach an initial probabilistic forecast 

To state the obvious, the future is uncertain. But there are techniques you can use to arrive at a probabilistic forecast in such a way that you don’t feel like you’re simply taking a wild guess. 

How you approach a forecast should also be explained in your rationale since rationales provide qualitative data that is just as important as the quantitative forecasts for the decision makers trying to use your forecasts. The more information you share about what led you to your probabilistic estimate, the more it helps other forecasters better think through their own estimates, which can increase collective accuracy. A thoughtful rationale also helps you avoid hindsight bias once questions resolve so you can learn from your hits and misses (“Was I right for the right reasons or the wrong reasons?”). You can almost think of it as a diary you're keeping over time about your judgments.

Below are general guidelines to help you with your probabilistic forecast and for explaining your thought process in your rationale.

1. Start with the outside view, use base rates


When making forecasts, a “base rate” is the prior probability of something happening. You find base rates by looking at cases similar to the one you’re trying to forecast, an approach Kahneman and Tversky call the outside view.1 Let’s say you’re forecasting the probability that a project within your organization will launch by its deadline. Most people tend to over-emphasize what they know about the project itself (the inside view).2 For example, thoughts like “the people on the team are excellent,” “this project has a ton of resources behind it,” and “leadership is supportive” may lead you to a high probability estimate like 80%.

But savvy forecasters start by looking at the outside view – the relevant base rates:

  • How often do projects launch on time company-wide? Let's say 60%
  • What was our success rate last year and the year before? Maybe 50%
  • Or, consider that business data suggests that generally 70% of all project launches fail. So, just 30% succeed

In this case, looking at relevant base rates gives you a more useful range (30-60% success) on which to anchor your forecast. So you might lower your initial 80% to something closer to 55%.

You can see how this helps mitigate the planning fallacy, a phenomenon prevalent in many organizations and first described by Kahneman and Tversky in 1979, in which predictions of how much time a future task will take display an optimism bias and underestimate the time actually needed.3 Explaining the base rates you used in your rationale will help other forecasters minimize their own biases.
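To make the anchoring step concrete, here is a minimal sketch in Python using the hypothetical base rates from the bullets above. The weights and the averaging itself are illustrative assumptions rather than a prescribed formula – the point is simply to write the outside view down before the inside view gets a vote; giving some remaining weight to your inside view is what would move this anchor toward the 55% mentioned above.

    # A minimal sketch: combine hypothetical base rates into an outside-view anchor.
    # Weights are illustrative - give more weight to the rates you consider most
    # comparable to the project you are forecasting.
    base_rates = {
        "company-wide on-time launch rate": (0.60, 2.0),
        "our own launch success rate, last two years": (0.50, 2.0),
        "general project launch success rate": (0.30, 1.0),
    }

    rates = [rate for rate, _ in base_rates.values()]
    anchor = (sum(rate * weight for rate, weight in base_rates.values())
              / sum(weight for _, weight in base_rates.values()))

    print(f"Outside-view range: {min(rates):.0%} to {max(rates):.0%}")  # 30% to 60%
    print(f"Weighted anchor: {anchor:.0%}")  # 50% with these illustrative weights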

2. Consider the unique qualities of the case at hand 


Let's continue with the project launch example. After identifying base rates, you can return to the inside view – what you know about this project that makes it unique. For example:

  • Has it completed all other milestones leading up to launch on time?
    Yes
  • Are there other milestones before launch to look out for?
    Yes, and all signs point to meeting them
  • What needs to happen for a successful launch – is the supply chain/operations looking on track or have there been hold-ups?
    Haven't heard of any hold-ups
  • Are there competing leadership priorities that might conflict with launch?
    Yes, other big project launches happening at the same time
  • Is staffing and resourcing in place?
    Yes
  • Has the team stayed on budget, or gone over?
    Slightly over budget, but not necessarily a negative sign for launch

If all factors are mostly positive, then you'll want to adjust your forecast back up to about a 65-70% chance of success. In your rationale, outline these unique factors you considered about the project, along with your base rates. As you see other forecasters share what they considered, you're likely to come across other factors you may have missed, and subsequently update your forecast.
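To illustrate this adjustment step, here is a minimal sketch that starts from the outside-view anchor and nudges it up or down for each inside-view factor listed above. The individual adjustment sizes are assumed for illustration; in practice they reflect your own judgment about how much each factor matters.

    # A minimal sketch: adjust the outside-view anchor for inside-view factors.
    # Adjustment sizes are illustrative assumptions, not a prescribed formula.
    anchor = 0.55  # outside-view anchor from step 1

    adjustments = {
        "milestones so far met on time": +0.05,
        "remaining milestones look on track": +0.04,
        "no known supply chain hold-ups": +0.03,
        "competing leadership priorities at launch time": -0.04,
        "staffing and resources in place": +0.03,
        "slightly over budget, but not a launch risk": 0.00,
    }

    forecast = anchor + sum(adjustments.values())
    forecast = min(max(forecast, 0.01), 0.99)  # avoid unwarranted 0% or 100%
    print(f"Adjusted forecast: {forecast:.0%}")  # 66%, within the 65-70% range above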

3. Seek out contradictory information


Our minds like to play tricks on us, exhibiting common biases when forecasting. One is the tendency to favor evidence that confirms a pre-existing belief (confirmation bias); another is the tendency to be more confident in our judgments than the evidence warrants (overconfidence bias).4 When you’re looking for base rates or evaluating the inside view, make sure you’re checking your gut and playing devil’s advocate with yourself. Share some of the opposing views you weighed for your forecast in your rationale, and encourage others to challenge you – “If I’m wrong, why might that be? What am I not considering?” Be open to a healthy back and forth in the comment thread.

4. Understand how confident you are in your estimate


One of the last things you’ll want to check before you finalize your forecast: “How confident am I in my assumptions?” and “How confident should I be?” Simply checking in with yourself on this will help you adjust your estimate lower or higher to match the strength of your evidence – and it’s helpful to note this in your rationale.

Many of our forecasting platforms include an optional section for adding a pre-mortem (a brief explanation of why you might be wrong) alongside the rationale when submitting your forecast. This reinforces the habit of seeking out information that contradicts your intuition and helps you avoid extreme forecasts (e.g. 0% or 100%) when they’re not warranted. You can often uncover a new and helpful perspective simply by asking yourself, “If my forecast turns out to be wrong, what factors might have contributed to that unexpected result?”

Once you've arrived at your forecast, don’t be afraid of being “wrong” on your first try. Since Cultivate's online forecasting platform is not a survey, you’re encouraged to go back regularly and resubmit updated forecasts on a question until its close date. Research has shown that employing the techniques above and updating your forecasts in small increments over time makes forecasters more accurate.
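If it helps to picture what small, repeated updates look like, here is a minimal sketch. The news items, the estimates they suggest, and the weights are all hypothetical; the idea is just to move your standing forecast part of the way toward new evidence rather than overhauling it each time.

    # A minimal sketch of incremental updating (all values hypothetical).
    forecast = 0.66  # current standing forecast

    # Each item: (what you learned, where that evidence alone would put you,
    # how much weight you give it relative to your current view).
    news = [
        ("key supplier confirmed delivery dates", 0.80, 0.3),
        ("leadership attention shifted to another launch", 0.50, 0.2),
    ]

    for note, evidence_estimate, weight in news:
        forecast = (1 - weight) * forecast + weight * evidence_estimate
        print(f"After '{note}': {forecast:.0%}")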

So, practice, practice, practice - happy forecasting!


---

Sources:

1,2 Kahneman, D. (2011). Daniel Kahneman: Beware the inside view, McKinsey Quarterly.

3 Kahneman, D., & Tversky, A. (1979). Intuitive prediction: biases and corrective procedures. TIMS Studies in Management Science, 12, 313–327.

4 Good Judgment Inc. training (2018).