Crowdsourced Forecasting & Why It's the Best Forecasting Tool Around

By Ben Roesch on August 16, 2015


Francis Galton was an English scientist who believed that the average human was quite stupid. To have a healthy society, he argued, you needed to concentrate power in the select few who didn't fit that bill.

In 1906, Galton attended the West of England Fat Stock and Poultry Exhibition. While he was likely years too early to enjoy some fried ice cream at the fair (sorry Francis, your loss), he did stumble upon an opportunity to prove his theory regarding human intelligence. Galton noticed attendees placing bets on the weight of a prized ox and he managed to get his hands on the 787 betting slips that were submitted. Surely, this crowd of country bumpkins would be wildly off the mark on their guesses.

The weight of the ox was 1,198 pounds. Imagine Galton's shock when he tallied the slips: the average guess was 1,197 pounds.

While this result was bad news for Galton's theory, it illustrates a simple yet powerful concept. The forecasts of an individual may be horribly wrong or spot on, but the variability of individual performance makes it hard to know which individual to trust (think money managers). If you aggregate the predictions of a crowd of people, however, you're much more likely to come up with a very strong forecast (think index funds).
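To see why averaging works, here's a toy simulation in Python. The spread of the guesses is a made-up assumption (Galton published the average, not the full distribution), but the effect holds for any crowd whose errors are independent and roughly unbiased:

```python
import random

random.seed(42)

TRUE_WEIGHT = 1198  # pounds, the ox's actual weight
N_GUESSERS = 787    # number of betting slips Galton collected

# Each guesser is individually noisy: right on average, but with a
# large spread (the 75-pound standard deviation is a hypothetical).
guesses = [random.gauss(TRUE_WEIGHT, 75) for _ in range(N_GUESSERS)]

crowd_estimate = sum(guesses) / len(guesses)

# Compare a typical individual's error to the crowd's error.
avg_individual_error = sum(abs(g - TRUE_WEIGHT) for g in guesses) / len(guesses)
crowd_error = abs(crowd_estimate - TRUE_WEIGHT)

print(f"Average individual error: {avg_individual_error:.1f} lbs")
print(f"Crowd (mean) error:       {crowd_error:.1f} lbs")
```

When errors are independent and centered on the truth, averaging cancels them out: the crowd's error shrinks roughly with the square root of the crowd size. In this sketch, individual guessers miss by about 60 pounds on average, while the crowd of 787 typically misses by only a couple of pounds.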

Ok, let's assume I believe you. Why does this work?

To predict or forecast something with consistent accuracy, we want information that has three key attributes:

1. Bias-free

Let's say you're an executive overseeing a large software project. You have a number of team leads that report to you: functional, technical, QA/test. Every Friday, each team lead is responsible for sending you a red/yellow/green status report for their portion of the project.

Every Friday, mostly green status reports roll in, until you reach a late stage of the project. Out of the blue, the technical lead comes to you and tells you there are major issues and the team won't hit a major deadline. You wonder why in the world you're just hearing about this now. What was the deal with all those greens leading up to this?

The problem is that each status report is inherently biased. No one wants to be the person who stands up and says "the portion of the project I'm responsible for is totally FUBAR" and risk being replaced. Instead, the incentive is to send in a green status report and try to work the issue out internally.

To make good decisions, this executive needs information that accurately depicts the state of the project. In other words, we need information that is unbiased.

2. Diverse

Prior to the Bay of Pigs invasion, the Kennedy administration planned and executed the invasion without consulting many people and agencies who may have been skeptical of the plan. The intelligence branch of the CIA was not consulted, nor was the Cuban desk of the State Department. Instead, the people who planned the invasion were the same people responsible for judging the likelihood of its success. The result was an unmitigated disaster.

By avoiding or ignoring potential dissenting opinions, the administration fell into groupthink, where everyone assumed the plan was sound. The entire plan hinged on 1,200 men being able to take over all of Cuba, yet no one spoke up about the ridiculousness of this assumption.

When groupthink like this occurs, people walk away from each discussion with their ideas reinforced rather than challenged. Dissent and diversity of opinion are a healthy, necessary part of the decision-making process because they force us to consider and plan for all the angles.

3. Complete

Imagine you're a purveyor of adult beverages and you're looking to introduce a new brew to juice your summer sales. Should you come out with a new cider? A new light pilsner? Or maybe one of those lemonade beers? If the manager of the new products division makes the call, maybe she'll pick right, maybe not.

Now imagine that this company has an army of sales and distributor reps, each of whom talks daily with potential customers. Reps selling to bar owners hear that they can't get enough cider taps these days. Other reps hear that their distributors' customers have been growing their cider orders as well. And still others hear that sales of fruity beers are trending down.

It's impossible for a single individual to have all of these data points. But if we could aggregate all of this knowledge into a "complete" picture about which beer will sell best, it would be very valuable to the manager making the call.
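Even a crude aggregation makes the point. Here's a minimal sketch (the products and rep reports are invented for illustration) that tallies what each rep is hearing into a single demand picture:

```python
from collections import Counter

# Hypothetical field reports: each rep logs the product category
# their customers are asking about most this week.
rep_signals = [
    "cider", "cider", "light pilsner", "cider",
    "lemonade beer", "cider", "light pilsner", "cider",
]

# Tally the individual signals into one "complete" picture.
demand = Counter(rep_signals)
print(demand.most_common())
# [('cider', 5), ('light pilsner', 2), ('lemonade beer', 1)]
```

No single rep saw the whole picture, but the tally across all of them makes the cider trend hard to miss.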

Sounds pretty great. How do I actually harness this concept in a business?

Crowdsourced forecasting platforms (including prediction markets) are online tools (e.g. Cultivate Forecasts) where a group of users can input predictions about questions, events, or metrics. By harnessing a large group of people, these platforms ensure that the resulting forecast data is both "complete" and diverse. Predictions can also be made anonymously, which helps root out bias caused by misaligned incentives or fear of reprisal. By satisfying all three requirements, a prediction market can produce much more accurate forecasts of key business metrics and events.
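At its simplest, the aggregation can be a plain average of everyone's probability estimates. Real platforms, Cultivate Forecasts included, typically use more sophisticated weighting or market prices, but a minimal sketch (with hypothetical forecasters and numbers) looks like this:

```python
from statistics import mean

# Hypothetical anonymous probability forecasts (0.0-1.0) for a
# yes/no question, e.g. "Will we hit the Q3 ship date?"
forecasts = {
    "user_a": 0.80,
    "user_b": 0.65,
    "user_c": 0.30,  # the dissenting voice groupthink would silence
    "user_d": 0.70,
}

# Simplest possible aggregation: the unweighted mean.
consensus = mean(forecasts.values())
print(f"Crowd consensus: {consensus:.0%} chance of YES")  # 61%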

Prediction markets are already being used by numerous businesses for a variety of use cases. For example, Ford has used prediction markets to forecast vehicle sales and, when it retrospectively compared the results to its previous forecasting methods, found the prediction market to be much more accurate. Based on these improved forecasts, Ford could better plan a variety of activities, including supply chain, inventory management, and marketing spend.

If you're interested in hearing more about crowdsourced forecasting and prediction markets, you can contact the author of this post at [email protected].

If you'd like to try your hand at prediction markets, you can also participate in a Public Market.

Tags: prediction markets, enterprise crowdsourcing, crowdsourced forecasting