What are “flash crowds” and how do we recruit them?

By Vanessa Pineda on November 01, 2018

Our bread and butter working with clients is organizing their employees to participate in crowdsourcing exercises. Recently, though, we’ve increasingly been asked to help gather forecasts from external crowds, either to support research projects or to better understand what outside experts or customers think.

A traditional approach to finding external participants for a study is simply to pay for them, either through a research sourcing firm like ResearchNow or through a crowdsourcing task aggregator like Amazon’s Mechanical Turk.

While this approach works well for market research style “one off” exercises, what if you want rich, real-time insights from your crowd over a few weeks or months? You probably need what we’ve been fondly calling a “flash crowd”: a group of people that comes together, does what you ask of it (e.g., tackling a specific business challenge), then disperses after a period of time.

Since others may be trying to build these kinds of flash crowds for their own reasons, we wanted to share five key lessons we learned from our online recruitment efforts.

Specifically, we’ve been building our crowds using two techniques:

  • Direct email campaign to identified networks (e.g., university groups or professional organizations), and
  • Social media marketing (e.g., buying ads on Facebook).

Example 1: LG Display’s direct email marketing to build a flash crowd for a 6-month forecasting tournament

LG Display wanted a flash crowd to predict future industry performance, including TV sales and product cost trends, ahead of key marketing decisions. We agreed to run a 6-month forecasting tournament in which we would recruit and engage a crowd of at least 100 technology-savvy enthusiasts.

Our plan was to reach out to different networks through a direct email campaign. We wanted people with some technology industry experience, but immediately recognized that targeting professional societies would be a non-starter: we had to offer a real benefit to the specific audience, and given our budget for incentives and the time commitment involved, we knew it would be a hard sell to working professionals.

First lesson: Identify a specific audience with whom you can create a mutual value proposition.

Here’s what we could offer -- and therefore the kind of person it needed to resonate with: a modest monetary incentive each month or a chance at a grand prize, plus the opportunity to win a forecasting competition in their field that leveraged the LG brand.

This led us to undergraduate and graduate student groups at universities, specifically engineering, technology, or business groups within professional programs (e.g., the Chinese Business Group at Kellogg, the Engineering Group at the University of Wisconsin). Through internet research on existing student groups at top universities, we compiled a list of about 100 group leaders.

Second lesson: Create an incentives structure upfront to attract people to your effort, but be flexible with ad-hoc rewards to encourage behaviors along the way.

The incentive structure we created and announced for the tournament included three tiered grand prizes of up to $1,500 for overall accuracy and participation, as well as monthly rewards of $50-$200 for accuracy. The idea was that the grand prize would help to generate buzz and attract participants, while the monthly rewards provided smaller, more achievable milestones to keep people coming back.

Reward psychology tells us that the more distant a prize feels, the more it loses both its desirability and people’s expectation of qualifying for it -- so it was no surprise that, after the initial excitement of launch, the grand prize lost some of its luster. Participation declined significantly after a month, which we expected, but we’d hoped the monthly rewards would act as smaller carrots to bring people back. Instead, we learned that rewarding only accuracy each month actually dis-incentivized people who didn’t rank well on the first few questions to be judged. And what we really wanted was more participation from everyone, to generate greater accuracy together (not individually).

Luckily, we’d left budget for ad-hoc rewards, so we introduced new reward categories, including most useful comments and most active forecaster. We announced the rewards in a newsletter, which had a 50% open rate among participants and increased participation on the tournament site by 30% each time it went out.

Some questions on the site got little participation, likely because they were harder to predict. To promote specific content, we sent out “flash drawings” (we like the term “flash”) by email, where anyone who made a forecast on a given question during an announced window of time was entered into a lottery for a gift card reward. Forecasts at least doubled for each “flash drawing.”
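As a rough illustration, here’s a minimal sketch of how a flash drawing could work; the data shape and the draw_flash_winner helper are hypothetical stand-ins, not part of our platform:

```python
import random
from datetime import datetime

# Hypothetical sketch of a "flash drawing": anyone who forecast on the target
# question during the announced window is entered into a gift card lottery.
# The field names and helper are illustrative, not from our platform.
def draw_flash_winner(forecasts, question_id, window_start, window_end, seed=None):
    eligible = {
        f["user_id"]
        for f in forecasts
        if f["question_id"] == question_id
        and window_start <= f["timestamp"] <= window_end
    }
    rng = random.Random(seed)  # fix a seed if you want an auditable draw
    return rng.choice(sorted(eligible)) if eligible else None

forecasts = [
    {"user_id": "ana", "question_id": "q7", "timestamp": datetime(2018, 3, 5, 10)},
    {"user_id": "ben", "question_id": "q7", "timestamp": datetime(2018, 3, 6, 15)},
    {"user_id": "ana", "question_id": "q2", "timestamp": datetime(2018, 3, 5, 12)},
]
winner = draw_flash_winner(
    forecasts, "q7", datetime(2018, 3, 5), datetime(2018, 3, 7), seed=42
)
print(winner)  # one eligible forecaster, drawn at random
```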

Third lesson: Reinforce the value proposition. Put in the work for good design and branding of your flash crowd experience.

Prizes aren’t enough to draw people to your effort. A flash crowd challenge of any kind should have a compelling value proposition that you showcase somewhere and continue to reinforce. People like to feel they’re part of a mission or a group with similar interests, so you want to create a brand identity and tell your story.

This typically takes the form of a project microsite or landing page that speaks to your audience and has a clear call to action (registration/sign-in). For the LG tournament, we designed a landing page that matched LG’s brand identity and outlined the value of the challenge for participants and for the company. It was where we directed people in our email outreach, and where they could refer others.

Final flash crowd stats: In the six weeks leading up to the tournament, dedicating only a few hours per week, we recruited 250+ people (exceeding our goal of 100).

Example 2: U.S. Government flash crowd for a 9-month forecasting experiment

Another client of ours within the U.S. Government wanted to test buying ads online to recruit participants for a 9-month crowdsourced forecasting study.

The study had already run a first round, using another company to execute the initial recruitment campaign. But that campaign had been costly, and attrition over the duration of the study was high, so the client wanted to run an experiment to see whether a high “cost per acquisition” was inevitable, or whether recruiting costs could be driven down.

Our challenge: we had two weeks to buy ads and recruit a greater number of engaged participants (~600) for less money.

Fourth lesson: Avoid broad mass marketing when you can precisely target people in more cost-effective ways.

People often assume that if you can afford high-visibility ad channels, that’s the way to go. When we analyzed performance from the previous marketing campaign, it was clear the recruitment strategy had been to buy expensive ad space on prominent podcasts and pour six figures into Facebook ads to reach as many people as possible.

But with so much targeting available at our fingertips, we knew there was a better answer. Though our client was heavily biased against using Facebook again, our research showed that social media marketing (compared to pay-per-click, display, or other online marketing channels) was the best strategy for both creating awareness of the study and doing finer-grained audience targeting. And as the largest social platform in the world, Facebook’s targeting capabilities outclassed every other network’s. We did work with Reddit too, but its campaign management tools and approval process were subpar, and we simply ran out of time to experiment further with the platform.

To avoid repeating the last campaign’s mistakes on Facebook, we knew we needed people who would be motivated by the “brand” of the agency we were working with, and who would be eager to contribute their expertise to forecasting geopolitical events. We set up distinct audience segments by profession, interests, age, and even behaviors, and tested their uniqueness with Facebook’s audience overlap tool to help prevent ad fatigue (see the sketch below). Our main audiences were people who worked in or had retired from government, U.S. veterans, political science students, and “news junkies.”
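To make the overlap idea concrete, here’s a conceptual sketch of the percentage check such a tool performs; the audiences and the overlap_pct helper are made-up illustrations, since the real comparison runs inside Facebook’s Ads Manager on its own user data:

```python
# Conceptual sketch of the check an audience overlap tool performs:
# the share of one saved audience also contained in another.
# The audience sets below are illustrative only.
def overlap_pct(audience_a: set, audience_b: set) -> float:
    if not audience_a:
        return 0.0
    return 100.0 * len(audience_a & audience_b) / len(audience_a)

veterans = {"u01", "u02", "u03", "u04"}
news_junkies = {"u03", "u04", "u05", "u06", "u07"}

# High overlap means the same people see competing ads from multiple ad sets,
# which drives up frequency and invites ad fatigue -- so keep segments distinct.
print(f"{overlap_pct(veterans, news_junkies):.0f}% of veterans are also news junkies")
```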

Fifth lesson: Don’t just let online ads run themselves; performance only improves through active optimization.

Even coming in as a novice to Facebook, you can’t just select audiences, run some ads, and hope for the best. Facebook does some optimization for you, but it’s not enough, and results can decline fast.

We started with 3-4 ads per audience segment so that we could quickly see which imagery, headlines, and copy performed better. After a few days, we compared our best and worst ads, stopped running the worst, and replaced them with new ones.
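In code terms, the rotation boils down to ranking ads by cost per acquisition and pausing the laggards. A minimal sketch, with illustrative ad names, spend, and signup counts rather than our actual campaign data:

```python
# Rank each ad in a segment by CPA, pause the worst performer, and free
# that budget for fresh creative. All numbers here are illustrative.
ads = [
    {"name": "mission-copy-A", "spend": 140.0, "signups": 50},
    {"name": "prize-copy-B",   "spend": 120.0, "signups": 11},
    {"name": "mission-copy-C", "spend": 90.0,  "signups": 20},
]

for ad in ads:
    # CPA = ad spend / registered participants the ad converted
    ad["cpa"] = ad["spend"] / ad["signups"] if ad["signups"] else float("inf")

ranked = sorted(ads, key=lambda a: a["cpa"])  # best (lowest CPA) first
keep, pause = ranked[:-1], ranked[-1:]        # cut the worst performer
print("keep: ", [(a["name"], round(a["cpa"], 2)) for a in keep])
print("pause:", [(a["name"], round(a["cpa"], 2)) for a in pause])
```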

One key discovery we reacted to quickly: ads that gave meaning to an audience’s skill (“we need your unique XYZ skills to help advance U.S. forecasting intelligence”) performed better across most audiences than ads with an extrinsic motivator (“you could win up to $300 for completing the study”). This finding informed the design and copy of subsequent ads.

With all of the ways to optimize, we had to get familiar with Facebook’s tools and diagnostics. We also focused heavily on one main metric to evaluate ad performance: cost per acquisition (also called cost per action, or CPA), i.e., the ad spend divided by the number of registered study participants an ad converted. The lower the cost to convert a participant, the better. On individual ads, our CPA on Facebook ranged from as low as $2.80 to as high as $11. Our average CPA was $5, compared to an average of $9 in the prior campaign.

Final flash crowd stats: During two weeks of running ads, we recruited 1,100 participants (exceeding our goal of 600) and decreased advertising costs by 50%.

We consider these five lessons applicable across the board to any flash crowd effort. And though we take a more organic approach to keep recruitment costs down, remember that you still need a dedicated budget for ad buys and ongoing incentives, commensurate with the size and time commitment of your initiative.

We’d be curious to hear what business challenge you’d pose to a flash crowd. Send us a note!

If you're interested in the forecasting application that supports these efforts, visit the Cultivate Forecasts page.

Vanessa


Vanessa is the Director of Professional Services at Cultivate Labs.

