Part Four – Online Surveys Must Cast a Wider Net and Pay Incentives
When we look outside the radio industry at other businesses using online consumer surveys to guide strategy, there is no shortage of confidence in the online survey approach. Wherever we looked – food, cosmetics, travel, clothing, pharmaceuticals, politics – everyone else seems to be doing online research and is supremely confident in the results. So what are they doing that Radio is not? Two answers stand out: broader panel recruitment and paid incentives.
In the past few weeks we have looked at many examples of online research conducted for other industries targeting consumers and professionals. Several incorporate intricate multimedia in their surveys. A large supermarket chain uses virtual store shelves and a shopping cart. Coors Light showed a storyboard for a potential TV spot. Levi’s used photo images to test preferences for pocket designs and preferred fading of blue jean colors. Southwest Airlines tracked brand association with different traveler benefits. GlaxoSmithKline probed doctors on heart disease. If you think it’s hard to find radio listeners to do a music test, imagine how hard it is to get highly paid, busy cardiologists to sit through an hour-long pharmaceutical survey. Yet these surveys are all done online, and the confidence level in the results at these major corporations is extremely high.
In a nationwide telephone callout survey of 362 music radio P1’s aged 18 to 54, conducted by Kelly Music Research, only 9% of respondents reported being regular participants in online Radio surveys. However, 46% reported participating in online surveys for other industries. When we asked our telephone callout panel if they would participate in online Radio surveys that pay an incentive, the number who said “Yes” jumped to 72%.
To get more qualified respondents and reliable data from online research, we need to cast a wider net. Broader recruitment strategies for online surveys, such as telephone outreach, are necessary. Incentivizing panelists with a paid premium will further increase online research response rates. Unfortunately, offering “a prize” that a participant “might win” is not enough incentive for most people.
As radio stations populated their e-mail databases, many hoped it would lead to a cheap way of getting reliable research. Cheap yes, reliable no. Most radio station databases were built around contests and promotions. As mentioned earlier in our study, radio station databases attract a disproportionate number of Extreme Fans and Contest Players. Many in station databases are not in the defined target age groups and don’t even listen to the station.
Our study shows that a very small percentage of listeners in station databases clear the screening criteria for a standard radio music survey. We screened several client stations’ listener databases for potential inclusion in a music test. On average, about 1% of database members made it through a standard music test screener. These website database members were offered a cash premium to participate in an independent online music survey. No prizes or sweepstakes were offered. Overall, about 3% of all database members expressed interest. However, when screened for age, gender, ethnicity, and music preference, the number dropped to 1%. And after filtering the list for client station P1’s and P2’s, the qualified percentage dropped even lower.
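To make the funnel above concrete, here is a minimal sketch of the drop-off, assuming a hypothetical database of 10,000 members; the 3% and 1% rates come from the study, while the database size and the 0.5% P1/P2 figure are assumptions for illustration only.

```python
# Hypothetical screening funnel for an online music test recruited
# from a station e-mail database. Only the 3% and 1% rates are from
# the study; the 10,000 size and 0.5% P1/P2 rate are assumed.
database_members = 10_000

interested = int(database_members * 0.03)    # expressed interest: ~3%
screened   = int(database_members * 0.01)    # passed age/gender/ethnicity/music screen: ~1%
p1_p2      = int(database_members * 0.005)   # assumed further P1/P2 filter: <1%

print(f"Expressed interest: {interested}")   # 300
print(f"Passed screener:    {screened}")     # 100
print(f"Qualified P1/P2s:   {p1_p2}")        # 50
```

Even from a sizable database, only a few dozen qualified P1/P2 respondents may survive the full screen.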
In major markets and small markets alike, we’ve seen stations offer database members big prizes and substantial incentives to participate in music surveys. Some of these have attracted several hundred respondents. However, when the stations drilled down to their actual target demographic, the sample became too small to be reliable.
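The reliability problem can be quantified with the standard 95% margin-of-error formula for a proportion. The sample sizes below are hypothetical, not from the study; they simply show how the margin of error balloons when a few hundred respondents shrink to a small in-demo subset.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion p measured on n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical sizes: a full sample of 400 vs. an in-demo subset of 80.
for n in (400, 80):
    print(f"n={n}: ±{margin_of_error(n):.1%}")
# n=400: ±4.9%
# n=80: ±11.0%
```

At 80 respondents the margin of error on any single song score is roughly ±11 points, wide enough to make rankings effectively noise.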
We believe music testing can include radio station database members, but it should not rely on them exclusively. Taking a cue from other industries, online Radio research needs to be more inclusive by casting a wider net. And we need to reward participants for sharing their honest opinions with us.