February 2012 – This article discusses the dirty little secrets of online panel research. (https://www.gplus.com/Consumer-Services/Insight/New-study-raises-serious-questions-about-surveys)

By Ron Kurtz

Most consumer research, including luxury consumer research, is based on online surveys taken by people who have “volunteered” to take lengthy questionnaires frequently. If you buy panel research about luxury and affluent consumers, the new report More Dirty Little Secrets of Online Panel Research is a must-read: it will make you question the validity of that research, because it shows how poorly panel members are often treated.

The author of the report, Ron Sellers of Grey Matter Research, notes of panel members that “Paying them pennies, giving them boring, lengthy, or irrelevant surveys, frustrating them with multiple closed studies, and bombarding them with opportunity after opportunity is most definitely not how you want to treat people upon whom you are depending for your success. And if you or your research vendors are not paying attention, this is exactly what may be happening in your research.”

While the report reviews general consumer panels, those same panels are the source of respondents for surveys of affluent and luxury consumers. It is hard to imagine that panel members treated this way, taking lengthy surveys for token rewards, can be representative of luxury and affluent consumers.

Copies of the report, More Dirty Little Secrets of Online Panel Research, can be downloaded from Grey Matter Research.

The new report is a sequel to a 2009 report in which Grey Matter Research ran an internal test of a few panels the firm had used or was considering as vendors. They arranged for a group of people to sign up for each panel and record their experiences as typical respondents for a month.

The test included 12 major panels, and the results were published in the report Dirty Little Secrets of Online Panel Research. A few panel mergers, plus requests to cover panels that weren’t included the first time, led to the new report, which evaluates Toluna, e-Rewards, Clear Voice, Surveyhead, Opinion Outpost, MySurvey, and six more from the perspective of the typical panel member.

Some of the problem scenarios identified were:

“Yours is the tenth questionnaire in a row that the respondent has completed that morning, and many of the others were long, boring, and irrelevant. The respondent is tired and inattentive.”

The respondent may have attempted 12 different questionnaires before trying yours. One of them asked ten minutes’ worth of questions before telling the respondent they weren’t qualified and tossing them out with no reward.

Another survey froze when the respondent was mostly done. Another told the respondent they were not qualified and kicked them out before they could answer a single question. Two more were billed as “surveys” but were actually attempts to get the respondent to compare car insurance rates.

Five of them were already closed by the time the respondent tried to respond, even though the invitations had all been sent within 24 hours. The respondent was disqualified from two more for not owning a pet, even though their panel profile stated they had no pets, so they should never have been invited to take those surveys in the first place.

The respondent is tired, frustrated, and annoyed, and now they are evaluating a new product concept that you really hope they will like. Just how reliable is your data?

Bottom line: If you buy syndicated research, ask whether it is based on a panel survey and, if so, which company provided the panel. If you are fielding your own survey, choose your panel provider carefully. Or perhaps consider a different survey methodology altogether.