Panels and more panels in the Big Easy

Last week I spent the better part of two days at the CASRO Panels Conference down in New Orleans. As the name suggests, the goal was to share the latest research on research aimed at improving the quality of online research. Historically, conferences like this have tended to be dominated by panel companies shamelessly hawking their wares under the guise of research. That’s been changing slowly and the bar is being raised, but we still have a ways to go. On the one hand, I think there is a genuine effort across the industry to solve the most pressing data quality problems. Dyna Boen from MarketTools expressed it best with the catchwords “valid, unique, and engaged.” The idea is to do the things we need to do so that we can be sure panelists are who they say they are, that they can participate in the same survey only once, and that they make a serious and consistent effort to answer our questions.

There are lots of good ways to accomplish the first two of these but still considerable disagreement about the third, “engagement.” And so we heard the now standard pitches for Flash tools in surveys and broader use of color and graphical design, although mostly that’s being pushed by people for whom Flash tools are central to their business model. And mostly these folks seem to be worried about holding on to their panelists, a legitimate concern, rather than the quality of the data being collected. So while it’s admirable that we continue to focus on this, it doesn’t feel like we are making a whole lot of progress.

Another major theme was alternative strategies for building sample. We heard about river sampling, the use of multiple panels, and the use of social networks. We heard unsubstantiated claims that using multiple panels on a study “increases representativeness” and moderates bias. And I probably would have screamed if one more person had described river sampling as “the new RDD.”

Arguably the best moment of the conference was watching Bill Blyth from TNS explain to us all the challenges of doing research in Europe, where there are lots of countries with lower Internet penetration rates coupled with a drift away from landlines to wireless, especially when “representativeness” is taken seriously by clients. It was another reminder of the differences in how European researchers approach our work and the frankly shoddy research that we sometimes peddle here in the US. Bill’s message came down to a warning that as we continue to globalize, the kind of research we have been selling here in the US might be questioned by the European companies with whom we would like to do business.

And so we soldier on.