Category: Web Panels
-
Online samples: Paying attention to the important stuff
Those of you who routinely prowl the MRX blogosphere may have noticed a recent uptick in worries about speeders, fraudulent respondents, and other undesirables in online surveys. None of this is new. These concerns first surfaced over a decade ago, and I admit to being among those working the worry beads. An awful lot has…
-
Representivity ain’t what it used to be
I am on my way back from ESOMAR APAC in Singapore, where I gave a short presentation with the title, “What you need to know about online panels.” Part of the presentation was about the evolution of a set of widely accepted QA practices that, while standard in the US and much of Europe, are sometimes…
-
Online Sampling Again
Last week two posts on the GreenBook Blog, one by Scott Weinberg and a response by Ron Sellers, bemoaned the quality of online research and especially its sampling. And who can blame them? All of us, including me, have been known to go a little Howard Beale on this issue from time to time. We…
-
AAPOR gets it wrong
Unless you’ve been on vacation the last couple of weeks, chances are that you have heard that The New York Times and CBS News have begun using the YouGov online panel in the models they use to forecast US election results, part of a change in their longstanding policy of using only data from probability-based samples…
-
A bad survey or no survey at all?
For a whole lot of reasons that I won’t go into, online privacy is suddenly front and center, not just in the research industry but in the popular press as well. The central message is that people are “concerned,” but about what exactly and by how much, well, the answers there are all over the…
-
Pleeezz!
Today’s update from Research-live.com has this headline: Online trackers not optimised for mobile could 'compromise data quality.' It goes on to explain: GMI, which manages more than 1,000 tracking studies, claims that online trackers that haven’t been optimised for mobile platforms may exclude this growing audience, which could lead to a drop in data quality,…
-
Pew takes a serious look at Google Consumer Surveys
The room is full here at AAPOR, mostly, I suspect, to hear a presentation of Pew’s comparison of the results from a dual-frame (landline plus cell) telephone survey and Google Consumer Surveys. There is no shortage of people I’ve talked to here and elsewhere who think that Pew was overly kind in characterizing…
-
AAPOR gets serious about online sampling
I am at the AAPOR annual conference in Boston. My first observation: it is huge. For example, at 8:00 this morning there are no fewer than eight separate sessions, each with five to six presenters. There is no way you can come close to covering the whole thing. So I have tentatively chosen to focus…
-
Accuracy of US election polls
Nate Silver does a nice job this morning of summarizing the accuracy of, and bias in, the 2012 results of the 23 most prolific polling firms. I’ve copied his table below. Before we look at it, we need to remember that there is more involved in these numbers than different sampling methods. The target population…
-
The latest online panel dust up
Gregg Peterson’s post earlier this week on this blog about the Panel of Panelists at the CASRO Online Conference created quite a stir. I saw an unusual number of pageviews, there was a fair amount of retweeting of the link, and other industry commentators worked a similar theme. It came on the heels of Ron…