Three reasons why the responses from your panels may not be what you want


What might turn an honest, happy respondent into a despondent cheat?

I’m a dedicated survey respondent. I have lots of reasons why I tenaciously
try to respond to every survey invitation that I get:

  • I’m collecting examples for my library of screenshots
  • I’m interested in the experience of answering surveys
  • I’m opinionated and enjoy giving people my views
  • I even quite like getting rewarded for my time
  • Sometimes I’m compiling statistics about how many invitations I get, and
    how many I manage to respond to.

But today, even I began to crack. I needed a break from work and thought I’d
tackle a few of the survey invitations lurking in my email in-box. Two of them
were from a panel that I’m a member of; one was from a research institute. I’m
going to start by ~~ranting~~ trying to talk calmly about the panel
experience, and then bring in the research one.

Why do you repeatedly ask me the same questions?

Here’s what I’d like to say to the owner of the panel I’m a member of.

I’m supposed to be a *member* of your panel. Membership implies some sort of belonging, doesn’t it? Shouldn’t there be some sort of continuity in that experience?

But no. Every single survey that I get from you starts by asking me a slew of the same questions. I’m simply getting tired of telling you, again and again:

  • my age
  • where I live
  • my marital status
  • my employment status
  • my household income
  • that I’m a business owner
  • the number of employees in my business
  • that I’m the main decision-maker for my business…
  • and on and on and on.
Why can’t you just remember? Please?

Why don’t you let me answer, occasionally?

One of the surveys today started with this message (I’ve lightly anonymized it):

“Thank you for agreeing to take part in this evaluation of the XXXX website.
Your feedback will help us improve it.”

As it happens, at that point I hadn’t agreed to anything – I’d just clicked
on a survey invitation that made no mention of what type of survey I’d be getting,
or for whom. But we’ll ignore that problem and focus on the cheery idea that
‘my feedback will help to improve a website’. Great! I’m an opinionated person
and I enjoy giving advice about how to improve websites. Actually, I enjoy that
so much that it’s what I do for a living, what I do for my academic activities,
and what I do for relaxation. My lovely, long-suffering husband might complain
that I don’t do much else. I was feeling genuinely rather happy that I’d get
a chance to do it as part of this survey.

But no. Screened out again.

As it happens, this particular panel has screened me out of every single survey
invitation I’ve received this year. I’m at the point where I’m going to screen
them out.

So here’s another thing I’d like to say to them:

If you don’t want my responses, why keep asking me? If you do want my responses,
why not humor me by letting me respond, sometimes?

Why do you pound me into cheating?

Greg Peterson posted on The Survey Geek’s blog about a respondent panel at
the CASRO Online Research Conference in March 2012. His post is called
*The end of “don’t ask don’t tell” in online survey research*. That’s because
some opinion-formers in the market research industry consider that online panels
have been suspect for ages, full of ‘professional respondents’ who are interested
only in the rewards.

The panel was formed of not just any respondents, but actual ‘professional
respondents’: the dedicated people who sign up for multiple panels because
they do enough survey-answering to gather some actual financial reward.
(After two or three years of trying to amass a reward from the panel I’m a
member of, I have yet to gain enough points to be able to redeem them.)

And indeed Greg Peterson’s post starts with some stories that should make us
all wince, such as respondents cheerfully admitting to taking the same survey
more than once.

Despite that, he devotes a big chunk of his post to pointing out that the respondents
do in fact prefer to answer honestly. For example: “These folks take their
survey taking responsibilities seriously. They did surveys at least in part
because they really like giving their opinions”.

I’ve observed that my own survey behavior includes both approaches:

  • Honesty: trying to give my truthful opinion, mostly for the intrinsic reward
    of feeling good about it.
  • Speeding: trying to answer everything as quickly as possible without thinking
    about it, just to get to the end.

Let’s be straight here: “speeding” is also known as “cheating”.

And here’s a thing: speeding isn’t as much fun as you might think. It’s even
more boring than honesty. At least with honesty, you occasionally notice that
they’ve varied the question slightly (although many surveys are so mindlessly
repetitive, that doesn’t happen anything like as often as I’d wish).

Speeding on the highway often only shaves a rather small amount of time, especially
over shorter, more interesting journeys. Speeding in surveys is the same: it doesn’t
save you much time if the survey is shortish, and it’s actually more stressful
than relaxing into honesty. It’s only really worthwhile if the survey is especially
long and remarkably boring.

What I’ve noticed in my own survey behavior is that I always start honest,
and then sometimes resort to speeding when a survey has pounded me into that
bad behavior by offering me a series of questions so boring, irrelevant, or
repetitive that I’ve given up trying to make sense of them.

That’s exactly what happened to me as the luckless respondent to the survey
from the research institute. I was happy at the start because it was delivered
as one page – a very, very long one, but at least I knew it wasn’t going to
screen me out. Result!

Then it got straight into real questions. No series of deeply boring demographic
questions to fight through at the start. Result!

I attacked it positively, but was ground down by a hideously complicated set
of instructions and then a series of rating tasks that I found incomprehensible.
I tried, but it gradually dawned on me that I’d given up and was speeding. It
wasn’t a conscious decision to cheat – it was a behavior I’d been pounded into.

Three reasons why our response may not be what we want it to be

As survey designers, I think we need to look first at what we’re asking respondents
to do before we jump to condemn them as cynical ‘professional respondents’.
Are *we* guilty of any of these bad behaviors?

  • Asking the same things again and again, particularly of panellists who
    are supposed to be ‘members’ with a continuing experience?
  • Failing to let our respondents actually respond?
  • Pounding them into bad behavior?