“Choose a single question and get it in front of some users” – that’s the advice I gave myself in my aims for this book.
Evernote recently sent me an email that followed that advice. The single question was:
“How would you rate the support you received?”
And it’s made me revise my advice. A single question isn’t quite good enough. It’s got to be a good question – one that users:
- can understand,
- have an answer for,
- want to answer, and
- can provide their chosen answer.
My problem with Evernote’s question? Let’s run through those steps:
- Yes, I could understand it.
- Yes, I had an answer for them: “not bad, but I’m grumpy because a feature that I paid you for turns out not to exist and it took a series of emails to discover that”.
- Yes, I wanted to tell them that.
- But no: the email failed at the final step, a place for me to provide my answer.
It offered me two choices: “Bad, I’m unsatisfied” and “Good, I’m satisfied”. Neither works. The support agent wasn’t good: it took him several emails to grasp that I hadn’t realised the feature didn’t exist. But he wasn’t bad either: we did get there in the end. And I’m both unsatisfied with the lack of the feature, and satisfied with other aspects of Evernote.
(Although I’m not exactly thrilled that they addressed me by my name in lower case, with a three-digit number of their choice added to the end. That seems disrespectful.)
So sorry, Evernote, no click from me. But thanks for the opportunity to think through one aspect of my book.
Where does a form end and a survey begin? That was my challenge when I was invited to lead a half-day workshop on forms design at the General Online Research conference 2014 in Cologne, Germany.
The group included survey methodologists from government, health research, and market research, as well as independent consultants.
We had a lively discussion as we shared examples of:
- forms that introduce surveys,
- aspects of questionnaires that are similar to forms, and
- forms that survey methodologists use to keep track of survey data.
<iframe src="http://www.slideshare.net/slideshow/embed_code/31951467?rel=0" width="427" height="356" frameborder="0" marginwidth="0" marginheight="0" scrolling="no" style="border:1px solid #CCC; border-width:1px 1px 0; margin-bottom:5px; max-width: 100%;" allowfullscreen></iframe>
Are you a bit sceptical about surveys?
I was too – but researching this book has changed my mind.
If a survey is going to happen anyway, we need to make sure it’s a good one. So the first part of this talk has tips for better questions.
If you’re doing a survey from start to finish, the second part of this talk has tips for a better survey process.
What might turn an honest, happy respondent into a despondent cheat?
Thanks to everyone who attended my Virtual Seminar “Designing effective surveys” on 28th February 2012.
If you missed it, you can:
- purchase the recording from User Interface Engineering
- view the slides on SlideShare (or at the end of this post)
Resources that I mention in the seminar
A couple of days ago, I mentioned that I’d finally worked out a survey process: how to do a survey in 6 steps. It’s been evolving in a series of workshops, and the most recent one was at the J.Boye Conference 2011 in Aarhus, Denmark.
Many of us will be doing a lot of gift-giving this month, so I’ve chosen a book that would be just right for someone who enjoys something thought-provoking and a bit out of the usual:
Through the language glass: why the world looks different in other languages by Guy Deutscher (2010; paperback 2011)
One for fun – or to provoke some thought
My recent picks have been worthwhile, but not the easiest to read. This one is more fun, but it also got me thinking.
Question: What’s the difference between a questionnaire and a survey?
Answer: A questionnaire is a series of questions and answers on a topic; a survey is the overall process of obtaining useful information using a questionnaire.
Question: OK then, what are the steps in the process?
Answer: Until recently, I was stumped on that one, but after a lot of help from others and some thinking, I have an answer for you…
Which is better: an open question or a closed one? Should you include a “don’t know” option in your closed questions? Is there a “right” order for asking questions?
If topics like these concern you, then you’ll want to read my choice for this month:
Questions and Answers in Attitude Surveys: Experiments on Question Form, Wording, and Context by Howard Schuman and Stanley Presser (1981; reprinted 1996)
It’s always fascinating to encounter a profession whose interests overlap with our own in UX. The one I’ve been learning about this year is survey methodology, and I was delighted to find that its practitioners are really into UX as well.
The short version of this post: I joined forces with Emily Geisen, a survey methodologist, to teach a workshop on usability testing at the SAPOR conference. The slides are here: Introduction to usability testing for survey research.
The longer version? Keep reading, for:
- My visits to survey methodology conferences
- Survey methodologists run pilot tests
- Survey methodologists do cognitive interviewing
- Introduction to usability testing at SAPOR