I can remember when—ten years ago!—Caroline first told me that she was going to write this book.
I immediately thought it was a great idea. At the time, I’d never done any surveys myself, since I was such a huge fan of qualitative UX research methods (especially usability tests), as opposed to quantitative methods like surveys.
But even though I’m biased (after all, I did write two books praising usability testing), I like to think that I’m not a jerk about it. I know there’s value in quantitative methods, and I always thought it would be great to be able to do a quick survey and get useful answers to specific questions, like “How many people do x?” and “How many like x better than y?” So I was delighted to think that Caroline was going to make it easy for me by writing the book I wanted to have:
[imagined cover of the how-to-do-surveys book I was hoping for]
But as she started describing it, I was surprised to learn that the book she intended to write (or at least the book she wanted to write) was:
[imagined cover of the book she had in mind: one about talking people out of doing surveys]
I was sure she had to be kidding, and I hastened to point out that the “talking people out of doing a survey” book probably had far less sales potential than the one I was hoping for. But I soon learned she was only half kidding, because when people asked for her help doing a survey, she often ended up feeling that “Don’t do a survey” was the best advice she could give them, for two reasons:
- She’d seen far too many surveys done wrong, and more importantly
- She’d seen far too many that shouldn’t have been done at all.
Fortunately for us, she ended up writing the book that I wanted. But it took her more than ten years of very hard work. Here’s what she did:
- Read a staggering number of survey books and research papers (which we don’t have time to do).
- Absorbed and understood it all (which we probably couldn’t do even if we had the time).
- Attended lots of survey conferences and workshops, and talked to experts and non-experts to learn about their best (and worst) survey practices.
- Combined it all with her own practical experience doing surveys and helping others do them.
- Boiled it down into understandable advice (which she happens to be really good at).
- Made it into a book (which, having written books myself, I strongly advise against, since it’s a ridiculous amount of work, especially if you do it well).
I’m sure that, like me, you’ve been on the receiving end of plenty of bad surveys. Personally, the thing that bothers me most is when I want to answer “Other” or “N/A,” but they don’t let me. For example, how do I answer this question about the place where I’ve gotten my morning coffee every day for years?
[a survey question asking for a rating from 1 to 5, with no “Other” or “N/A” option]
If I answer 1, it makes my friendly baristas look bad, and if I answer 5, it’s a lie. The people who did the survey are forcing me to give them bad data, which means they’re going to get inaccurate results, so why should I spend my time helping them? I always fill it out anyway, and answer 5 because my baristas are very nice people, and I don’t want them to lose any points. But I’m never happy doing it.
In the pages ahead, you’ll find that Caroline talks about how to deal with many, many specific issues, like having an “Other” option. But even more valuable is the clarity she brings to big-picture issues, like what kinds of questions to ask (and how to ask them), what kinds of people to ask (and how many of them), and what to do with the results you get.
Thanks to what I learned from reading several drafts of this book over the years, I’m very comfortable doing what she calls a Light Touch Survey: highly focused, just a few questions, where the results you get will help you make an important decision. In fact, I just did my latest one a month ago to answer a few questions about how people use my website, before finally redesigning it after 20 years.
So if you’ve ever wanted to do a survey—or you want to do better surveys—you’re in the right place.
Just be glad that Caroline did so much of the heavy lifting for us.
—Steve Krug, author of Don’t Make Me Think