Surveys That Work Blog

Posts written by Caroline Jarrett

  • SUS Svensk: System Usability Scale in Swedish

Recently, I wrote about SUS: a good enough usability questionnaire, the scale devised by John Brooke.

    Bengt Göransson commented that he uses it as part of his ‘tool box’, pointing out that it is important to use it alongside other methods to capture efficiency and effectiveness. He has kindly given me permission to publish his Swedish translations here:

    System Usability Scale in Swedish:

    SUS-svensk.pdf

    Instructions for the user about how to fill in the questionnaire:

    SUS-följebrev-svensk.pdf

    Survey book of the month, March 2011

    Surveys are all about getting numeric data, and then reporting on it. What points do you want to make with those numbers? I’ll assume that you are aiming to be honest, respecting your data and your audience.

    One fun way of learning about the tricks that people play with data, or perhaps the mistakes that they make inadvertently, is to read my March pick for survey book of the month:

Darrell Huff (1954) “How to Lie with Statistics”

    Published in the US by W. W. Norton & Company with illustrations by Irving Geis; in the UK by Penguin with illustrations by Calman.

Huff aims to teach us “how to look a phoney statistic in the eye and face it down; and no less important, how to recognize sound and usable data in [the] wilderness of fraud”.

    (more…)

    Survey book of the month, February 2011

This month, I’ve been working on my chapter on survey questions and revisiting my library of books about questions. My favorite, easily, is:

    “The Psychology of Survey Response” by Roger Tourangeau, Lance J. Rips and Kenneth Rasinski (2000) Cambridge University Press.

    Let me take you through their main framework and some of my recent experiences with the book, and I hope that will explain why I like it so much.

    (more…)

    2010 in 46 surveys: too many

    I take a lot of screenshots: web sites in general, forms specifically, and of course surveys. Those specific to a project go into particular folders; the rest are filed by year.

Today I was hunting through for inspiration for the chapter I’m working on, and got tired of flicking past non-survey things. So I moved all the survey ones into a separate folder, and then thought: well, how many surveys are there anyway?

Now, obviously I do a lot of surveys for professional reasons. I’m a member of one of the internet panels, just to see what they’re like. I’m working with a market research client on the user experience of their surveys. I’ve been creating a few examples of my own, and my husband has been investigating what you can get for free in a few survey tools. And if a friendly person asks me to advise informally on a survey for a volunteer or professional group, chances are that I will. So I discounted all of those.

    That left 46 surveys that I actually completed in 2010.

    Nearly one a week.

    I’m sure I’m far more willing than most people to have a go at a survey, because I’m collecting materials for this book. If I possibly can, I’ll do them, but even then I’d say that I’m only able to do about a third of those that I’m asked to do. Just too busy.

    Can it really be true that I’m getting around 3 survey invitations each week?
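Here’s the back-of-the-envelope arithmetic behind that guess, sketched as a few lines of Python; the only assumption is taking my “about a third” figure literally.

    completed = 46            # surveys I actually completed in 2010
    completion_rate = 1 / 3   # rough share of invitations I manage to complete
    invitations = completed / completion_rate   # roughly 138 invitations over the year
    print(invitations / 52)                     # roughly 2.7 a week, call it 3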

    During March, I’ll try to record every invitation as well as the completions.

    But maybe there’s a clue here about the general decline in response rates: there are just way too many of them.

    Surveys in the news: Valentine’s day

It was a familiar type of email, and one that I’d usually just delete, but in the interests of this book I opened it. “Valentine’s Day Romance Survey Results” from Fresh Flowers and Gifts in Australia. The same material is repeated on their web site, but I assume that it’s a seasonal promotion, so here are the key points. I’m guessing that the panel referred to in the survey consisted of a couple of people in the Fresh Flowers and Gifts’ marketing office. In other words, they made it up. No worries, the survey was just for fun and that comes across pretty clearly in the results.

    Read on and enjoy – and then I’ll discuss some more scientific surveys.

    (more…)

    SUS: a good enough usability questionnaire

    One challenge of survey design is whether to:

    • use an existing questionnaire, or
• roll your own, or
    • do some sort of hybrid.

    One of the best-known usability questionnaires is SUS. Is it good enough?
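If you haven’t met SUS before: it’s ten statements, each answered on a five-point scale from “strongly disagree” to “strongly agree”, and the answers are converted into a single score from 0 to 100. Here is a minimal sketch of the standard scoring rule in Python; the function name and the example responses are mine, purely for illustration.

    def sus_score(responses):
        # Standard SUS scoring: odd-numbered items contribute (response - 1),
        # even-numbered items contribute (5 - response); the sum of the ten
        # contributions is multiplied by 2.5 to give a score out of 100.
        if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
            raise ValueError("SUS expects ten responses, each from 1 to 5")
        total = 0
        for item, r in enumerate(responses, start=1):
            total += (r - 1) if item % 2 == 1 else (5 - r)
        return total * 2.5

    print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # one respondent's answers: 85.0

The attraction is that you get a single number per respondent; the catch, as Bengt Göransson points out above, is that the number on its own tells you nothing about efficiency or effectiveness.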

    I’m going to start by mentioning the advantages and disadvantages of reusing questionnaires, and then talk about SUS in more detail.
    (more…)

    Survey book of the month, January 2011

You’re looking for a book on surveys, and mine isn’t out yet. What to do?

    I thought I’d help by sharing some of my favorites over the next few months. These will be the titles that I find myself turning to again and again, whether to check a point I’m writing about or when I’m designing a survey myself.

    Top of my list has got to be “Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method” by Don A. Dillman, Jolene D. Smyth and Leah Melani Christian (Wiley, 2008)

    A book with practical insights backed by research

    Why do I like it so much? Well, it certainly isn’t a visual feast. The cover isn’t all that inspiring, and here’s a sample page of content – deliberately made small so you can get an inkling of what the design is like.

    dillman-sample.jpg

    Yes, that’s a solid block of text there, and it’s fairly representative.

    The value in this book is all in what it says. Don Dillman has been researching surveys for 50 years. He’s the author or co-author of over 240 papers, and he’s co-written 9 other books in addition to this book and its three previous incarnations under slightly different titles. All this might lead you to suspect that the book is a dry, academic tome that will be hard work. You’d be wrong.

    Yes, it’s impeccably referenced throughout. But it’s also practical. This is a team that believes strongly in testing everything, varying everything, and testing again – in one-on-one usability tests, A/B tests and experiments. I opened my copy at random just now and here’s a typical passage:

    “Often web survey designers and programmers are on the cutting edge of computer technology, have the most recent versions of web browsers, and are adept at customising their own settings. However, it is important to design and test the web survey from the respondent perspective; the respondent may not be as computer savvy or familiar with being online. In one of our recent student surveys, we decided to send a tester out to a number of different computer labs at our university because many respondents would be completing the survey on campus. In the process of testing, it was discovered that two labs on campus had not updated their browsers in 5 years. Although many students were aware of that and avoided using those labs, others did not know and may have completed the survey on those computers with the outdated browsers”.

I’m sure it’s not news to you that there is often a gap between the technology your developers use and the technology your users actually have. But how many academic researchers do you know who would be willing to go from lab to lab checking something like that? How many people, when they start to design a survey, ask themselves “How many of our respondents are likely to be on old browsers?” before you raise that point with them?

    You may be thinking “OK, but what will I learn in that book that I don’t already know?”

    My answer: lots. To show you what I mean, here’s another picture. When I got my copy of this latest edition (pre-ordered, based on avidly reading and re-reading the two previous editions), I went through it marking only the most crucial points that I knew I would definitely need to think about for my book on surveys. Then I went back and challenged myself: is this point absolutely essential for the busy user experience practitioner? And here’s the photo of the results of that two-step process, showing a host of markers.

    dillman.jpg

    What you won’t find in the book

    If you’re still with me, you probably have another question: “If this book is that good, why are you writing another one?”

    Even though the book is over 500 pages long, it doesn’t cover the entire survey process. Dillman and his co-authors assume that:

    • you have a clear set of goals for the research
    • you have the resources to do a proper random sample, and a multi-stage administration process
    • you know how to analyze the data.

So there are some aspects missing that we probably ought to think about – and also, we don’t all have time to read that much. So I’m still working on my (much shorter) book.
     

    Writing and looking for stories

    So, what’s been happening with the surveys book for the last three months? A lot!

    • Wrote the first chunk of the book, on rating scales (sneak preview: I come out in favour of 5 points).
    • Went to Australia for a holiday and the OzCHI conference. Took the opportunity to do a bunch of interviews with Australian UX people on how they use surveys. Lots of ideas.
    • Taught a revised version of my surveys tutorial at OzCHI. Lots more ideas, and a new outline for the book.
    • Came back, went to Lausanne for a day and got caught in the UK snow in December. Got to know every inch of Geneva airport and practice my British skills in queuing.
    • Enjoyed Christmas.
    • Created a revised outline for the book based on all those ideas from Australia.
    • Didn’t enjoy the New Year at all due to a most annoying cold that turned into conjunctivitis.

    Now I’m better and catching up with my New Year resolution to post more on this blog. And I know that Lou is watching me… so here is the start.

    A month of gathering ideas

    September 2010 flew by:

    • the excitement of announcing this book;
    • off to HCI2010 at the University of Abertay in Dundee, where I led a workshop on survey design
• back to England and right down to Cornwall for a few days’ holiday (I know we’re a small island here, but that’s more than 500 miles)
    • a quick visit to CX Partners in Bristol for a fascinating informal chat about surveys
    • analysing the data and writing the first draft of a report on a survey about how different groups use a camp site for people with physical disabilities
    • back home – where, amazingly, the ‘work while you’re away’ fairies had failed to deal with my in-tray
    • a great telephone meeting with my development editor
    • and a visit to the UK UPA evening about books, where I picked up some tips about writing and some contact details from UX professionals who are willing to talk with me about surveys.

    I’ve also had some great email conversations offline, especially with Inge De Bleecker.

    Lots of ideas buzzing around in my head, but not a second to tell you about it here and get your opinions. Frustrating!

    But please keep those ideas coming, either here or directly to me. I’ll start discussing them here soon, I promise.