<h1>Introduction to usability testing for survey research</h1>
<p>October 17, 2011</p>

<p>It’s always fascinating to encounter a profession whose interests overlap with our own in UX. The one I’ve been learning about this year is survey methodology, and I was delighted to find out that survey methodologists are really into UX as well.</p>

<p>The short version of this post: I joined forces with Emily Geisen, a survey methodologist, to teach a workshop on usability testing at the SAPOR conference. The slides are here: Introduction to usability testing for survey research.</p>

<p>The longer version? Keep reading, for:</p>

<ul>
<li>my visits to survey conferences</li>
<li>how survey methodologists run pilot tests</li>
<li>how survey methodologists do cognitive interviews</li>
<li>how survey methodologists are becoming enthusiastic usability testers</li>
<li>our introduction to usability testing at SAPOR</li>
</ul>

<h4>My visits to survey conferences</h4>

<p>This year, I’ve attended three conferences for survey methodologists because I wanted to learn about the most recent research findings on surveys. I’m also fortunate to be working with Kantar, one of the world’s leading market research and insight businesses, helping them to understand and improve the user experience of their surveys; they are very committed to sharing their knowledge and best practices.</p>

<p>I’ve written about two of these events before:</p>

<ul>
<li>the 5th Internet Survey Methodology Workshop, which I attended at Statistics Netherlands in The Hague. We were given a charming and inspirational book about Dutch statistics that I chose as my August Book of the Month</li>
</ul>

<p>This post is about my recent visit to the Southern Association for Public Opinion Research (SAPOR) conference to talk about usability testing.</p>

<h4>Survey methodologists run pilot tests</h4>

<p>Testing is nothing new for survey methodologists. All of the leading textbooks tell us firmly that if you don’t have the resources to do a pilot test (also known as a field test), then you don’t have the resources to do the survey. A pilot test is a full run-through of the survey.</p>

<p>You may ask: so what’s the difference between all of that and the full survey? Answer: not a lot – the only difference is the sample size. Your full survey goes to everyone in your sample; your pilot test goes only to a small sub-sample – perhaps 10 to 50 respondents for a straightforward survey.</p>

<h4>Survey methodologists do cognitive interviews</h4>

<p>From the UX point of view, we can see the value in a pilot test – but isn’t that a bit late? What if the actual questionnaire doesn’t make sense to the respondents? Survey researchers also see that as a problem, and they have been conducting ‘cognitive interviews’ since the 1980s.</p>

<p>Cognitive interviews focus on questions. Typically, an interviewer reads out a question and asks a respondent to think aloud while answering it. If you read this “how to” guide to cognitive interviewing, you’ll find that cognitive interviewers have exactly the same problems persuading respondents to think aloud as we do in usability tests.</p>

<p>When survey methodologists got into cognitive interviewing in the 1980s, most surveys were done by telephone or face-to-face with an interviewer. The respondent didn’t have to read or click anything; all the interaction was between respondent and interviewer. So interaction has never been a core concern in cognitive interviews.</p>

<h4>Survey methodologists are becoming enthusiastic usability testers</h4>

<p>Traditionally, survey methodologists have been quite cautious about online surveys. They are concerned about difficulties such as:</p>

<ul>
<li>Coverage: although most people have Internet access, there are still important populations who do not, or who prefer not to access surveys online.</li>
<li>Sampling frames: there are comprehensive directories of street addresses and of landline telephone numbers, but no easy way of getting hold of a statistically robust sample of email addresses.</li>
<li>Mode effects: a human interviewer asking a question is a different mode to a computer asking the same question. People behave differently when interacting with another person compared to interacting with a computer, and their answers differ as well.</li>
</ul>

<p>Despite these and other challenges, survey methodologists see the obvious advantages of speed, convenience and cost. The workshop I attended in The Hague was the 5th in that series – and as they hold the workshop only every 18 months to two years, that’s quite a long-running series. The Web Survey Methodology Bibliography has plenty of entries dating back to the 1990s.</p>

<p>With online surveys, we have a computer. That means we have human-computer interaction (HCI). And with HCI, we have usability, and usability testing. Many of us in UX have clients who are long-term, enthusiastic adopters of usability testing, and others who seem never to have thought of it. That’s exactly the same in survey methodology: some are very experienced, others are only just getting into it.</p>

<h4>Introduction to usability testing at SAPOR</h4>

<p>SAPOR is the Southern Association for Public Opinion Research, a chapter of AAPOR. The SAPOR annual conference is held in Raleigh, North Carolina, which happens to be where Emily Geisen works at RTI International. (RTI was originally the Research Triangle Institute; the Research Triangle has one of its corners in Raleigh.) She contacted me and suggested that we co-present a workshop on usability testing. It was great fun to work with her, as it was obvious that she has a lot of first-hand experience of usability testing surveys, in addition to her extensive experience as a survey methodologist.</p>

<p>We had an enthusiastic group of attendees, who were researching a fascinating range of topics including: student opinions at universities, household expenditure, the care of prisoners, radio listening, and (clear winner for ‘most unusual topic’) elephant keepers. We had a great time, and I’d like to thank Emily, RTI and SAPOR for making it possible for me to be there.</p>

<p>Which brings me back to the short version of this post: you can find our slides here: Introduction to usability testing for survey research</p>