Sample Chapter: Duly Noted

This is a sample chapter from Jorge Arango’s book Duly Noted: Extend Your Mind Through Connected Notes. 2024, Rosenfeld Media.

Chapter 1

Notes Are for Thinking

When I was a boy, the beginning of school was one of my favorite times of the year. One day always stood out: when my mom took me to buy stationery. I loved getting new pens, pencils, and notebooks. A fresh notebook held the promise of clarity, order, and better grades. I mostly used Mead Trapper Keepers, a popular brand of loose-leaf binders. They represent how I managed notes as a kid: I’d write down what I heard during class and stash pages in that subject’s section, in chronological order. While studying, I’d revisit those notes. Occasionally, I’d discard old ones to make room.

By the end of the school year, I had a binder full of transcripts. They’d served their purpose, so I could toss them. Next year would bring new teachers, new classes, and new notebooks. I seldom revisited old notes. This basic approach was the start of my note-taking life. I’ve since learned that notes can be more than a means for capture and recall: they’re also a medium for thinking.

What Are Notes For?

My Mac’s dictionary defines a note as “a brief record of facts, topics, or thoughts, written down as an aid to memory.” But as with other common words, “note” has more than one meaning. We also speak of some financial instruments as notes. And, of course, notes are also the stuff of musical melodies. But in this book, we mean the first usage: brief written records that aid our minds.

Not everything you write down is a note. For one thing, as the definition says, notes tend to be short. Think sticky notes, not essays. Intent also matters: you make notes primarily to aid your thinking. Sometimes you write notes for others, but most often you do so for yourself. Some notes you “dash off,” while others you ponder. Most aren’t meant for publication; I’ve made many notes while writing this book, but writing the book’s text is different from note-taking. All notes augment your mind in different ways.

Remembering

Remembering might be the most common reason to take notes: you hear or see something you want to recall later. This is why, when you call a company’s help desk, the agent suggests you have a pen and paper at hand. It’s good advice: such calls yield case numbers, dates, and other details that you’ll forget quickly if you don’t write them down.

Transcribing

A common reason for taking notes is to recall what you heard during a lecture or video. For example, when attending a presentation, you may type into your laptop or scribble in your notebook. Doing so has a dual benefit: it helps you pay attention and produces a text that reminds you of what the speaker said.

Recording

Some professionals, such as research scientists and medical doctors, benefit from keeping records of their work. This is a kind of remembering, but a bit more formal. It’s worth examining separately since such notes also provide legally admissible evidence when outcomes are contested. People in professions that require it take great care with their notes.

Learning

Sometimes you write things down not because you’re trying to remember a particular detail but because you’re trying to learn about a subject. Learning entails more than just remembering facts. For one thing, you must connect ideas at different levels of abstraction. For another, learning often happens in sessions spread over several days, weeks, or months, as in a class. Much of the note-taking discussed in this book focuses on learning.

Researching

When researching a subject, you want to recall the salient facts. And if you’re interviewing someone, you want to keep track of the most important things they said. In either case, you’re ultimately looking to synthesize what you learn so you can make better decisions. Notes aid the process.

Generating

Sometimes you take notes not to remember or learn something, but to generate new ideas. This is one of the most exciting uses of notes: your notebook becomes a collaborator in the thinking process. Putting thoughts down on paper (or on the screen) gives you fodder for reflection, leading to other ideas that you also capture.

You’ve experienced this when brainstorming using sticky notes on a whiteboard. Seeing notes on the board suggests other ideas. You move them around to form clusters, suggesting further ideas. A virtuous process follows.

Planning

Many people live by their agendas and bullet journals. When you have many things to do or track—as is often the case when managing long and complex projects—it helps to write things down. Sitting down with a calendar and a sheet of paper will help you plan more effectively than if you had to keep everything in your head.

Imagine you’re going on a trip, so you make a checklist of items to pack. Seeing items on the list will remind you of things you may have missed at first. You may also consider the priority of items on the list. (For example, your passport should probably be first.) Visualizing items and the relationships between them helps you prepare for the trip.

Communicating

Although you take most notes for your own sake, you also leave some for others. For example, I sometimes find food containers in our refrigerator with a sticky note that says, “Papa, don’t eat!” My kids know that, without this note, their snack might soon be gone. This fits the definition of “brief record,” even if it’s not meant for “recall.” These notes turn your surroundings into shared cognitive environments.

Fidgeting

While writing this book, I asked people on Twitter why they take notes. Bastiaan van Rooden memorably replied, “To slow down the monkey in my head.” I can relate: many people pay better attention when their hands are busy. In this case, the primary benefit of scribbling things down is keeping your attention focused; the marks on paper are a nice secondary benefit.

Note-Taking Media

Not only are there many reasons for taking notes, but there are also just as many different ways to do so. You can doodle with a pencil in a notebook, write with a marker on a sticky note, type into an app on your phone, draw with chalk on a sidewalk, or tie a string around your finger. In a pinch, you may even write on your skin.

While walking around Tahoe City, CA, my daughter saw a store she wanted to return to. Lacking paper, she wrote down its name on her hand.

Which is to say, you can make notes with whatever is handy—what matters is catching and preserving fleeting thoughts and observations. That said, it helps to be intentional: different note-taking media are suited for different needs.

Pen and Paper

There are good reasons why paper-based notes remain popular. With a bit of care, paper lasts a long time. Paper requires no batteries, and you don’t need a special app or device to read your notes; the paper itself is the medium. Paper is also portable and fast.

But paper also has its downsides. Copying paper-based notes requires specialized equipment (e.g., a photocopier) and lots of time. While notebooks are portable, large paper-based repositories (e.g., collections of notebooks) aren’t. You can’t search paper or link notes easily to each other. And with bound notebooks, you can only view notes in the order they were written. (Unless they’re indexed, which also takes time.)

Here is part of my collection of paper notebooks, spanning two decades. The only way I can find stuff in most of these books is if I know the date when I wrote the note.

Index Cards

Index cards are a convenient way around the constraints of notebooks. Since cards aren’t bound together, you can easily re-sort them. They’re ideally sized for note-taking: smaller than regular sheets of paper, but large enough to capture a single idea in some detail. And because they use thick stock, they stand up to manipulation.

You can use boxes to keep cards organized. When archived carefully, index cards provide one of the advantages of digital notes: random access. That is, you can jump directly to the note you need without having to flip through the rest. They don’t need to be stored in the order they were written; you can archive them alphabetically or in any other organizational scheme.

Because of this flexibility, index cards are a popular thinking medium for researchers and authors who keep and refer to lots of notes. Ryan Holiday, author of several popular books, says that his index card–based note-taking system:

has totally transformed my process and drastically increased my creative output. It’s responsible for helping me publish three books in three years (along with other books I’ve had the privilege of contributing to), write countless articles published in newspapers and websites, send out my reading recommendations every month, and make all sorts of other work and personal successes possible.

Holiday’s system consists of individual index cards with a single thought or quotation on each one. He writes a category label in the top-right corner of each card and stores these cards in one big box. But when working on a specific project, such as a book, he uses a smaller dedicated box for the project. Holiday learned this approach from his mentor, the author Robert Greene. Other authors, such as Vladimir Nabokov, also used index cards to organize their work.

Marginalia

Underlining key sentences and writing ideas on book pages is a common way of taking notes while reading. The obvious advantage is that note-taking happens in context: you capture ideas near (or on) the texts that sparked them, so they’re easier to understand later. But this is also their main downside: since they don’t stand on their own, these notes are harder to reorganize or relate to other notes.

E-books have an edge here. Digital marginalia can be more easily referenced, searched, backed up, and synced. But some people like to mark up physical books with a pen or highlighter. Many of the ideas in Holiday’s index cards come from his reading: he annotates books and articles as he reads them, marking passages that stand out and writing thoughts in the margins as he goes along.

Sticky Notes

Personally, I don’t like writing in books—but I still want the advantages of taking notes in context. Sticky notes provide a way around this dilemma: I read with a pad of small stickies and a pencil. Whenever I find an idea or passage that resonates, I write a few words on a sticky note and stick it to the page’s margin so that it protrudes from the book. In this way, it doubles as a bookmark.

Of course, sticky notes are helpful for more than annotating books. They’re also a mainstay of workshops, design studios, and other situations that require groups of people to think together. Using sticky notes, it’s easy to turn walls, whiteboards, windows, tabletops, and other ordinary surfaces into temporary placeholders for ideas. (More on this in Chapter 10, where we’ll discuss collaborative note-taking.)

Sticky notes’ main advantage is that they can be attached and reattached nearly anywhere on a smooth surface. Because of this, they’re ideal for exploring relationships between ideas. You can paste notes in any sequence and reorganize them later. That said, larger sticky notes aren’t suitable for storing ideas long-term; it’s impractical to keep walls and whiteboards covered in notes. (Of course, this doesn’t apply to the small sticky notes used to annotate books.)

Photographs

One way to get around sticky notes’ ephemeral nature is to take pictures of the wall or board before taking down the stickies. While photos aren’t strictly about making marks, they can be an effective way of capturing ideas and observations. For example, whenever I park in a large, unfamiliar parking lot, I take a photograph of a nearby landmark so that I can find my car later.

I took this photograph in the parking lot of Disney’s Animal Kingdom theme park. Note I didn’t bother with proper framing; I didn’t expect to use this photograph for anything other than finding where I’d parked.

People used cameras as note-taking devices well before personal computers arrived on the scene. Around the mid-twentieth century, film-based cameras got small, fast, convenient, and inexpensive enough to serve effectively in this role. In a presentation to shareholders, Polaroid founder Edwin Land described the original (1944) concept for instant photography as

a kind of photography that would become part of the human being, an adjunct to your memory; something that was always with you, so that when you looked at something, you could, in effect, press a button and have a record of it in its accuracy, its intricacy, its beauty—have that forever.

The main advantage of photographing things you want to remember is convenience. You likely have a phone with an excellent camera in your pocket; no note-taking method is faster than taking it out, pointing it at something, and shooting. The obvious downside is that you’re limited to capturing what you see in the world—great for remembering where you parked, but less so for recalling abstract ideas.

Audio and Video

One way to remember what you were thinking is to record yourself saying it. Recording equipment used to be fiddly, bulky, and expensive, but smartphones have made recording ubiquitous. Software can transcribe your recordings so you can read and search for what you said.

Recordings are most effective at capturing what somebody says with high fidelity. While you may lose some thoughts when handwriting or typing notes, a recording will capture everything verbatim. This is also its downside: recordings don’t benefit from the real-time synthesis you do when using a slower medium. Still, some people love to record themselves “speaking their minds.”

Digital Notes

Most of the note-taking media we’ve highlighted so far existed in some form before computers. But now, we’ll focus specifically on digital note-taking. Almost five billion people have a smartphone, and many also use laptop or desktop computers. Most of these devices include note-taking apps, and people use them to write down all sorts of things, ranging from shopping lists to book notes.

Many digital note-taking applications mimic the abilities and superficial characteristics of analog (i.e., “real-world”) note-taking media. For example, Macs include an application called Stickies that lets you place sticky notes on your computer desktop. Well, not really: Stickies places a series of pixels that look like sticky notes on another series of pixels that function like a “desktop.”

WORKING NOTE: Install Obsidian

In the late 2010s and early 2020s, new digital note-taking tools appeared that brought to market capabilities previously available only via specialized software and often in research contexts. Several are worth exploring and investing in. But to make the ideas in this book more tangible, we’ll focus on one such tool: Obsidian.

Obsidian is a commercial software application created by a small team. As of this writing, it’s free for personal use and available on all major desktop and mobile computing platforms. It embodies key principles we’ll explore in this book, so it’s a great tool for learning. That said, you could use other tools to implement these practices. We’ll just focus on Obsidian for illustration.

(If you’re already using Obsidian or a comparable note-taking app, feel free to skip the rest of this section.)

To start, download and install Obsidian on your computer or mobile device. If the former, you can install it by visiting https://obsidian.md and following the instructions. If the latter, you can search for Obsidian in your device’s app store and install it from there.

Obsidian’s first screen gives you several options. If you’re using the software for the first time, you can either select Quick Start or Create a new vault. (The other options are mostly for use by existing Obsidian users.)

When you first launch Obsidian, you’ll have the choice to create a new vault. In Obsidian, a vault is where you store notes. You can create and manage as many vaults as you want. For example, I currently keep two vaults: one for managing projects and another that serves as my primary long-term knowledge repository. To use Obsidian, you must create at least one vault.

Under the hood, a vault is simply a folder on your computer containing plain text files. So, if you decide to move on, you can still access your data in a universally compatible format. Go ahead and create your first vault and look around Obsidian’s user interface. In the next chapter, you’ll create your first note. But for now, just become familiar with the software.
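
If you’re comfortable with a little code, here’s a minimal sketch that makes this concrete. It’s a hypothetical Python example, not part of Obsidian, and it assumes a vault folder at ~/Notes (substitute the folder you actually created). It lists the Markdown files in the vault and writes a new one, which is all that creating a note really amounts to on disk.

    # A minimal sketch (not part of Obsidian) showing that a vault is just
    # ordinary Markdown files in a folder. The vault path below is a
    # hypothetical example; point it at the folder you created.
    from pathlib import Path

    vault = Path.home() / "Notes"  # hypothetical vault location

    # List every note in the vault; each note is a plain .md text file.
    for note in sorted(vault.glob("**/*.md")):
        print(note.relative_to(vault))

    # Writing a Markdown file into the folder creates a new note, which
    # Obsidian will pick up when it next reads the vault.
    (vault / "Hello.md").write_text("# Hello\n\nMy first note.\n", encoding="utf-8")

Nothing in this sketch depends on Obsidian, which is the point: your notes remain readable and editable with any text editor.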

After you create your first vault, Obsidian will open without any note selected. This is understandable since you haven’t created anything yet.

Obsidian’s user interface has a panel on the left that lists your available notes and a panel on the right that shows the currently open note(s). The parenthetical plural is because Obsidian lets you open several notes simultaneously, which you can view either in tabs or side-by-side.

Think of a typical war movie scene: a ragtag platoon planning an attack. The soldiers huddle close to the ground and, absent paper and pens, draw (literal) lines in the sand and move rocks and twigs to represent targets and units. The commander might say something like, “Hutch, you and Charlie stand here and give cover while Mack and I rush the compound.” One soldier might ask a question, and another will contribute an idea—all facilitated by a few rocks and dirt.

The improvised map gets the platoon “on the same page,” so to speak. By representing the battlefield as tangible things they can manipulate, they can better think through and communicate their plans, spotting obstacles and opportunities they might miss if they were trying to imagine the situation in their heads. You may have had similar experiences when working with colleagues around a whiteboard.

Once you understand that you think with things, you can explore ways to augment your thinking. The battlefield map and the whiteboard are examples of augmentations that are useful when collaborating with others. Notes are a similar augmentation. As with the whiteboard, you can use them to think collaboratively, but they’re also very useful when thinking by yourself.

In Genius, his biography of Richard Feynman, James Gleick writes about the role of notes in the physicist’s work. Starting from an early age, Feynman worked out problems in his notebooks. Later in life, in an interview with MIT historian Charles Weiner, he explained the role of his notes. Gleick writes,

He began dating his scientific notes as he worked, something he had never done before. Weiner once remarked casually that his new parton notes represented “a record of the day-to-day work,” and Feynman reacted sharply. “I actually did the work on the paper,” he said. “Well,” Weiner said, “the work was done in your head, but the record of it is still here.” “No, it’s not a record, not really. It’s working. You have to work on paper, and this is the paper. Okay?”

To emphasize the point, notes aren’t merely a way to record your thinking; they’re part of where thinking happens. They are the means through which you understand and make sense of things. When making notes, you’re thinking on the page and beyond, experimenting with temporary models that describe how a part of the world might work. It’s a creative, generative act of discovery and clarification.

NOTABLE NOTE-TAKER: Gretchen Anderson

Gretchen Anderson is a product consultant, coach, and author based in the San Francisco Bay Area. While working on her book Mastering Collaboration, Gretchen built an outline using her computer. She used this outline to think through the high-level ideas in the book. However, the outline became constraining as she got into the details. “I started to lose track of it,” she explained. So, she switched to using physical sticky notes:

I was doing this at home, where I don’t have a whiteboard…But I do have lots of windows, so at one point, I busted out the sticky notes and had the medium-sized ones and one color for chapters and smaller different colored ones for main points and the stories that would buttress them so that I could create that kind of map that I could see all at one time. Interestingly, that happened late in the process, maybe three-quarters of the way through. You know, I was kicking myself like, “Gretchen, you know that you could have done this earlier!”…I probably couldn’t have done it any other way. I started out with an outline, I changed that outline to be something that was looser so that I could fit everything I was learning into it, and then I needed to kind of remix it again to make it something that people could follow and not just have it be a laundry list of stuff I learned.

By commandeering her walls and windows, Gretchen literally expanded her thinking surface. Moving from an outline to a two-dimensional map of ideas allowed her to see everything at once and “remix” it into a sequence that her readers could follow. Note that she used sticky notes of different colors and sizes, which allowed her to distinguish different structural elements at a glance.

Outlines are great for exploring hierarchical relationships, but not as effective when you want to visualize lots of stuff at the same time. Switching how you’re taking down ideas is a common way to get unstuck in complex creative projects. Different media have different capabilities and constraints: it’s important to be aware that you might have to switch at some point depending on what you’re doing.

As you see, there are many ways of taking notes and many reasons for doing so. But ultimately, you do it to extend your cognitive abilities. Thinking clearly is fundamental to everything you do, so mastering notes will help you in many aspects of your life.

As I mentioned in the introduction, this book focuses on digital note-taking. We’ll look beyond comfortable metaphors to new means of exploring ideas that are only feasible with computers. You’re not building a better Trapper Keeper, but something entirely different and more exciting. But I’m getting ahead of myself. Before you start building your note-taking system, you need to cover a few fundamentals.


Sample Chapter: Interviewing Users (2nd edition)

This is a sample chapter from Steve Portigal’s book Interviewing Users: How to Uncover Compelling Insights (2nd edition). 2023, Rosenfeld Media.

Chapter 1

Interviewing Addresses a Business Need

A few years back, I worked with a company that had the notion to turn a commodity safety product—the hard hat—into a premium product. They would incorporate advanced features and then charge a higher price point. I don’t actually know where their idea came from, but one can imagine that they had seen all kinds of everyday products be reformulated to generate a higher scale of profit (think about Starbucks, gourmet salt, smartphones, Vitamix blenders, or horsehair mattresses). They sketched out a set of features that would improve the functional performance of the hard hat.

When I interviewed people who wore hard hats for work, I didn’t ask them to evaluate the features my client had been considering. Instead, I asked them generally about their work, so I was able to uncover insight into the most significant aspects of their experience. What they were concerned about fell into an entirely different category. They talked about leaving the job site to get lunch (for example) and how awkward they felt among other people while dressed in their prominent, brightly colored safety equipment. Indeed, makers of other safety equipment like bicycling helmets, safety footwear, and safety goggles had already redesigned their products to echo fashionable caps, boots, and sunglasses, suggesting this concern was being felt broadly.

If there were to be a TEDx version of this story, then this team would have become very excited about this new and surprising area of opportunity, despite it being different from what they had already invested in (financially, intellectually, and even emotionally). They’d have torn up those plans, drawn up new ones, and eventually raked in the dough. But you know that isn’t really how these things play out! In these interviews, we uncovered a significant business risk in pursuing their existing idea, so they stopped product development for their hard hat with extra functionality. On the other hand, these interviews identified another opportunity: to produce a hard hat that would address the issue of social performance. That wouldn’t have fit with their organization’s technical or cultural competencies, so they chose to avoid the business risk of developing a fashionable hard hat. What we learned from these interviews informed their decision not to bring any product to market.

When you get down to it, that’s what we do as user researchers: We gather information about users in order to inform critical decisions about design, product, or other parts of the business or organization. To do this means that we go to people’s homes, their offices, wherever their context is. We ask what they do. We ask them to show us. We get stories and long answers where we don’t always know what the point is. We want them to explain everything about their world to us. People may not have a ready answer as to why they do something, but we have to listen for why. We have to ask follow-up questions and probe and infer to try to understand, for ourselves, just why something is happening the way it is. We make sense of this disparate information and show the way to act on what we’ve learned.

Interviewing is a specific method in user research to accomplish these goals. (User research is also referred to by other terms such as design research, user experience research, or UXR.) This book is about interviewing users (also referred to variously as site visits, contextual research, or ethnographic research) as a method to conduct user research, so beyond an in-depth examination of best practices for interviewing users, we’ll also consider user research in general. And we’ll also look at other user research methods that can be integrated and combined with interviews.

Nomenclature aside, the broad outline for interviewing users is:

  • Thoughtfully planning out objectives, who we’ll interview, and how we’ll go about it
  • Deeply studying people, ideally in their context
  • Exploring not only their behaviors, but also the meaning behind those behaviors
  • Making sense of the data using inference, interpretation, analysis, and synthesis
  • Using those insights to point toward a design, service, product, or other solution

Learning About Users to Inform Decisions

Typically, when you interview people, you visit your users in their homes, their offices, their cars, their parks, and so on. But this isn’t always the case. When planning a project, ask yourself if it’s more insightful to bring participants in to see your stuff (say, prototypes you’ve set up in a facility meeting room) than it is for you to go out and see their stuff. Overall, your objective is to learn something profoundly new. (There are situations where quickly obtained, albeit shallow, information is beneficial, but that’s not what we’ll focus on here.)

Note: Every organization can benefit from research

Sometimes, companies declare that they don’t need to do user research. What they typically mean is that they don’t need to do generative user research (learning about people in order to identify product opportunities), but they are probably doing evaluative user research (testing the thing they are developing to make sure it’s usable by people). Denying the value of generative research (because, as they might say, people don’t know what they want and it’s the company’s mission to invent that anyway) belies a poor understanding of how user research is conducted and applied. For one thing, it’s not simply asking people “what they want.”

For another, it’s not credible that they possess an innate talent for building stuff that people love. Even if they themselves are users of the snowboards, photography equipment, or mixing gear that they make, they will choose and use those solutions differently than someone who is not inside their industry. They will be blind to differences in income, access, use cases, and so on. And they will have difficulty expanding their offering in an innovative way, because they are stuck in this model of being the user.

Often, the stated goal of interviewing users is to uncover their pain points. This approach mistakenly characterizes research with users as a sort of foraging activity, where if you take the effort to leave your office and enter some environment where users congregate, you’ll be headed home with a heap of fresh needs. You can observe that people are struggling with X and frustrated by Y, so all you have to do is fix X and Y, and then all will be good.

Although this may be better than nothing, a lot of important information gets left behind. Insights don’t simply leap out at you. You need to work hard and dig for them, which takes planning and deliberation. Further complicating the foraging model is that what people in problem-solving professions (such as designers and engineers) see as “pain points” aren’t necessarily that painful for people. The term satisficing, coined by Herbert Simon in 1956 (combining satisfy and suffice), refers to people’s tolerance—if not overall embracing—of “good-enough” solutions.

Once while settling in for a long flight, I noticed that a passenger in the row in front of me had fashioned a crude sling for their iPhone using the plastic bag that the airplane blanket came in. They had twisted the bag into a makeshift “rope,” which they looped around the body of the iPhone and then jammed behind the latch that kept the tray table closed. They now had a (slightly askew) solution for watching their own device for the duration of the flight. Initially, I was critical of the ugly, inelegant result. But eventually, I realized it was beautiful in its own way—it was fashioned from the materials they had on hand. Since then, I’ve seen other examples of passengers making their own viewing solutions, and I’ve made a point of taking a picture. (See Figure 1.1 where the passenger has made an iPhone viewer out of the airline’s credit card brochure and some beverage napkins.)

Figure 1.1 An airplane passenger viewing stand, made from the materials found on board.

Contrast these good-enough solutions with a more purpose-built accessory (see Figure 1.2): the passenger would have to have known about it, purchased it, remembered to bring it, and carried it with them. Of course, the ideal solution—not just the raw materials—would be provided by the airline itself (see Figure 1.3).

Figure 1.2 TabletHookz is an accessory designed specifically to hold a mobile device in an airplane seat back for hands-free inflight viewing.

Figure 1.3 A device holder built into the airplane seat-back allows passengers to watch videos on their own devices.

There have long been spaces online that exhibit samples of makeshift solutions. They are meant to amuse, but usually with a good measure of judgment and schadenfreude (this is the internet after all!). A good exercise for a user researcher is to seek out those images and reflect on what aspects of these solutions are successful for the people who implemented them.

I encounter satisficing in every research project: a computer desktop with an unfiled document icon in each element of the grid, an overflowing drawer of mismatched food container lids, a not-yet-unwrapped car manual, and tangled, too-short cables connecting products are all “good-enough” examples of satisficing. In other words, people find the pain of this putative problem to be less acute than the effort required by them to solve it. What you observe as a need may actually be something that your customer is perfectly tolerant of. Would they like all their food in containers matched with the right lids? Of course. But are they going to make much effort to accomplish that? Probably not.

Beyond simply gathering data, interviewing customers is tremendous for driving reframes, which are crucial shifts in perspective that flip an initial problem on its head. These new frameworks, which come from rigorous analysis and synthesis of your data, are critical. They can point the way to significant, previously unrealized possibilities for design and innovation. Even if innovation (whatever you consider that to be) isn’t your goal, these frames also help you understand where (and why) your solutions will likely fail and where they will hopefully succeed. To that end, you can (and should!) interview users at different points in the development process. Here are some situations where interviewing can be valuable:

  • As a way to identify new opportunities before you know what could be designed.
  • To refine design hypotheses when you have some ideas about what will be designed.
  • To redesign and relaunch existing products and services when you have history in the marketplace.

From My Perspective: Gaining Insight vs. Persuading the Organization

While doing ethnographic research in Japan, I accompanied my clients as they conducted an unrelated study. They brought users into a facility and showed them elegantly designed forms for printer ink cartridges. They were smooth, teardrop shapes that were shiny and coated with the color of the ink. They also showed users the current ink cartridge design: black blocks with text-heavy stickers.

Can you guess what the research revealed? Of course. People loved the new designs, exclaiming enthusiastically and caressing the shapes. Regardless of method, there was no insight to be gained here. I’ve gone back and forth about whether this was good research or bad research. It didn’t reveal new information, but it provided tangible evidence for the organization. This team’s approach suggested that there were other issues with the design process (perhaps that leaders wouldn’t make decisions without supporting data from users), and while their research might have been the best way to move their process forward, ultimately it wasn’t the best use of a research study.

A High-Level Research Plan

The operational aspects of interviewing users will be covered in the next chapter (“Research Logistics”), but here let’s consider the three (plus one special guest) elements of a high-level plan. And by “plan,” it’s less about how you document the plan and more about the thinking that makes for an effective research project. A plan should summarize the project as you understand it at the time, including the business problem, the research questions, and the agreed-upon research method. Reviewing this plan with your team will ensure that you are aligned, with an opportunity to clarify, reprioritize, or expand the work.

Note: The answer to a never-ending story

This book defaults to considering research as projects that have a beginning and an ending. But there are other models. Rolling research is a way of providing designers with regular access to participants who can provide feedback on whatever they are working on. Typically, a small number of participants are scheduled on a weekly basis. Designers and researchers determine earlier in that week what they’ll show to the participants, and what questions they’ll ask. Continuous discovery involves the entire product team, through the entire development cycle, and includes designing, prototyping, and getting feedback from users.

Even if you are interviewing users through one of these approaches, most of the guidance in this book (for instance, Chapter 6, “The Intricacies of Asking Questions”) will apply directly.

The Business Problem

The business problem (or business objective) is what your organization—the producer of products, services, solutions, and so on—is faced with, as shown in Table 1.1.

Table 1.1 Business Problem Examples

    • We’re sunsetting a legacy product and replacing it with one that uses a different technology.
    • Our new product didn’t do as well as we had hoped.
    • We want to move into a new market.
    • A new competitor is taking some of our market share.
    • We’re roadmapping what new features we’ll be developing for our current service.
    • Product feedback is strong but repeat orders are low.

To get an in-depth understanding of the business problem, you’ll probably want to talk with your stakeholders. You’ll learn more about this topic in Chapters 2 and 10, “Making an Impact with Your Research.”

From My Perspective: Uncover Misalignment Early

I once worked with a client who made a digital platform used for particularly complex transactions. They already supported the buyers, sellers, and their respective brokers, and now were looking at opportunities to incorporate the other entities (known as “third parties”) in these transactions. This research was a strategic priority, traceable to goals assigned from on high.

To kick off the project, we scheduled two activities (loosely based on the Questions Workshop) with different groups of stakeholders. We set up a spreadsheet to capture decisions they were planning to make and what information about these other users would help in making those decisions. In the first workshop, the main project sponsor halted the proceedings to ask “Now, what do we mean by ‘third parties?’” I assumed they knew, and they assumed I knew! I was surprised, but glad they weren’t afraid to ask a “dumb” question. It was a disconnect, but an important one to uncover, and at the right time. We aligned on a definition and then moved forward with the questions. In the second workshop, a stakeholder kicked off the session by telling us “Just so you know, we’re already coding a solution.” Again, I was surprised, but this was very helpful to understand at the outset rather than later.

The Research Question

The research question identifies the information you need from users to provide guidance about the business problem. Whereas the business problem looks inward, the research question looks outward—in other words, the business problem is about you and the research question is about your users (see Table 1.2).

Sometimes the research questions are clustered and nested. For example, the business problem “We are investing heavily in social media and want our customers to promote our services more” might lead to this set of research questions.

  • What do people’s social networks look like? What tools do they use and how are their networks structured?
  • How are purchase decisions driven by the structure of people’s social network (on and offline)?
  • How do people leverage social networks for shopping and other kinds of decision-making? Who has influence with them currently?
  • Who among their social network (and beyond) are trusted sources of information for various decisions and purchases (particularly within the client’s area of business)?

Table 1.2 Research Question Examples

To further inform the research questions, you should review previous research reports, existing products, and in-development prototypes. Look for relevant research findings, explicitly stated assumptions or hypotheses, and implicit hypotheses in the decisions that have already been made.

Note: Find the specificity that’s right for you

When I ask teams to work on articulating their business problems and research questions, they often find it surprisingly challenging, but also enlightening. There won’t be a singular perfect answer, but the process of considering the specifics is valuable for developing a deeper intention and focus for the research. That process might include going back and forth on different variations and wordings. It might not produce a perfectly structured 1:1 relationship between the business problem and the research question. If you practice with a colleague, before long, you’ll have a feel for the right level of granularity and structure for you.

You should also conduct interviews with your stakeholders—they are often consumers of the research findings who are less likely to be involved in the day-to-day study. I typically aim for 6–8 stakeholders, although some clients ask for twice that amount. These are one-on-one conversations that run between 30 and 60 minutes and are used to dig deeper into objectives and set the stage for working collaboratively. Many of the interview techniques in this book (such as what I’ll cover in Chapter 5, “Best Practices for Interviewing”) apply to interviewing stakeholders, although you may find it less comfortable to ask “dumb” questions if you feel your credibility could be at stake. You should ask the stakeholders about the following:

  • Their history with the organization and the research topic
  • Business objectives for the project and specific questions the research should answer
  • Current beliefs about the customer, the user, and the proposed solution
  • Organizational or other barriers to be mindful of
  • Concerns or uncertainty around the method

Even though what you learn will undoubtedly inform all of the activities throughout the project, the immediate output is the research questions—articulating what you want to learn from the interviews.

Note: Get immersed in your research area

With the overall goal of trying to understand the problem space you’re exploring, gathering the language that is used to talk about that problem space, and planning what you’re going to ask your research participants, there are other activities that you can do at this point. Secondary research (also called desk research) gives you a sense of current and historical thinking through what’s been written about your topic already. Look at the mainstream press, the business press, academic papers, internal or external corporate reports, blogs, online forums, newsletters, books, and so on. Identify industry, academic, or other experts and interview them. You may also seek out a few experiences that will give you some perspective on the topic. Look at similar products and how they are being sold online or in retail. Try an experience yourself.

For a project that sought to understand how our client could facilitate a more emotional connection with their customers, we visited a handful of environments that had reputations for successfully bonding with their users (an Apple store; Powell’s Books in Portland, OR; the dog-friendly Fort Funston in San Francisco; a Wawa convenience store in Philadelphia; and Rainbow Grocery in San Francisco), observed the environments and the people who were there, and hypothesized about what factors were leveraging or contributing to the relationship. This led to topics to explore in the interviews and examples to compare and contrast with during the analysis stage.

The Research Method

The research method is how you will gather the information needed to answer the research question. Here are a few examples of user research methods (other than interviewing):

  • Usability testing: Typically done in a controlled environment, such as a lab, users interact with a product (or a prototype or simulation), and various factors (time to complete a task, error rate, preference for alternate solutions) are measured.
  • A/B testing: This type of testing compares the effectiveness of two different versions of the same design (e.g., advertisement, website landing page) by launching them both under similar circumstances.
  • Quantitative survey: A questionnaire, primarily using closed-ended questions, is distributed to a larger sample in order to obtain statistically significant results.
  • Web analytics: Measurement and analysis of various data points are obtained from Web servers, tracking cookies, and so on. Aggregated over a large number of users, Web analytics can highlight patterns in navigation, user types, the impact of day and time on usage, and so on.
  • Focus group: This is a moderated discussion with 4 to 12 participants in a research facility, often used to explore preferences (and the reasons for those preferences) among different solutions.
  • Central location test: In a market research facility, groups of 15 to 50 people watch a demo and complete a survey to measure their grasp of the concept, the appeal of various features, the desirability of the product, and so on.

Of course, researchers make up new methods regularly. (See more about methods in Chapter 3, “Contextual Methods—More Than Just Asking Questions.”)

Selecting an Appropriate Method

In the aptly named “When to Use Which User-Experience Research Methods,” Christian Rohrer organizes some of the more common methods into a framework. (Does the method look at people’s behaviors or their attitudes? Is the method qualitative or quantitative? Does the method look at someone’s use of a product?) (See Figure 1.4.) The article provides guidance about which methods are best suited for different contexts. For example, if the goal of the research is to find new directions and opportunities, then the best methods (according to Rohrer) include diary studies, interviews, surveys, participatory design, and concept testing.

Figure 1.4 Christian Rohrer’s “Landscape” organizes user research methods by behavior/attitude and quantitative/qualitative.

Note: Market research and user research

In some companies, market research is a separate department from user research and may even report to different leaders. It also seems like a different career path; people find their way to either discipline from different backgrounds. But what’s the difference? It’s common—but wildly inaccurate—to attempt to distinguish the two by the methods used (market research does focus groups and surveys; user research does interviews and usability testing) or the objectives (market research looks at attitudes and user research observes behavior). Figure 1.4 invites us to consider a bigger picture—a broad set of methods and objectives that no one discipline “owns” exclusively.

Taking a different approach, Sam Ladner developed a guide shown in Figure 1.5 that recommends a research method based on where your product is in its lifecycle.

Figure 1.5 Sam Ladner organizes user research methods by the maturity stage of the product’s sales.

Combining User Research Methods

Interviewing can be used in combination with other techniques. Mixed methods refers to combining multiple methods (typically qualitative and quantitative) together in one study. I’ve used an exploratory interviewing study to identify topics for a global quantitative segmentation study. I’ve combined a Central Location Test (where larger groups watched a demo in a single location such as a research facility and filled out a survey) in parallel with in-home interviews to get a deeper understanding of the potential for the product. I’ve also mixed together different qualitative activities (say, a larger sample for a diary study, and then follow-up interviews with a subset of participants). It can be valuable to combine a set of approaches and get the advantages of each.

Note: Quantitative user experience research

Kitty Z Xu, a quant user experience researcher, explains how this emerging discipline uses two kinds of data: sentimental (such as feelings, perceptions and understanding) from surveys and behavioral (from logging data, usage metrics and more). Researchers in quant UXR make use of skills from a variety of fields, including user research, survey science, data science, and analytics. While interviewing (or qualitative user experience research) looks for insights in a small sample, quant UXR builds insights at scale—meaning collecting hundreds or thousands of samples that are representative of a larger population.

Choosing Interviewing

Interviewing isn’t the right approach for every problem. Because it favors depth over sample size, use interviewing when you don’t need statistically significant data. Being semi-structured, each interview will be unique and reveal something new about what you’re trying to understand (but it can be challenging to objectively tally data points across the sample). Although you are ideally interviewing in context, you are now a participant in that environment. Sitting with users to show you how they use a website isn’t supposed to be naturalistic (versus the way a tool that intercepts and observes users who visit that website captures their actual behavior).

People are not good at predicting their future behavior, especially not for brand-new, hypothetical situations (see “Manage Bias” in Chapter 4). There are bad questions and bad ways of asking questions (see Chapters 6 and 7), but you should be skeptical of broadly dogmatic interviewing advice that warns you never to ask about future behavior, like “How much would you pay for this?” You can definitely ask the question, but it’s important to understand what you can and can’t do with the answer. You won’t get a number that is helpful for your pricing strategy, but you can learn about their rationale for that number or hear a thoughtful reflection about perceived value. Your questions in an interview can reveal mental models that exist today, which will be insightful for the decisions you have made, but the literal responses about future behavior probably won’t be accurate.

Participant Questions

This isn’t really part of the high-level plan, but it’s included here because discussion about the research question sometimes drifts into specific questions that people imagine asking participants. I led a workshop with creative entrepreneurs who struggled to articulate what they wanted to learn from their interviews but were brimming over with questions they wanted to ask. Because they really were unable to come up with research questions, our workaround was to build out the participant questions and then step back and ask what those questions were collectively in service of (in other words, the research question).

You may generate (or collect) some participant questions during this high-level planning process. Unless they are helpful in getting you unstuck on your research questions, just file them away for now. In Chapter 2, we’ll focus more on the questions we plan to ask.

Aligning on the Research Plan

Since you’re seeing this in a book, where the different elements of the plan (business problem, research question, and research method) are presented in sequence, you might reasonably conclude that you should also proceed linearly. First, get clarity on your business challenge, then uncover your research questions, and then choose the best method to answer those questions! Sounds good?

Ah, but it doesn’t usually work that way. Depending on how a project is initiated (a prospective client generates a Request for Proposal, a stakeholder sends a request by email, and so on), it may be more or less based on one of the three. You may be asked “Here’s the situation, how can research help us?” Or “We need to learn such-and-such about these users.” Or “Can we complete this method of research within this time frame?” But no matter how the conversation begins, it’s up to you to fill in the rest of the pieces.

If you’re given a research question, ask why that information is needed. If you’re given a research method, ask what they hope to learn, and then ask why that information is needed. Sometimes, the people you’re going to work with haven’t thought about this, but often it’s just implicit and your questions will help make it explicit. You want to make sure that not only are you and the clients or stakeholders aligned, but crucially that these different pieces are in alignment: the method has to produce the information that is needed, and the information that is needed should be in support of the actions the team plans to take.

The people who need the results of the research don’t necessarily understand the range of methods and when to use them. Don’t agree to use a prescribed method that doesn’t align with the necessary results, because the blame will fall to you at the end when you can’t deliver. Facilitating the alignment between challenge, question, and method is part of the expertise a researcher brings. People who do research should seek an experienced researcher to advise on these high-level aspects of the research plan.

To Interview Well, One Must Study

Much of the technique of interviewing is based on one of your earliest developmental skills: asking questions (see Figure 1.6). You all know how to ask questions, but if you asked questions in interviews the way you ask questions in typical interactions, you would fall short. In a conversational setting, you are perhaps striving to talk at least 50 percent of the time, and mostly to talk about yourselves. But interviewing is not a social conversation. Falling back on your social defaults is going to get you into trouble!

Interviewing users involves a special set of skills. It takes work to develop these skills. The fact that it looks like an everyday act can actually make it harder to learn how to conduct a good interview because it’s easy to take false refuge in existing conversational approaches. Developing your interviewing skills is different than developing a technical skill (say, milkshake-machine recalibration) because you had nothing to fall back on when learning about milkshake machines. With interviewing, you may need to learn how to override something you already know. Think of other professionals who use verbal inquiry to succeed in their work: whether it is police officers interrogating a suspect, a lawyer cross-examining an opposing witness, or a reference librarian helping a patron, the verbal exchange is a deliberate, learned specialty that goes beyond what happens in everyday conversation. For you as an interviewer, it’s the same thing.

We’ll revisit improving as an interviewer in Chapter 7, “Better Interviews.”

Figure 1.6 Childhood is marked by frequent, inevitable question-asking.

The Impact of Interviewing

Interviewing creates a shared bonding experience, often a galvanizing one, for the product development team (which can include researchers, designers, engineers, marketers, product management, and beyond). In addition to the information you learn from people and the inspiration you gain from meeting them, there’s a whole other set of transformations you go through. You might call it empathy—say a more specific understanding of the experience and emotions of the customer—which might even be as simple as seeing “the user” or “the customer” as a real live person in all their glorious complexity. But what happens when people develop empathy for a series of individuals they might meet in interviews? They experience an increase in their overall capacity for empathy.

This evolution in how individual team members see themselves, their connection to their colleagues, their design work, and the world around them starts to drive shifts in the organizational culture (see Figure 1.7). This capacity for empathy is not sufficient to change a culture, but it is necessary.

Figure 1.7 Team experiences that are challenging and out-of-the-ordinary create goodwill and a common sense of purpose.

More tactically, these enlightened folks are better advocates for customers and better champions for the findings and implications of what has been learned in interviews.

The wonderful thing about these impacts is that they come for free (or nearly). Being deliberate in your efforts to interview users will pay tremendous dividends for your products, as well as the people who produce them.

Scope Growth

In a Twitter thread, Mollie Ruskin wrote about a civic design project, saying,

While the research was “about” operations and staff capacity and a complex process for answering heaps of emails, I quickly found we were stumbling over a set of questions fundamental to the function of our representative democracy.

So, as much as you work to identify and align on your business problem and your research questions, that alignment is limited by the fact that the only information you have comes from before you have done any research. Mollie reminds us that our understanding of the problem (and the opportunity) can change.

The worst thing a research team can do, however, is to come back to the project sponsors and say “Welp, we know we were looking at operations and capacity, but really the issue is the underpinnings of our democracy.” Ideally, the broader team is collaborative enough that they will see these reframes together and can decide what to do about them. When I’m in this situation, I try to address the initial scope (“Here’s what we know about the gaps in the operations and how this impacts staff capacity”) and present the emergent topic as one that builds on the original goals (“and the real issue that connects these infrastructure decisions is the very nature of our democratic processes”). If the organization isn’t ready (yet) to address the larger insight (and often they won’t be—just look at the size of the shift in Mollie’s example!), at least they can move forward on their original problem, and you’ve planted the seed for a future effort. This probably won’t be the last time this underlying issue emerges, and at some point, it may not be possible to ignore it any longer.

The Last Word

It’s become increasingly common, perhaps even required, for companies to include user research in their design and development process. Among many different approaches to user research, interviewing (by whatever name you want to call it) is a deep dive into the lives of customers.

  • Interviewing can be used in combination with other techniques, such as identifying key themes through interviews and then validating them quantitatively in a subsequent study.
  • At a distance, interviewing looks just like the everyday act of talking to people, but interviewing well is a real skill that takes work to develop.
  • Interviewing can reveal new “frames” or models that flip the problem on its head. These new ways of looking at the problem are crucial to identifying new, innovative opportunities.
  • Interviewing can be used to help identify what could be designed, to help refine hypotheses about a possible solution that is being considered, or to guide the redesign of an existing product that is already in the marketplace.
  • Teams who share the experience of meeting their users are enlightened, aligned, and more empathetic.

Frequently Asked Questions

These common questions and their short answers are taken from Christopher Noessel’s book Designing Agentive Technology: AI That Works for People. You can find longer answers to each in your copy of the book, in either the printed or the digital version.

  1. How do you pronounce “agentive”?
    “Agentive” is a once-languishing adjective that is built on the word “agent,” so I pronounce it emphasizing the first syllable, “A-jen-tiv.” I like that this pronunciation points back to its root, which helps people suss out its meaning when they’re hearing it for the first time. I’ve heard people stress the second syllable, as “uh-JEN-tiv,” which rolls off the tongue just fine, but doesn’t do much to help people’s understanding.
  2. Did you invent this kind of technology?
    Oh no, far from it. As you’ll read in Chapter 4, “Six Takeaways from the History of Agentive Thinking,” thoughts about machines that take some sort of initiative go all the way back to at least ancient Greece. So, no, I didn’t invent it. I have designed several agentive systems over the past few years, though, and on about my third such project, I realized I was seeing some recurring patterns (in the Christopher Alexander sense). I looked for a book on a user-centered approach to this kind of technology, and when I could not find one, decided to write it.
  3. What’s the most accessible example of agentive technology you can give me?
    Chapter 1, “The Thermostat That Evolved,” goes into some detail on one example that is popular in the United States, the Nest Thermostat. If you’re not in the U.S., or unfamiliar with that product, imagine an automatic pet feeder. It is not a tool for you to feed your cat. It has tools for you to specify how you want the machine to feed your cat, and the feeder does most of the rest. You will still need to refill it, free food stuck in its rotors, and occasionally customize or pause feeding schedules. These maintenance and customization touchpoints are what distinguish it from automation and where design plays a major role. To flesh out this singular example, see Appendix B for a list of every other real-world example included in the book.
  4. I have an agentive project beginning. How can you help me start it out right?
    Begin with the first diagram shown in Appendix A, “Consolidated Touchpoints.” It shows common use cases in a rough, chronological order. Think through your product and identify which use cases apply to your project and which don’t. Reference the chapters in Part II for details on the use cases and begin to construct scenarios around them. This should give you a great head start.
  5. Why didn’t you go into depth about interfaces?
    Agentive technology differs primarily in use cases, rather than interfaces, so Part II is dedicated to identifying and describing these. Readers can draw on the existing practices of interaction and interface design for best practices around individual touchpoints. The notable exception is the interface by which a user specifies triggers and behaviors. See Chapter 5, “A Modified Frame for Interaction” for an introduction to these concepts, and Chapter 8, “Handling Exceptions,” for an interface pattern called a “Constrained Natural Language Builder,” which you can consider customizing in your agentive interfaces.
  6. You’re just another cheerleader for the future, blithely bringing artificial intelligence doom down on us all! Wake up, sheeple!
    Technically, that’s not a question, and frankly a little hyperbolic. But I’m still here to help. There’s a distinction to learn in Chapter 2, “Fait Accompli: Agentive Tech Is Here,” between narrow artificial intelligence and general artificial intelligence. Once you understand that difference, it becomes easier to understand that, unlike general AI, narrow AI gets safer as it gets smarter. And as you’ll read at the end of Chapter 12, “Utopia, Dystopia, and Cat Videos,” I believe a worldwide body of agentive rules is a useful data set to hand to a general AI if/when one comes online, to help it understand how humans like to be treated. This is on the good side of the fight.
  7. Aren’t you that sci-fi interfaces guy?
    I am one of them. I keep the blog scifiinterfaces.com, and you may have heard me speaking on the topic, attended a workshop, or been to one of my sci-fi movie nights. Also, Nathan Shedroff and I co-authored Make It So: Interaction Design Lessons from Science Fiction in 2012, which is all about what real-world designers can learn from speculative interfaces. Predictably, sci-fi makes appearances in this book. You’ll see some quick mentions in Chapter 2, and two important mentions in Chapter 13, “Your Mission, Should You Choose to Accept It.” These serve as a telling contrast of sci-fi written with and without agentive concepts. You also can search the #agentive tag on the scifiinterfaces.com blog to find even more.
  8. If you could wave your hands and make anything an agent, what would it be?
    Well, I must admit that part of the reason I chose Mr. McGregor to be the illustrative example is that I grew up in big cities, far from farmsteads, and never got the knack of raising plants. If, like me, you have a brown thumb, but dream of growing your own garden-fresh food, read about Mr. McGregor in sections placed after Chapters 5, 6, 7, and 8. My second choice might be an agent on mobile phones that listens in on conversations and does some socially adept fact-checking and frame-checking to encourage skeptical thinking and discourage lies or bullshit, in the H. G. Frankfurt sense.

Sample Chapter: Design for Learning

This is a sample chapter from Jenae Cohn and Michael Greer’s book Design for Learning: User Experience in Online Teaching and Learning. 2023, Rosenfeld Media.

Chapter 1

Learning Is an Experience

A group of fifteen people log on to a video conference call together. They are gathered to attend a change management training session, all logging in from different locations. Their faces float in their individual squares, arranged in a neat grid. One minute before the session begins, the grid of faces shifts to the side of the screen and is displaced by a screen-shared “welcome” slide. The facilitator for the session announces that the conversation will begin shortly.

But what’s called a “conversation” is not really a conversation. The facilitator talks for an hour as she progresses through a slideshow. The participants in their separate squares simply listen. Maybe some of them take notes. Others tune out.

Live online learning experiences like this are common, but they can often feel uninspired and, frankly, boring.

You could probably make your own list of online learning experiences gone wrong. The slide deck with tiny type that was almost unreadable. The endless blocks of dense informational text that had to be navigated in a pop-up window that looked like it was designed in the 1990s. A “next” button that can only be double-clicked for some reason. A workshop lacking clear organization and degenerating into chaos. Videos that lack captions. Seminars running over time—by an hour or more. The examples go on and on. But learning designers can do better than all of this.

Doing better means understanding that learning experiences can’t just be facilitated; they must be designed in ways that are attentive to an online user experience. Many online classes are designed simply to mimic the experience of in-person learning, largely because facilitators and instructors haven’t been given sufficient training or support in the theory and methods of online learning. Sadly, this lack of training and support has caused many learners and instructors alike to blame the online environment itself.

But it’s not the fact of being online that’s to blame for a crummy learning experience. It’s a lack of attention to what people’s experiences are like when they are online. It’s a gap between an understanding of user experience design and actual learning design.

Designing for learning means designing an environment where users have clear choices. It means creating a space where learners can find what they need in the way that they need it and feel supported all along the way.

Design for Possibility

Learning is often associated with a stodgy, formal environment, like a schoolroom with desks bolted to the chairs. That’s because learning, historically, is a lot about control: pour some ideas into learners’ minds, and they’ll come away with new knowledge. Paulo Freire famously critiqued this model, referring to it as the “banking model” of education, which assumes that learners are only there as vessels to receive and file away information (like depositing money in a bank).

But in the last two to three decades, nearly ubiquitous access to the internet and mobile devices has provided platforms that give learners a lot more agency and control in where, when, how, and why they might engage with a learning experience. And while the technology itself hasn’t disrupted “the banking model” of education, it certainly makes the deficits of the banking model all the more visible. You can try to force learners online to watch a bunch of videos with no engagement or follow-up. Or you can attempt to keep learners still and silent while staring into a web camera. But you’re definitely not going to succeed. After all, it’s all too easy to get distracted and find new and interesting things to do online. Online, learners are no longer at the mercy of what a teacher tells them to do; instead, they get to navigate through their own experiences because the technology does not keep them confined to one place at a time. If you really want someone to learn something online, it’s important to keep them engaged and give them a reason why they should be there learning in the first place.

In today’s world, successful online learning experiences put learners in the driver’s seat. For example, Codecademy, an online learning platform founded in 2011, offers a large and growing catalog of courses in web design, machine learning, data science, and related subjects in coding languages and computer science. As of 2023, over 100,000 paid subscribers have used Codecademy to learn how to write code (see Figure 1.1).

Three columns appear side-by-side, with the left-hand column showing “How a Form Works” with instructions, the middle column showing the code for generating a website, and the right-hand column showing the actual website, which reads “Davie’s Burgers” at the top.

Figure 1.1
The Codecademy course enables users to see the how, what, and why of an activity all at once by showing three key pieces of information for a user learning HTML for the first time: a description and purpose for the activity, the terminal for writing the code, and the rendering of what the HTML code produces for the Web.

A typical lesson on Codecademy starts with a short introductory text explaining a concept or idea; in this case, how an HTML form works. Learners read through the explanation, followed by step-by-step instructions describing the process to build a form. At each step, learners can click “Stuck? Get a hint.” A concept review provides a sample “cheat sheet” that can be used to review the main concepts in the lesson, and learners can also check the community forums to see what questions other learners asked about the lesson. In the center window, learners can run and troubleshoot their code in real time, and on the right, they can view the visual output produced by the code (a mockup for a fictional business called Davie’s Burgers).

Codecademy is one of many examples of digital learning platforms that have transformed the experience of learning in the past twenty years; others include LinkedIn Learning (formerly known as Lynda.com) created in 2002, Khan Academy in 2008, and Coursera in 2012. While these platforms have not replaced a lot of traditional learning experiences, they are designed in ways that give learners the agency to stop, start, pause, apply, and re-try new concepts without the time constraints of a formal learning experience.

A Brief History of Online Learning

The seeds of online learning experiences were planted by the internet in the 1980s. Concurrently (in 1984), Malcolm Knowles, an adult learning theorist, created a theory of “andragogy” or “adult learning” that posited four key principles as critical to helping adults learn new ideas: having a strong self-concept, having a reservoir of prior learning experiences to draw upon, having a readiness to learn, and having an orientation to what it means to learn. These four principles, he argued, needed to be applied to the growing world of online training experiences, including the integration of a clear stated purpose for the learning experience, a task-oriented way of organizing content, the inclusion of varied learning activities, and room for learner agency and direction. These kinds of principles set the stage for the continued growth of online learning in the 1990s.

By the 1990s, online courses began to emerge, mostly centered on college campuses, often in states with large rural populations, like Utah, where students would have to travel long distances to attend class in a brick-and-mortar classroom space. Online classes grew steadily throughout the early 2000s and by 2011 about one-third of U.S. college students were taking at least one course fully online.

The flexibility and ease of accessing online learning experiences has continued to be facilitated by the growth in consumer technologies that make information even more portable and convenient to access. In 2007, Apple introduced the first iPhone and launched what has become a thriving ecosystem of digital learning. This growth in consumer technology has made the prevalence of learning experiences online all the greater; learners have to learn how to use their iPhones in order to continue buying and engaging with them. As such, programs such as the Google Analytics Academy, the Meta Community Manager certification, Hubspot Academy, and the Salesforce Trailhead program are all growing and thriving consumer technology education programs, in large part because they count on a growing consumer base remaining interested in becoming better users and learners of the tools of a persistently growing consumer technology ecosystem. Smartphones are now used by many more learners than laptops or other large screens. Even with this unprecedented growth in access to online learning experiences, the need for theories like Freire’s and Knowles’ persisted; increased access did not necessarily mean an increased understanding of how to develop an experience that would really, truly be meaningful to learners.

The need for an immediately accessible learning experience became even clearer at the peak of the worldwide Covid-19 pandemic in March 2020. This catastrophic event forced many people into online learning because of “lockdowns” that prevented people from gathering in brick-and-mortar spaces. From early 2020 through the end of 2021, learning experiences across industries were rapidly spun into remote experiences. It’s worth noting that the customer education industry, with companies such as Salesforce and Hubspot Academy leading the way, was a leader in developing online learning experiences prior to the pandemic. However, outside of the customer education industry, remote learning experiences were often considered “inferior” to an on-site class experience. But the experiences of emergency remote learning opened many trainers’ and teachers’ eyes to new possibilities for learning and gave a wider range of individuals ongoing engagement with other learners.

The rapid rise of “emergency remote teaching” has been an earthquake in the lives of instructors and course designers, and the aftershocks continue today. There’s no unwinding the clock on the experiences that lots of students and professionals alike had with learning online, and the key for learning designers now is to consider how, with deliberate time and planning, the tools for online learning can be designed to be even more attentive to users’ needs. It’s easy to anticipate that online learning will continue to grow for the following reasons:

  • Accessibility: People no longer need to travel to brick-and-mortar campuses or offices and can access learning from home or wherever they have a smartphone and a good internet connection. Plus, disabled learners have access to tools like closed captions and screen readers, which can ensure their access to the materials they need.
  • Social mobility: Many people have either been priced out of continued learning opportunities, such as enrolling in higher education courses, or have not found traditional continued learning opportunities to meet their needs as full-time workers or caregivers. There is a growing need for people to learn outside of formal, inflexible, and expensive channels.
  • Career growth: Learners today are often driven by a desire to change or advance in their careers. Technology drives many of these learners, who discover that they need specialized training to move up the career ladder.

This brief snapshot of the landscape of learning begins to explain why there is so much in motion now. Today’s learners are seeking ways to learn at their own pace and on their own schedules. They want to learn in their own ways, gravitating toward interactive experiences to test and practice their learning, rather than learning primarily from textbooks or a long lecture.

Learning designers and others who create learning experiences face both huge opportunities and big challenges.

Why Learners Today Are a Different Kind of User

Learning experience design reflects a growing body of work that combines user-experience design (UX) with learning science. A learner is a special kind of user, with their own needs and values. If you want to check an account balance using your mobile banking app, you can log in, look up the balance, and you’re done (if the app is working properly). If you want to learn how to design a website using semantic HTML, you need much more information. You need to learn how to learn.

Online learning platforms are complex information systems. Designing these systems draws upon several fields, including:

  • Information design (or information architecture): A process for designing how users move through complex systems. Information design makes information both findable and understandable.
  • Instructional design: A process for designing and developing learning experiences.
  • Learning science: Theories and practices developed from neuroscience, psychology, and education research to inform how people learn.
  • Visual design and UI: An understanding of how the visual layers of an online experience look and behave.
  • User research: Research to learn about learners’ needs and behaviors.
  • Content strategy: A process for imagining and planning the content across the product or experience.

Language from these fields informs the planning, sketching, prototyping, and production of digital learning experiences.

A Model of Learning Experience Design

Learners are not sponges who absorb information through osmosis. But many learning platforms are unconsciously based on a model that defines learning as information transfer. Even formal courses often present learning as expertise being delivered directly from instructor to student. Current work in learning science and related fields suggests, however, that learning is an active process. Learners do not absorb knowledge; they actively create new knowledge.

That’s why you need to design the learning experience with a specific learning experience design model in mind. The learning experience design model begins by centering on the learner. The learner’s interactions with the instructor, the course website or platform, the learning activities, and the other course materials that support their practice combine to form the essential learning experience, as shown in Figure 1.2.

A circle with the label “Learner” contains an arrow that points in two directions: to the word “Learner” and to a cloud labeled “The Learning Experience,” which includes contributors to the experience: the Instructor, the Website or Platform, Learning Activities, and Practice.

Figure 1.2
This user-centered learning experience design model demonstrates how the learner’s behavior, motivation, and engagements with the course need to impact the course design just as much as the course design needs to impact and engage the learner.

This focus on the learner and their behavior in the experience design model happens not just through direct instructor-to-student feedback, but also through an automated framework. For example, in Duolingo, every time a user completes a lesson on the app, they receive automated acknowledgement of their progress and a score sheet detailing their performance, including the areas where they excelled and struggled. The next time the user logs in, they receive a customized set of lessons that are aligned with their progress report. So, if the user succeeds in lessons on using, say, modifiers in Spanish, they see lessons about modifiers less frequently. And if the user struggled with conjugating verbs in prior lessons, they see more lessons on verb conjugation, which will reinforce the user’s need for practice in this area.
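
To make that adaptive loop concrete, here is a minimal sketch of the idea in Python. It is an illustration only, not Duolingo’s actual algorithm; the function and data names (pick_next_lessons, performance, lesson_bank) are hypothetical, and the rule is simply “sample weaker skills more often.”

    # Illustrative sketch only: a toy adaptive-lesson selector, not Duolingo's algorithm.
    import random

    def pick_next_lessons(performance, lesson_bank, count=5):
        """Return `count` lesson titles, weighted toward the learner's weaker skills.

        performance: dict mapping a skill to a score from 0.0 (struggling) to 1.0 (mastered)
        lesson_bank: dict mapping a skill to a list of available lesson titles
        """
        skills = list(lesson_bank)
        # Weight each skill by how much the learner still struggles with it;
        # the small constant keeps mastered skills from disappearing entirely.
        weights = [1.0 - performance.get(skill, 0.5) + 0.05 for skill in skills]
        chosen_skills = random.choices(skills, weights=weights, k=count)
        return [random.choice(lesson_bank[skill]) for skill in chosen_skills]

    # Example: the learner did well on modifiers but struggled with verb conjugation,
    # so the next session leans toward conjugation lessons.
    performance = {"modifiers": 0.9, "verb_conjugation": 0.3}
    lesson_bank = {
        "modifiers": ["Adjective agreement", "Adverbs of frequency"],
        "verb_conjugation": ["Present tense -ar verbs", "Irregular verbs"],
    }
    print(pick_next_lessons(performance, lesson_bank))

A real platform would base the weights on richer signals (error types, time since last review, and so on), but the shape of the feedback loop is the same: performance data in, a tailored set of activities out.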

In the case of the automated language learning app, the user is getting real-time and automatic adjustments (feedback) based on their performance in the activities. This process is often referred to as an individualized learning experience. This kind of automation is not possible in all learning contexts, but this model of design centers the user in their experience and offers them materials that are aligned with their needs, at least as far as the robot understands them. In Chapter 11, “Reviewing Your Learning Experience,” we’ll explore the benefits and limitations of using robots or other forms of artificial intelligence for feedback.

We’ve found that many instructors tend to overemphasize the content of their courses, seeking that one perfect reading or that one perfect example that will enable them to get their point across to learners. In contrast, the experience design model encourages designers to focus on what the learners are doing—or how they are using the content and why.

This doesn’t mean that content for learning can’t be designed in an interactive way. But interaction is not the same thing as learning.

Peloton’s instructional exercise content library is a perfect example. Peloton bike users get access to a novel range of exercise videos that they can play whenever they’d like, and they are incentivized to interact with those videos through several features. Exercisers can track their progress, earn badges for completing a certain number of workouts, and compete with other Peloton users on a live leaderboard to gamify their experience.

Where Peloton excels is by building a powerful community. Peloton users can identify “friends” to add to their accounts so that they can invite others to join rides with them and build accountability networks. This community helps users build strong motivation to take more classes and be in a community together. Most Peloton users don’t expect to become professional athletes by purchasing access to the Peloton app or buying a Peloton bike. They want to build a habit by investing in a commitment, not developing mastery.

When users ride on a Peloton bike, they are not receiving feedback about how effective their form is on the bike. Rather, they are receiving instructions and incentives to keep them exercising. There are videos to help them set up the bike and strive for better form, but short of remaining uninjured, they may never really know for certain if they’re riding the Peloton bike like a professional athlete. And that’s OK! That’s likely not their goal. But it’s worth understanding and appreciating the difference. Building a habit and a motivation to learn are part of a learning experience. But without getting direct feedback on how to improve, learning isn’t possible.

Interaction design and learning design are not the same thing.

If your end goal is to design training or a course that users will be incentivized to complete, then a gamified video playlist, like the Peloton app, may very well do the trick. However, if the end goal of your training or your course is for a learner to be able to apply that knowledge to a variety of flexible situations and gain new knowledge that you can assess for authentic learning, then your learning design is going to require some well-designed interventions.

A starting point for that intervention is remembering that you’re designing for users who have specific needs. Knowing what those needs are and how you can optimize your design to meet those needs is the goal.

It’s Not All About You

If you’ve ever given a presentation in front of a room, you’ve probably heard the advice to imagine that your audience is naked. It’s a piece of advice that’s meant to calm you down and make you laugh, but it’s also advice that communicates something important about presenting information: the more pressure you take off yourself to perform, the better.

Imagining your audience naked is about remembering that a presentation is not just about you. It’s just as much about how you feel about the audience you’re with as it is about how you’re feeling in your own skin. And while imagining the audience naked is not advice that works for everyone, it remains popular and well-known precisely because it communicates something learning designers need to remember when leading or designing a learning experience: it’s not all about you.

When you’re leading and designing something, you’re going to have an impact. But also remember how you’ve felt as a learner in a training or a class. You might have some vague memories of how the teacher or facilitator looked and acted. But what you probably remember most is how you felt taking that training or being in their class. Now that you’re in the position of facilitating or designing a learning experience online, it’s tempting to get hung up on how learners might perceive you or the content you’ve created. And while appearances do matter, they only really matter in one way: how those appearances impact how your learners will get what they need from the learning experience.

Seeing Learners in 3D

The idea that you are not your learners has an important corollary: You need to design for real people—people who are not you.

Part of designing in 3D is recognizing the reality that learners are more diverse than you might think. You are probably already working with learners who have disabilities, for example. In fact, you can assume that about one-fifth of your audience is using some form of assistive technology, from captions to screen readers, to access and engage with your content. You are also probably working with learners who speak more than one language, and who may never have experienced learning that was explicitly designed for them.

Chances are, you are also designing for learners who are stressed and have other demands on what researchers call cognitive load. Designing with the goal of reducing cognitive load will make learning experiences easier for learners to process, absorb, and maintain their engagement.

Online learning requires an engagement of both mind and body. Designing for real people requires learning designers to understand and empathize with people who experience learning differently than they do across many dimensions.

Learning doesn’t just happen to people. It’s a designed experience that has defined goals and outcomes. If achieved, those goals and outcomes should evoke positive feelings for the learners. Defining how the goals and outcomes of the experience align with who the learners are is an important starting point for any learning experience design project.

The Learning Design Process

Learning design is an iterative rather than a linear process. There are starting places for the learning designer, but those starting places may need to be returned to repeatedly to ensure that the vision for the learning experience is clear and that the needs of the learners are met. A learning designer may need to circle back, repeat steps, and return to earlier parts of the process multiple times (see Figure 1.3).

The steps of the learning design process are as follows:

  • Learn more about who the learners are.
  • Identify the main problem to solve for the learners.
  • Define an endpoint: a vision for the learners at the end of the experience.
  • Create a list of learning goals.
  • Build a learning map around your list of learning goals.

The diagram shows four circles that intersect: “Identify the Problem,” “Define an Endpoint,” “Create a List of Learning Goals,” “Build a Learning Map,” and in the center of all of them is “Learn about the Learners.”

Figure 1.3
The learning design process illustrates the iterative nature of designing a learning experience, all while keeping learners’ needs at the center of the design thinking process.

As a content designer you should assume that online learners will not follow the path you have laid out. Therefore, you must make it possible for them to determine their own ways through the content. The best way to understand how your learners will navigate your material is to build feedback loops and spaces for reflection and evaluation into your course or learning experience. (See Chapter 10, “Giving Your Learners Feedback.”)

This learning design model is the foundation for launching an online course experience that keeps learner-centered needs in mind all the way through.

Takeaways

  • Build a full experience, not just content. A meaningful learning experience is developed based not just on what people are learning, but on how they are learning.
  • Design with the possibilities of engaging online in mind. Don’t try to resist the multiple ways that learners can engage online. Lean into the options and design for experiences that can be accessed in multiple ways.
  • Design with purpose. Know why your learners are engaging with your learning experience and remain aligned with that purpose as you begin your design process.
  • Keep diverse learner needs in mind. Embrace the fact that learners will have different needs for your courses and anticipate what those needs are as you move forward.
  • Learning design is an iterative process. There are steps you can follow from start-to-finish when engaging with the learning design process, but bear in mind that you may need to revisit and re-engage with those steps as you develop your learning experience.
