Research Operations at Scale
Christian Rohrer, VP of Design, Research & Enterprise Services at Capital One
Submitted by Natalie Hanson (Official Tripnoter)
From the Design Operations Summit website:
Research is an inherent part of design, and operationalizing how insights are generated and used is one of the more challenging aspects of design operations. Overcoming the lack of good tools, taming the complexity of research at scale, and navigating the world of different forms of inquiry are just a few of the obstacles to overcome. That said, we are starting to see a range of benefits from implementing research operations in modern design organizations.
Christian leads user research efforts and is the head of design for identity, security, and the Capital One wallet. He was at Intel Security and McAfee prior to that, and before that at Yahoo and eBay. He has a PhD in symbolic systems education and a BA in computer science.
Capital One is known as a credit card company, but it’s really a bank – one of the ten largest banks in the US. The primary goal is to change banking for good, both now and in the future. It’s really about technology and being digital. We have started to become a tech company that happens to do banking. To do that, we’re going to need human-centered design; the company realized it needed to be a corporate strategy. For us that means lots of research, with questions of efficiency, minimizing risk, and following regulations. Banks are among the most regulated entities there are, and that causes some problems – “research goo”, he calls it.
The #1 rule when people file for bankruptcy is that you can’t contact them. So how do you study them? They had to recruit people who were bankrupt but not customers.
They have established a certain set of roles. The team he joined two years ago had already put this in place. They have specialized roles in research operations, and research curricula:
The research curricula team creates courses that teach specialized methods and procedures. There are three major benefits:
- We can deal with risk and regulation in advance because personnel are trained. Risk Management at Capital One is great – they can apply it to any situation. There is a person that is assigned to a given team, and they are called Risk Sherpas because they carry the team through the risk process.
- It allows them to remove other “goo”, like dealing with recruiting.
- And then qualitative efforts can be run as events, like Wine & Dine. They are able to create events with both small and large group interactions, allowing a high degree of interaction with employees. These events become a great source of information for things like journey maps, and customers feel really heard.
Should you do this? It really depends on the context. That works well for their employee base and their customer type. Ultimately, Research Operations must address challenges from three key sources:
- The context of your business or industry
- Recent trends in research
- Universal challenges inherent in research
Business or industry context. Are you B2C, B2B, or IT / Internal? The implications for typical research operations will vary based on the context. The choices that users can make are quite different, e.g. consumer choice. Access to the customer or end user also varies significantly based on that context. You have to find champions who are willing to let you conduct research and show how it can deepen relationships with the client.
If you’re a start-up, you may be experimenting with ideas. In thinking about the diffusion of innovation curve, who is your user today? What about tomorrow’s users? They will be quite different. Are you getting direct access? Do you have monthly quantitative data about your users? How many of us have monthly qualitative access to users? We need to do both qualitative and quantitative research.
He wants to talk about recent trends and implications for research operations. Agile is here and you need to deal with it. His chart shows the growth of Agile in the past 10 years:
As an engineering methodology it’s here to stay in some form. The reason why is that no engineering manager will have a job description that says Waterfall – they won’t get the talent. That means UCD has to fit into Agile in some way. Agile may be implemented without critical thinking, and it can create challenges. UX roles aren’t well defined in Agile, and there isn’t just one or two ways this gets done well – research inside a sprint is tough. Sometimes there is a ‘need to feed’ and we’re scrambling to have work ready for developers. But more broadly, Agile creates a developer-centric view of the world. So, we have to reconcile processes. All of these are human-centered, just for very different humans! Agile is organized around the product owner, and in UX we are focused on design practices and users.
There are two approaches:
- Use Agile methodology, but with a separate design phase
- Use an equally attractive, similar, more compatible process like Lean
The team he was part of at Intel embraced Agile fully. They had a leader that used the Agile tools, and followed the exact same process for their UX work. They had a groomed backlog, they included research and design, story points, etc. That worked well for them, even with teams that were not co-located.
The second approach – an equally attractive alternative – is another option. A core Lean tenet is build-measure-learn, which is compatible with user-centered design. But he has found that people think ‘build’ means creating an MVP, and he has had to re-educate them. We can build ideas and concepts, first – and determine whether it’s viable, desirable. And we could even do a prototype that could be tested as well. It’s still data, still learning. Then we can create a product, measure usage. He prefers the term ‘study’ to ‘measure’ to ensure qual and quant are both acknowledged as valuable for learning.
The idea that research is owned by any one team is no longer realistic.
Tomer Sharon wrote a book called It’s Our Research about getting stakeholder buy-in through doing user research together. That means we need to teach people how to do it, not do it for them. How do we remove barriers to make that happen, while managing risk and other “research goo”?
Universal Challenges in Research Operations
Let’s acknowledge the five purposes of research:
- Understanding people and problems
- Inspire creativity and innovation
- Validate concepts and solutions
- Improve and inform designs
- Assess and measure experience
To accomplish all these things, you need different skills, and these are not always found in the same person. The unicorns are rare, and many people don’t want to do it all. You need to allow for both specialists and hybrid players on your team. We need T-shaped people, and I-shaped people too (people who can go deep and long in their job families). There is also an inverted L, for folks who want to take the orthogonal turn to the dark side of leadership. 🙂
The talent pool is small, and it’s growing slowly. Universities produce thousands of developers, but only about 75 universities produce UX professionals, globally. Demand and supply are out of balance – so it will always be tight. We need to optimize the use of our research and design teams.
What methods work best for designers? The broad answer is that they tend to be inspired by qualitative research, where they can interact with end-users. They can have access to the root cause of certain issues like strange workarounds. That will inspire them, they get something out of that. It taps into something in them – the qualitative validity of the insight.
What works best for executives? Quantitative. They are not looking down, they are looking up, and selling up. The PM sells to the executive, the executive to the CEO / Board, and the Board to the street. They can’t defend decisions with qualitative data. Consider that if you only have qualitative data, you shouldn’t put numbers out there, or you are teaching people the wrong thing – and they will discredit what you do. You have to focus on the why.
We have to acknowledge that metrics matter.
But these are not experience metrics, they are product performance metrics. If you measure product experience, you want to see what is going on between user and screen, not behind / with the product. That is what you want to be measuring.
He has created a combined model of user experience, which is his model plus Forrester. From outside in:
Look & feel invites them to engage; sound means clear and relevant language and content. For the user this is their perspective (not terms like IA). These are layers of appeal. The next layer is Ease, then Effectiveness, and finally there is an Emotional Outcome. It’s toughest to get to the stuff in the middle.
He uses the PURE Method for ease of use.
He co-authored it with Jeff Sauro, and they have been using it at Intuit and now at Capital One. It’s a measure of cognitive load. To do it you (1) declare your target user, (2) identify the 10-20 tasks that users must be able to do in this product, and (3) define the best way each task can be performed. If you have these three questions answered, you’re pretty far along. The method rates each step of a task, and a rubric defines the level of friction. Lower numbers are better; green is good.
This is an actual scorecard for one product at Alpha, Beta, and Release. No executive likes to see red, so this can drive action. And this is helpful because it shows systematic improvement over time. You can do it with competitors, too. Using this approach, two things happen – executives want to make change, or they want more robust data.
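To make the scoring mechanics concrete, here is a minimal sketch of how a PURE-style scorecard tally could work. This is an illustration based only on the description above (per-step friction ratings, a task total, lower is better, color-coded); the 1-3 rubric scale, the worst-step color rule, and the task names are assumptions, not details from the talk.

```python
# Hypothetical PURE-style scorecard tally (illustrative only).
# Each task is broken into steps along its best path; each step is
# rated against a friction rubric (assumed here: 1 = low cognitive
# load, 2 = moderate, 3 = high). Lower totals are better.

def pure_task_score(step_ratings):
    """Sum the per-step friction ratings for one task."""
    if any(r not in (1, 2, 3) for r in step_ratings):
        raise ValueError("each step must be rated 1, 2, or 3")
    return sum(step_ratings)

def rating_color(step_ratings):
    """Map a task's worst step to a stoplight color (assumed rule)."""
    worst = max(step_ratings)
    return {1: "green", 2: "yellow", 3: "red"}[worst]

# Illustrative tasks with per-step ratings (not real Capital One data).
tasks = {
    "Enroll in paperless statements": [1, 2, 1],
    "Dispute a transaction": [2, 3, 2, 1],
}

for name, ratings in tasks.items():
    print(f"{name}: score={pure_task_score(ratings)}, "
          f"color={rating_color(ratings)}")
```

Re-running a tally like this at Alpha, Beta, and Release is what makes the scorecard comparable over time: the rubric stays fixed while the ratings change.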
Knowledge Management is hard. Saving that information outside of our brains is difficult. A system would need to meet certain requirements. Polaris is a breakthrough in this space – you can see how WeWork does it in a workshop tomorrow. It’s not just about storing documents. But we want to find a system that stores how knowledge is created and used.
A question leads to research, and then you have a series of findings, and then you come to a conclusion. That leads you to a better idea, a hypothesis that fails, and you have to pivot. You test a new idea that gets tested and works well. Now you have to refine it, and eventually something is built and released.
The purpose of the research team at Capital One:
Opportunity lies in understanding the complexity of people. This requires deep expertise, interpretation, partnership, and empowerment. And ultimately, we help bring insights into design.