Research-for-hire — market research, UX research, and customer research — is in peril.
One of the major reasons is that they are positioned wrong. These functions are positioned to “help the business uncover customer needs,” or “deliver insights to the business.”
They promise businesses MORE data.
They promise businesses BETTER data.
They promise businesses FASTER access to data.
But they don’t promise the one thing that businesses really want. And it isn’t what you think.
The truth is, businesses don’t want data. They don’t even want insights. They want assurance that they are making the best possible decision. In short, they want to mitigate risk. They want an expert to diagnose a problem and offer a solution.
When you think about this, it really makes sense.
When you go to the doctor, do you want just the lab results, i.e. your pH is 7.45 (data)?
Do you want your doctor to tell you that your blood is alkaline (insights)?
Or do you want your doctor to tell you what is wrong and offer a treatment plan (advisor and advocate)?
This problematic positioning of research in a business context is the result of the word “research.”
What do you think of when you think of academic research?
A few years ago, I was building a research team and function for Twitter’s Developer and Enterprise Solutions (DES) group. In writing the Charter for the group, I submitted to the Head of Product for the group that a primary function of research was risk mitigation.
Within minutes, he responded, “I really want to discuss this more with you Ari. This is a surprising conclusion to me.” He continued, “I would have imagined something like, ‘It is a primary research function to deliver insights that ensure the solutions that engineering, product, and design bring to customers meet their needs.’” It was interesting to me to see such a visceral response to my definition of research, until I considered that his understanding of applied research was grounded in his understanding of academic research.
He saw research in terms of the data and information brought to the business for the respective business areas to make decisions. He viewed research as a support function; not as an advisory function. This distinction makes all the difference.
The reality is, people learn new things by making connections to things they already know. Consider a time when you were introduced to something new. How did you evaluate it? When my daughter first showed me a photo of the hoverboard she wanted, this is how I contextualized it:
- It is kind of like a Segway without a platform or handles.
- It is kind of like her skateboard, but automatic.
- It carries a charge, so it isn’t gas-powered like my lawnmower.
All of these like statements are a point of reference to understand what things are, by comparing them to other things I already know and recognize. This semantic categorization allows people to quickly make sense of the world. This is a good thing. However, once an understanding is set, it can be difficult to change.
The Head of Product at Twitter for DES didn’t see research as a risk mitigation tool (even though that’s exactly what he described); but rather saw it as a way to evaluate things after the fact. In his mind, research shouldn’t be advising; rather it should be about gathering facts and reporting to other business units, i.e. engineering, product, and design. He was thinking about research in academic terms.
Over time, this is also precisely what happened to other research-for-hire disciplines. When these functions were developed, they were conceived and described in the context of “academic research.” Market research is like academic research, but for businesses. UX research is like academic research, but it explores how people interact with technology.
In a lot of ways, this contextualization was smart. It implied rigor. It implied integrity. It implied intelligence. All good things. But when thinking about a business’s reasons “why” they want research in the first place, it also brought some baggage:
- Academic research takes a ton of time; not ideal when you need to make timely business decisions.
- Academic research isn’t always conducted in “real-world” conditions; this leaves the door open to the question, “does this kind of inquiry even work in the ‘real world?’”
- Academic research is not agile; once a study is started, it is like a slow-moving ship that is hard to turn; not ideal when you are halfway through a study and start seeing “signals” that your original thinking was off.
- Most importantly, academic research doesn’t prescribe a specific, measurable, actionable, and relevant course of action. I am certain that some academics that read this will bristle and say, “we have recommendations in academia,” but I would argue that those recommendations are broad and lack the focus needed to move a business forward, i.e. they aren’t specific, measurable, actionable, and relevant.
The reality is, in the real world, where there are innumerable variables, researchers look a lot less like academics, and a lot more like detectives.
Detectives follow clues to solve mysteries.
I have spent the better part of my life solving mysteries. First as a police detective. Then as an academic. Finally, in a career as an applied researcher, working with global brands that include Twitter and Panasonic.
Solving mysteries is essentially conducting research; albeit under a different name.
At their core, both solving mysteries and research are systematic inquiry. You want to know more about a particular topic, so you go through a process to increase your knowledge. Fundamentally, you want to know how something works or why something happened. The type of process depends on who you are, what you need to know, and what you are going to do with the information or data you uncover.
For example, if you want to know which Beatle was in the Traveling Wilburys, you might turn to Google (it’s George Harrison, by the way).
If you are a detective investigating a homicide, you will turn to forensics to examine the crime scene; interviews with suspects, victims, and witnesses to build a story; and other rigorous tools that help you meet the burden of proof “beyond a reasonable doubt.”
In academia, you conduct research to create new human knowledge, whether to uncover new facts or fundamental principles. It is generally based on observation or experimentation with the goal of advancing the body of knowledge in a particular field. At the culmination of a study, the results of this research are published in peer-reviewed journals, designed to apply rigorous standards and methodologies to preserve objectivity and ensure the credibility of conclusions.
From the quick Google search to criminal investigations to academic inquiries—all of these examples meet the definition of “research.”
The problem isn’t with the explicit definition of the word research, but how people have come to develop a contextualized understanding of the word, especially as it applies to solving real-world problems.
Throughout your formal education, you were conditioned to look up facts, write papers, and present information under the flag of research. It was research for the sake of research. No action was expected beyond presenting the data and information. If you followed the pattern, you were rewarded with a good grade. If you failed to follow the research “process,” failed to properly cite your sources, or otherwise failed to present what you learned in a neat little package, you were punished. This is operant conditioning 101 — punishments and rewards.
So naturally, any contextualized connection to research would align with that, i.e. gathering information and presenting it in a specific and prescribed manner.
The very nature of research within this context suggests stasis, represented by the litany of academic journals, books, and other material that sits on the shelves of libraries.
This is where we should make the applied research distinction.
The major distinction between research in an applied setting and academic research should be the conversion of insights into action. This is the bridge between conducting an investigation and prescribing or taking action on the findings. Applied research is less process-focused, and more outcome-focused. Detectives don’t research cases just to write a report. They investigate them to bring perpetrators to justice.
In business, knowing something isn’t enough; taking action is table stakes. This is where applied research comes in.
We are men of action; words don’t become us. — Inigo Montoya
Applied research borrows ideas and techniques from academic research to serve a specific real-world goal, such as reducing traffic congestion, improving the quality of hospital care, or finding ways to market a specific product. There should never be a time when applied researchers deliver a report only to have that data collect “cyber” dust, buried on some hard drive, ne’er to see the light of day again.
There are striking parallels between good applied research and good detective work.
- Both seek to establish the truth through a trail of evidence with the aim to arrive at a solution.
- Both are potentially high-stakes.
- Most importantly, both operate in a world of uncertainty and must revise existing predictions or theories given new or additional evidence.
Applied research is successful to the extent that it contributes to a stated business goal or objective. The process doesn’t end when you’ve gathered your data and written your report; it ends when you have converted insights into action. In other words, it doesn’t end when you have solved your mystery.
Insights professionals need to embrace a paradigm shift.
Applied researchers need to stop appropriating the term research. It doesn’t capture what we actually do. We are insights professionals.
- We don’t deliver data.
- We don’t deliver insights.
- We don’t do research for the sake of research.
Businesses need us for more. Our ability to critically analyze and synthesize data makes us uniquely qualified to advise and advocate on behalf of the business. When a business has questions about the direction of the data, who should they turn to? When a business has questions about the strength of the data, who should they turn to? When a business is confronted with competing data, who should help them vet it?
Businesses often think of research in terms of this qualitative-quantitative dichotomy. They ask questions like, when should we do focus groups vs. when should we do interviews vs. when should we do surveys? This is all wrong. They are thinking about tools and tactics; not about the problem they are trying to solve.
It is incumbent upon us to be the problem solvers.
We are sense-makers. We make sense of disjointed, disparate, and fractured customer, market, business, and technical data and deliver actionable recommendations to the business. We are experts in applying research methodologies to solving business problems. We are advisors who deliver counsel. We are advocates for the business and are fiercely protective of it.
If we don’t accept this new premise, we will be relegated to doing research in terms of the expectations of academic research, i.e. select a topic, conduct a literature review, write a paper, cite sources, present information, etc.
We will be relegated to taking orders and gathering data—leaving the sense-making to those who are less qualified. This hurts businesses. They will make bad decisions. They will analyze data improperly. If we truly believe insights functions are specialized and necessary, it is wrong for us to not own and advocate for the expertise. It is our ethical responsibility.
Remember, our stakeholders have enough formal education to have been exposed to the “traditional” research model. They are anchoring their expectations of what research is — and isn’t — in that experience.
Applied research can be very powerful if applied thoughtfully. It mitigates risk by saving time, effort, energy, and money by offering businesses assurance that their decisions are grounded in real-world “facts.” When done correctly, it helps businesses determine whether they’re solving the right problem, uncover competitive risks and advantages, identify small changes with huge potential impact, and surface the blind spots and biases that are preventing them from achieving their business objectives.
Owning insights is in a name—and a mission.
Using the rationale above, I built an insight function at Panasonic that brought together market research, CX, UX research, data science, and quantitative research. The division is called Analytics, Research, and Insights. Our Charter is to enable the business to make more-informed, and therefore less risky, business decisions while ensuring our customers remain at the center of our decision-making process. We execute on this by representing the Voice of the Customer — both internal and external — and serving as the single source of truth for customer, business, technical, and market data for Cirrus.
We own insights for the business.
As an insights professional, you should too.