#145 - Kick-Start Creativity Using Desk Research with Victoria Sakal of Wonder

Victoria Sakal [00:00:00]:
Yeah, you raised a good point, though. With primary research, I think another super essential component of it, which will always be and has always been the moat for primary, is that it boils down to questions that haven't been asked already. Whereas with secondary, the whole premise is that someone has asked the question, maybe not gotten a direct answer, but maybe it can be manipulated to answer your question. But when you think about these forward-thinking innovations or launches or features, if somebody has already reacted to or answered it, it's actually not innovative. It might be helpful for a certain use case, but when it comes to anticipating or ideating, AI at even its best is going to average together what is known in the corpus of information that exists, and it's going to have a hard time wrapping itself around "we know a recession is coming," or "I'm aging and the dynamics of aging are changing." Whatever that is, a human will have that context, and I'm skeptical AI ever will.

Erin May [00:00:58]:
Hey, this is Erin May.

Carol Guest [00:00:59]:
And this is Carol Guest.

Erin May [00:01:01]:
And this is Awkward Silences. Awkward Silences is brought to you by User Interviews, the fastest way to recruit targeted, high-quality participants for any kind of research. So fun to have Victoria from Wonder on today to talk about desk research. Spoiler alert: Googling is desk research. So we all do it all the time, but she has some great pointers on doing better desk research and folding it into your triangulation with primary and other forms of research. And we get into AI and speculation about the future, and it's just a really fun episode about things we do every single day and how to do them better as part of our research stack. So hope you enjoy.

Erin May [00:01:52]:
Hello, everybody, and welcome back to Awkward Silences. Today we're here with Victoria Sakal. She's the head of growth at Wonder. And today we're going to be talking about desk research, which is something that we haven't really talked about for a whole episode. So I think this will be really interesting both to folks who do a lot of desk research and to folks who do very little or none, or don't even know what it is. So really excited to have you here talking about desk research, Victoria.

Victoria Sakal [00:02:17]:
Yeah, excited to be here. It's kind of the dark horse of the research world, so I'm excited to shed some light on it.

Erin May [00:02:22]:
Ah, yeah.

Erin May [00:02:23]:
Good stuff. Good stuff. All right, so let's start from the beginning. Let's not overcomplicate things. What is desk research?

Victoria Sakal [00:02:29]:
So desk research, at its core, is mining anything that's publicly available. It's really as simple as that. It's sometimes called secondary research because it's not about talking directly to your exact user, which is primary research. But that is what desk research is.

Erin May [00:02:45]:
So are secondary research and desk research synonymous, or…?

Victoria Sakal [00:02:49]:
Yeah, exactly. And maybe to paint the picture of the broader research landscape: here you have your primary research, which is going, either through quantitative surveys or qualitative conversations, to your actual user base. Sometimes expert networks can be a bit of a hairy zone in terms of whether it's actually primary or secondary, because it's not usually your buyer you're talking to, but primary is, you're hearing it straight from the horse's mouth. You have this other tier of sort of behavioral or social insight. So they're not exactly answering a question or talking about you to answer exactly what you're looking to learn from them, but you're watching what they're doing, or you're monitoring what they're saying on social platforms or otherwise. And then you have secondary, or desk, which is sort of this foundation underneath everything. It could be mining different data sets, it could be reading other people's research reports.

Victoria Sakal [00:03:43]:
But the idea is it exists somewhere, whether in your organization or outside on the Internet. In the old days it was in books, but it exists.

Victoria Sakal [00:03:51]:
And you can kind of use that as a foundation for most of your research. So that's where it tends to get interesting, in that some people are unfamiliar with the language of desk or secondary, but most people do some version of it. If you've used Google for, forget about which Thai restaurant to go to for dinner, but also, you know, what's the competitive landscape, or what's the latest on trend XYZ, you've done some form of desk research. It actually ends up being a lot more of professionals' time than we sometimes care to admit. But we talk about the formal, costly, six-month push for this many interviews or survey respondents, and we don't tend to talk about the due diligence we do that forms the foundation of decisions or research efforts or strategy plans and the like.

Erin May [00:04:33]:
Yeah, awesome.

Erin May [00:04:34]:
You make a great point that sort of one person's primary research can be another person's secondary or desk research. So mining research that was done maybe months or even years ago in your organization, to see if it's relevant now, can actually be a great form of desk research.

Victoria Sakal [00:04:48]:
Yeah, absolutely.

Victoria Sakal [00:04:49]:
And I'm sure we'll get into some of the trade-offs when you think about these different types of research. With desk research, you tend to find it's not going to be perfectly relevant. So, for example, you might have a question around a new market that you're thinking of entering with an existing product. You might have a lot of historical context around why that product was introduced in North America, but not in this new market. So there's that relevance factor, and then sometimes a recency factor as well, especially with the speed and pace of research being conducted and published today. There are trade-offs when you think about the margin of error and the level of fidelity that you need with different types of research you're using, but also the different stage you're at with the decisions you're making or the plans you're setting. So usually it's fine. You can get by and get knowledgeable enough to be dangerous with desk research, but there's always that nuance of: it's not exactly the answer you need, most of the time.

Victoria Sakal [00:05:41]:
Sometimes it is, but most of the time you have to kind of layer on your own follow-up to that finding or data point or whatever it is.

Erin May [00:05:48]:
Right? Yeah.

Erin May [00:05:48]:
So I'm gathering it might be a good method to kind of put in your stack alongside some other methods if you want to triangulate or, you know, find a fuller picture.

Victoria Sakal [00:05:57]:
So it tends to be pretty powerful when you need sort of a state of the union: you want to take stock of any information that might have been collected, get a first dip into the state of consumers or the category or different trends. We like to think about it as: you have types of information that you have and types of questions that you need answered. You sort of have your known knowns, the things you feel pretty good about, which you don't necessarily need to spend more time on, although desk research can also be helpful for corroborating and providing that last level of confidence that you've got your knowns down. You've got your known unknowns. These are the questions that you're probably about to spend more time or money or budget, whatever, investing in getting answered. And then there are these unknown knowns.

Victoria Sakal [00:06:39]:
So you might have this information in your organization, and that tends to be mining behavioral data or, you know, spend or category data, that type of thing. And then your unknown unknowns. That last piece is also pretty powerful for desk research, where you may need to triangulate some things and augment any of your known information with some of these additional pieces, or kind of shine spotlights on the gaps in your knowledge. We spend a lot of time thinking about how we help people surface what they don't know. They think they're going into a research project, they think they have one question to get answered, which might still be true, but there might be all these different elements of it that you also want to surface. And so we tend to think of it as the tip of the spear. If you do it the right way and have the right tools, which obviously we're biased, but we're one of them, it can be pretty quick and easy.

Victoria Sakal [00:07:30]:
It doesn't have to be a big slow delay to your effort or your work plan. You can get up to speed on what might exist already, get some of those proxies for information that you might be looking to double down on, and then you can go into a more formal research effort with a lot more context on the problem you're trying to solve, but also a better sense of what you're actually trying to answer and which knowledge gaps you're trying to close.

Erin May [00:07:52]:
Yeah, that's a great point.

Erin May [00:07:53]:
And so, like you mentioned, people are doing desk research, whether they call it that or not, all the time in their daily work lives. Do you recommend jumping in and maybe doing a little informal desk research before even putting together a research plan? Or is it a piece in your existing plan, where you want to make sure you can contextualize how it's going to fit into a larger plan?

Victoria Sakal [00:08:13]:
Yeah, I think it's a both situation. The first thing I'll always guide clients with is: think about your constraints. So if you're time-strapped or budget-constrained and you just need to get quick answers, sometimes that means jump right into the survey. Sometimes that means actually spend the extra couple hours doing the due diligence to get something in front of your CEO, or whoever it is, faster. There's also an element of what level of accuracy and precision you need in the information that you're sourcing to make a decision or recommend an approach, whatever it is. Again, desk research might not be the most precise, but it will be quick, and it can be close enough in some cases. Other times, primary research and having those direct quotes about what people are experiencing or needing can be what you need. But there's typically an element of: is it helpful to take stock of what you already might know? Is there an element of: we need something that's directly tied to our audience or specific to our business, to take those answers back to whatever your stakeholders might be looking for, to your own team, that type of thing.

Erin May [00:09:16]:
Great, great. Cool.

Erin May [00:09:17]:
So let's talk about some more specifics. So what are some tools that you might use to answer different kinds of questions to do different kinds of desk research?

Victoria Sakal [00:09:26]:
Yeah, so typically the landscape breaks down into Google and search engines, which a lot of us are intimately familiar with. And now lately there are a lot of LLMs. So you can turn to the generic LLMs like GPT, Bard (now Gemini), Claude, and the like. There's also a whole category of AI-powered research tools. We've kind of pulled together and mapped that landscape to understand: are they uniquely AI tools? Are they tools that used to be for something else that have sort of layered in different AI and LLM components? Some of them are purpose-built for, for example, mining PDFs. We talked about other research that might have been published online; if you've got a PDF of a report, you can run it through and get the full download. In other cases they're free, and you're getting a trade-off in quality there.

Victoria Sakal [00:10:12]:
In other cases you're going to pay for a subscription but get more robust citations or roundups of information. But really it boils down to Google. You can sometimes pay an outsourced freelancer or contractor to do this research, but they're ultimately going to use Google or LLMs. And, you know, LLMs are kind of the current state. So the last subcategory of that would be a company like Wonder, where we're combining LLMs and the AI horsepower with humans who actually use all these tools but are trained up on not only Boolean search and all the search-engine skill sets, but also plugins, prompt engineering, and all the skills that are super important for getting the most out of LLMs. That is tending to be quite a time suck for a lot of professionals who are trying to stay on top of the latest trends and the tools and the technology, get the answers that they need, and do their day job. So that's kind of the landscape. It's actually pretty simple.

Erin May [00:11:06]:
Awesome. Yeah, it's simple at a high level. And then once you get into it: so you're starting with some questions. To use your two-by-two, you've got some known unknowns that you want to dig into, and I can imagine you sort of list them out. You have an outline of questions, right? And then you're going to either Google it, LLM it, or work with a service like Wonder, and get some answers to these questions. This is where there's no shortage of information in the world, at your desk or otherwise. What are some methods one might use to dig through this information, synthesize it, fact-check it, make sure it's good information, and then get it into a right, synthesized form that we can actually use to distill the answers to some of these big questions?

Victoria Sakal [00:11:52]:
Yeah, we actually think about it in terms of a knowledge assembly line, if you will. The first stage, which tends to take people the most time right now, is rounding up all of this information. I'm sure we can all resonate with spending hours either in the doldrums of Google or on page 14. You end up with a document with a bunch of links. You need to then click into the links, read the things. You go down a rabbit hole of more links. Sometimes you need to refine your search and tighten it, whether that's the geography or the nature of the question you're asking. So this first step is collecting and accruing all of the types of responses or links or resources that you might have at your disposal.

Victoria Sakal [00:12:30]:
Of course, I'm not mentioning the upstream: there's refining your thinking around what's the question I'm trying to answer, and what are my objectives. That kind of goes without saying. But once you've got this long working raw-input list, to your point, there's typically an element of verification. That can include: is this actually a link? As we'll see with LLMs, you'll sometimes get hallucinated URLs or totally made-up bullets that you need to go confirm: is this even something that happened, or something that someone said? On the Google side you sometimes need to verify: are the links that I'm getting actually something I would trust? Would I forward this on to my CEO? Is it a one-off blogger who had an n-of-one experience and is claiming that that's the future of whatever the topic is? So there's the credibility and the verification in that sense. From there, there's usually a prioritization or an organization, to say this fits in theme one, or this is finding two. So doing all that thinking around how you organize the information, but all tying back to your original objectives, which over time you might also be refining: there's part one, part two, part three as you go down these rabbit holes. There are some findings that you're surfacing, and then there's the element of, I didn't fully get this answer, so how should I go about doing that? Is there a triangulation, to your point earlier? Might I intersect a few different data points or connect a few of the different findings to find something close enough? Do I hold on this and address it in quant or other forms of primary research? And then there's the whole element of storytelling, and this is a huge part of it that becomes a challenge when you start with these dumps of links. You've got this big working document and you probably have to share it with someone, even if it's just saving it for your own findings.
When I'm collecting links on things, I tend to save them in messy docs, but I'm never comfortable sharing my own working documents with my stakeholders.

Victoria Sakal [00:14:21]:
And so there are elements of: is this a written report? Is this a PowerPoint? Is it something more robust and dynamic? When you think about, say, a market map, you might actually want something that you can sort and filter by. That piece can tend to be pretty heavy and intense as well. So again, we think about your knowledge assembly line going from your initial question, to collecting all your raw inputs, to organizing into working themes or topics, and then figuring out how you share out and present. And then you get into the "now I analyze": what do I do with this information, and what are the recommendations I'm basing off of it?

Erin May [00:14:55]:
Right. Is it useful to begin with an artifact, a medium, an output you're going to create in mind, or should the research, as it comes together, guide the right format to summarize this information?

Victoria Sakal [00:15:09]:
Yeah, it's a great question. We think about it in a couple different ways. The first part is that the answer is both: you want a sense of something that you're trying to home in on, while being open to pulling the threads. And the second piece is the types of questions that you ask. We talk a lot about this. There are roughly five or so types of questions, and we see this vary based on who the question asker is, both in terms of their function and their seniority. Sometimes you want early signals and guidance around a topic. That would be more of an exploring question: you know, tell me the state of hot sauces in Italian food.

Victoria Sakal [00:15:45]:
Just give me any of the things that you can find. Sometimes you want to chisel, and you want to understand all the different angles that might relate to a topic. So, for example, I just ran research on research repositories as a space. What are all the different ways companies are approaching this? Because it's not just one; there are a few different types of solutions, a few different subcategories. So I want to paint a full picture and sort of chisel away at my original topic and get to something more finite and discretely clear. Anyway, there are a few different ways you can go about this, but usually it depends on: are you trying to converge and narrow down your thinking and your understanding of a topic, or are you still at the divergent phase, where you're looking to explore and widen your aperture to capture all the different angles that you might need to be thinking about at this stage?

Erin May [00:16:32]:
Yeah, fantastic.

Erin May [00:16:34]:
I definitely want to get into how different teams and roles, marketing, product, business leaders, et cetera, can use desk research, and use it alongside perhaps primary research. That might be a good segue to talk about some of the research you're doing about research, and what you're seeing in it. So this will make sense when you start talking. It's very meta. Very, very meta.

Victoria Sakal [00:16:55]:
Yeah, it's been quite an exercise. But the idea is that, again, companies go from having some sort of business objective, growth, market share, entering a new market, whatever it might be, and if you work all the way upstream, there's some sort of question that needs to get answered, some sort of work plan that needs to get laid out. So the first thing I'll say is, as we think about that pipeline, so to speak: you start with ideating. We talked about exploring and widening your aperture. Then you narrow it down and you prioritize what you might want to focus on. You might validate before you prioritize, or you might validate after you prioritize, but "is there a there there?" would be something you'd spend time on. And then you develop and you test and you roll out, and then you iterate from there to see if the impact that you're trying to achieve is achieved.

Victoria Sakal [00:17:41]:
So when you think about that whole pipeline, what we were wondering was: what are the questions that different companies and different stakeholders are asking? Who are the different stakeholders responsible for different stages of that pipeline, and how do the information needs they have vary, not only across the pipeline, but within a given pillar of it? When you think about setting your strategy, you might start with a very specific decree from the board or the CEO. You might start with: what do our consumers need? What is the pain point that we might address, or what is a new flavor of Doritos that we think we can introduce to solve a problem that no consumer ever thought they had? But by the time you hit the end of that strategy stage, you might have narrowed it down, and there's some sort of, even if it's not explicit, stage gate that'll say: okay, we've got our strategy, we know what we're trying to achieve, now let's go to ideation. And then the same thing happens as you move down that pipeline. So our exercise was to talk to a number of different stakeholders, everything you just mentioned, from product and user experience, to downstream marketing, to upstream strategy teams, and a number of different stakeholders in between, to understand what their remit is, what questions they're asking, what tools they're using, and what their pain points are with those tools.

Victoria Sakal [00:18:57]:
And then to your point, how are they intersecting things like primary, secondary and all these other different types of data and research to get the answers that they need to those questions?

Erin May [00:19:05]:
Yeah, great. And what did you learn?

Victoria Sakal [00:19:07]:
Yeah, so the first thing I'll say is that every company, of course, has its own variation of the pipeline and the stages it uses, but they all do tend to map to that continuum. The second piece is the remit of different stakeholders. Again, I mentioned marketing, and product marketing will tend to be further downstream. It's interesting when you intersect that with the maturity of a company, where great marketing and great product marketing will actually tend to be involved pretty far upstream as well. And it's similar with user research: it's not just plugged in at one stage. It sort of has halos, where user research will come back post-launch, and user researchers will be part of the conversations that go into development, not just "tag, you're in, see what they said about this prototype." So thinking about that ecosystem approach is really interesting. The other piece is the level of confidence that companies are looking for today. First of all, we do see that every company is using internal data.

Victoria Sakal [00:20:03]:
If you're not mining it and don't have that infrastructure, anybody who talked about it expressed pain with how difficult it is to get that information. And it wasn't in an "it's fine, we moved on without it" way; it was a "this is a real weak spot for us" way. The second piece is increasingly trying to move towards agility: where can we get something that gives us a grain of insight around, again, the strategy, the market, the category, the consumer, that helps us get tighter around what we need answers to, or what we're not getting from existing research, whether it's internal or desk research externally, to help inform how we can start moving forward quicker, instead of always needing to check off all the boxes and be done before we move on to the next stage. The last thing is, again, as I mentioned at the top of the conversation, it's sort of this dark horse where everybody is spending a ton of time on Google. I felt this as a consultant who had to get smart fast on any category or client we were kicking off a project with. But you have everyone from, of course, mid-level team members to very senior executives spending either part of their day job or nights and weekends trawling through Google to get smart on different topics. So it's interesting when you think about primary research.

Victoria Sakal [00:21:16]:
Conversations have always been part of the conversation, where you've got to have a budget and a certain time to execute, and new solutions like User Interviews, like any number of these new tools, are making it quicker and easier to do primary research. But the secondary research space hasn't really been tackled. And now we've got LLMs, but they've got their own challenges.

Erin May [00:21:36]:
Awkward interruption! This episode of Awkward Silences, like every episode of Awkward Silences, is brought to you by User Interviews.

Carol Guest [00:21:43]:
We know that finding participants for research is hard. User Interviews is the fastest way to recruit targeted, high-quality participants for any kind of research. We're not a testing platform. Instead, we're fully focused on making sure you can get just-in-time insights for your product development, business strategy, marketing, and more.

Erin May [00:22:01]:
Go to userinterviews.com/awkward to get your first three participants free. Yeah, for sure. Like the hallucinations you mentioned, and what do you call it? Prompt engineering. Right. Just being good at using it. We've all had decades to learn how to use Google, and there's probably still many people out there who don't know Google Scholar exists. Right. You know, so.

Erin May [00:22:22]:
There's a lot of education to do in terms of using these new tools well, but also fact-checking them and making sure the information that you're getting is good information.

Erin May [00:22:32]:
Yeah, absolutely.

Victoria Sakal [00:22:33]:
I mean, it's been interesting to see that people are feeling optimistic about AI. There's definitely caution, there's some skepticism, but there's generally optimism around using tools like AI. But in any of the conversations I was having, it was sort of like, it can help me do things a little bit faster, but that trade-off then comes into exactly what you're saying: the validation, the credibility. And so we get into this problem where people are spending lots of time either on Google or in the LLMs, and yet, when we have the surveys, when we have the conversations, they're all still feeling like: I need to be doing more of the landscape assessments and the consumer insights and some of these higher-level but pretty foundational things. We go back to the four P's and the five C's; people still need ongoing insight into those things, and they're too one-off right now, because it's such a time suck to do the Googling and the LLM-ing, whatever the verb is there, to get those answers right.

Erin May [00:23:27]:
Any trends in the kinds of questions people are asking that you're seeing come through?

Victoria Sakal [00:23:31]:
So there are two levels to this. The types of questions are pretty consistent: landscape, competitors, consumer dynamics. There are a lot of questions relating to sizing, to ROI and SWOT. And then the second piece would be literally what people are worried about: of course, a lot on AI, how do we use it, what should we be thinking about it, but also what are our competitors doing, what's happening in this space. Beyond that, it's been actually fascinating. We have everyone from banks and financial services to a number of consulting firms, even CPG. And the minutiae of the types of questions and the very specific things they're asking suggest things you might typically go to primary respondents for. But actually there is enough out there that they're coming back again and again for good insights on some of these chemicals and people's perceptions of this, that, and the other thing. But I can imagine, and again, as I experienced in our own research effort, you can get smart pretty quickly on what's out there, and then the questions that you layer onto it.

Victoria Sakal [00:24:31]:
With primary research, that then 10x-es the level of insight. Versus just starting at zero and going to ten, you can take it even further, because you've started at ten with that foundation.

Erin May [00:24:43]:
Yeah, I'll ask you a kind of speculative question, but maybe one rooted in observation. With AI, and with this potentially exponential speeding-up of gathering secondary information that's happening, where do you see primary research evolving alongside that? Where is it most useful, when you can get so much breadth of information otherwise? How do you see the value of primary research evolving?

Victoria Sakal [00:25:14]:
Yeah, it's a great question, because going back to some of the, I won't call them downsides or flags, but just the nuances of secondary research: the nuance around the why, first of all, is always going to be challenging. You can get a certain level of that from somebody else's published report, but they probably weren't asking it with the same context in mind as you are. And then there's the nuance around the exact type of person and your why, why you are asking the question, whether that's reaction to a user experience or the flavor of a chip or whatever it might be. The perfect answer is probably not going to exist, but a pretty close answer will. So when you think about getting to that perfect answer and having much more confidence, whether it's to ship a version of a prototype or otherwise, that feels like it's only more important from a primary perspective. It's also where you get into this interesting area. I'm interested if you guys have had the conversation or seen it come up, but synthetic respondents, to me, feel like desk research jammed into a primary person's body.

Victoria Sakal [00:26:17]:
Yeah, I don't know where that will head, but I'm not optimistic about that.

Erin May [00:26:20]:
Yeah, well, we have a healthy skepticism of that, of course. Right. You always want to be on the cutting edge of technology, but we're hoping that humans remain an important part of research, on the participant side for sure. But we're very interested to see. We've also seen AI on the other side, on the moderator side. Again, we're hopeful that moderators have some value in reading body language and adjusting to responses, and, you know, that human-to-human connection. But are there cases where maybe it's not necessary? Probably.

Erin May [00:26:52]:
So it's figuring that out: what's that depth of why, that depth of information, that I need? And how do we use primary, one-on-one research to really validate and get to that emotional core of things, which we can't do through some of these cheaper, faster, more scalable methods?

Victoria Sakal [00:27:08]:
Yeah, you raised a good point, though. With primary research, I think another super essential component, which always has been and will be the moat for primary, is that it boils down to questions that haven't been asked already. With secondary, the whole premise is that someone has asked the question, maybe not gotten a direct answer, but maybe it can be manipulated to answer your question. But when you think about these forward-thinking innovations or launches or features, if somebody has already reacted to or answered it, it's actually not innovative. It might be helpful for a certain use case, but when it comes to anticipating or ideating, even the best bot is going to average together what is known in the corpus of information that exists, and it's going to have a hard time wrapping itself around "we know a recession is coming," or "I'm aging and the dynamics of aging are changing," whatever that is. A human will have that context. I'm skeptical AI ever will.

Erin May [00:28:04]:
Yeah, no, it's a great point. Like you talked about with the Doritos before: you could learn a lot about America's favorite flavor of Doritos. Is it Nacho Cheese or Cool Ranch? I don't know. I think Cool Ranch is the best. But while you can learn a lot about that, it won't tell you what new flavor to create.

Erin May [00:28:18]:
You can see gaps in the market. But yeah, that's a great point, that innovating is made more possible by what you can actually ascertain from that human connection. And there's also the PII side, right? Someone might have asked these questions before, but the answers aren't going to be broadly available to the world, for security and privacy reasons as well.

Victoria Sakal [00:28:38]:
I'll be interested to see how things might unfold with publishing research results, knowing that AI is getting even better at mining what's publicly available. Obviously a lot of these things were published, and there's the whole ungated conversation as well; they were published with the intent of being consumed publicly. But we'll see whether people start being less liberal in sharing their research results, because AI can pick them up and surface that insight and answer for anyone and everyone.

Erin May [00:29:05]:
Yeah. You mentioned the pipeline of research and converging and diverging. Are you seeing folks, most of your clients, for example, using this kind of research upstream at that early level, or is it useful everywhere in the decision-making process?

Victoria Sakal [00:29:22]:
It does tend to come in the early half of that pipeline. In the later steps, things like development and testing, there's obviously a different type of research that tends to be prominent. But I'll also say that within each of those steps, it's the top half of the step where you tend to diverge and collect your information, and then the bottom half is more about refinement, thinking, and synthesizing, before heading into the next step, where you would use this type of research again.

Erin May [00:29:47]:
Yeah. Cool.

Erin May [00:29:48]:
What else should we cover about desk research that I didn't ask you?

Victoria Sakal [00:29:52]:
Yeah, I think maybe even taking a step backwards, because something we talk about and think about a lot in our company is this whole concept of curiosity and asking questions. The dynamic that Google has created has created a lot of access to information, but in a professional context, again, we talked about how much time it takes to find these answers, and we have day jobs that are not typically "go sit on Google and explore things and get answers." We've sort of combined that with our natural tendency, as adults who've grown over time and needed to have answers to questions, to stop asking as many questions. We're, of course, trying to get answers to people quicker, but also to open up more space so that people can ask more questions and explore more ideas. So this element of curiosity is as helpful for just having the time and headspace to explore different threads and see what comes of them. But also, when you think about where innovation comes from, whether it's at a VC or in a company or otherwise, the 80/20 rule is how businesses grow. Right.

Victoria Sakal [00:30:52]:
The more questions you can ask, and the more opportunities or threads or bets you might be able to explore, the better the chance that one of them, or more of them, 20% of them, are going to really be the future of your business. You combine that with this idea of how you get incrementally more confident in the information you're collecting: do a little bit of desk research and get that much tighter. We think a lot about the 1 in 60 rule in aviation: a one-degree change in your direction at your starting point, over the course of 60 miles, changes your ending point by about a mile. That could be great, or that can be really bad if you're really off course. So spend that extra second, or, in our case, we've gotten it down to a matter of hours instead of spending a week of due diligence on a topic. Spend the extra time, ask the extra questions, so you're not only expanding your portfolio of opportunities, but building that muscle in yourself and in your organization, and ideally tightening your precision in landing in the right place.

Erin May [00:31:49]:
Yeah, you bring up an interesting point. You talked about the known unknowns, but then there are the unknown unknowns as well. Are there habits or a cadence of asking, I don't know, just sort of out-there questions that can be helpful for keeping that curiosity going? Right. You mentioned competitive research; you want to keep an eye on your competitors. There are these things you probably think to look at on an ongoing basis, but is there a way to nurture that habit of staying curious about things you might not even know you should be curious about, to keep that edge?

Victoria Sakal [00:32:20]:
I think part of it will come down to that, again: professionals are strapped for time and don't even have the headspace. We always talk about freeing up time on your calendar to just think and strategize, and it never really happens; it's hard to do. But there's an element of, how do you create that space to be thinking and wondering and not jumping to the answer? So there are plenty of ways to build your curiosity and question-asking muscle. On the other side, and this is what's unique about the AI that's built into our platform, it's actually a divergent thinking model. We've done this for a decade now and have an ontology of all the types of questions that intersect on the back end, to say: if you're doing a competitive landscape, here are ten things you might want to think about.

Victoria Sakal [00:32:59]:
Or once you've done the competitive landscape, here are five different directions you might want to go next. So when you're talking with our chatbot, saying, "I want a competitive landscape of all the research repositories out there," it's going to come back and ask thought-provoking questions: are you interested in this, are you interested in that? We tend to find that about 95% of the people who come through start with one question and end either with another or with a 10x version of the question, because just practicing the "have you thought about this?" or the "yes, and" that improv emphasizes pretty heavily can be pretty powerful for expanding thinking that way.

Erin May [00:33:34]:
Yeah, be careful with those questions; they tend to beget more questions, to your point. I think it'll also be interesting, and this is super speculative, but with AI, and again the speed at which information is passing now, more questions and answers can obviously be processed than before, but the brain's capacity to retain them has not grown as fast. So I do think more and more of our intelligence is sitting outside of our brains, but hopefully easily accessible on demand. And I think that's interesting, right, how that's going to change how we work and how we think. We can't double or triple the capacity of our brains, but can we answer questions quickly when we need to?

Erin May [00:34:19]:
I think so, yeah.

Victoria Sakal [00:34:20]:
That's where we started exploring this research repository space, if nothing else, to help our clients figure out how to mine this internal corpus, the brain of the organization. There are all these new ways to get information, they're becoming more accessible, and we have AI that we can unleash on this internal body of everything from product data to social listening findings to primary or secondary research findings. But there's getting all of this in one place and avoiding duplicated effort, and then there's connecting the dots. We talk a lot about collecting the dots, so we talked about that big Google Sheet of all of your links, and then there's the "so what" that comes from it: the connecting of the dots and figuring out the implications. That's where tools like research repositories will be interesting, not just as dumping grounds for all of these different documents and files, but actually helping to mine insights, draw connections, and be predictive: it looks like this category is suddenly coming to your website a lot, so either monitor it, or intersect that with social listening commentary around what this category is talking about and reroute your messaging because of it.

Victoria Sakal [00:35:24]:
But figuring out how to stay on top of all these gold mines of insight, and surfacing actually actionable information from them, will be, I think, the next frontier here.

Erin May [00:35:33]:
Yeah, agreed, agreed. All right, let's do some rapid fire. What is your favorite research method, maybe aside from secondary research, although go ahead, or interview question? Something you like to ask the bots or the people?

Victoria Sakal [00:35:46]:
Yeah, probably both, related, although I will avoid talking to the research bots and respondent bots as long as I can. Thinking about how you can be as agile as possible with research is always fun to me. I started my career at Kantar, where there were very robust and talented research efforts in place, but they were slow and they were expensive. And working with a lot of companies on the West Coast, it became clear that we have to think about getting answers in creative ways. So I tend to explore with candidates: if you had no budget but the same time constraints, how would you find answers to all these different types of questions, whether it's competitor, category, or consumer insights? Understanding the different tools, (a) that they might be aware of that I'm not, but (b) that they would put into their research stack and use a little more iteratively and agilely, is always an interesting exercise.

Erin May [00:36:36]:
Yeah, awesome.

Victoria Sakal [00:36:38]:
Love that.

Erin May [00:36:39]:
Some resources that you like for information on research or doing research, tools, things that you recommend to others or use a lot yourself?

Victoria Sakal [00:36:46]:
Yeah, perhaps intuitive given I ended up in desk research, but I do spend a lot of time reading different newsletters and podcasts and kind of connecting the dots between what Lenny talks about and what Scott talks about in his newsletters, or Seth Godin. For me, it's about crossing a lot of disciplines, as much to understand different approaches and how people are thinking about their business or product strategy as it is to get a good level of empathy for what different organizations and functions are working through, even within my own organization. So happy to share additional resources, but Lenny's podcast is great on the product and strategy and innovation side, and there are a number of different podcasts and newsletters that I tend to scan over the course of a given week. The other piece is a bit more interdisciplinary: when you think about asking questions, or curiosity, or how you explore innovation, there's a lot that comes from science and experimentation strategy, and a lot that comes from academia. So any of those intersections of actual ways to build a business and actual ways to think about asking questions and building the curiosity muscle are where I spend a lot of my time.

Erin May [00:37:56]:
Awesome.

Erin May [00:37:57]:
Lenny, and Scott Galloway of Pivot? A different one?

Victoria Sakal [00:38:00]:
Scott Clary, actually.

Erin May [00:38:02]:
Oh, Scott Clary.

Erin May [00:38:03]:
Yeah, we'll have to get those links and put them in the show notes for sure. And where can folks follow you? Are people still tweeting? Are you threading? Are you LinkedIn-ing? Where are you?

Victoria Sakal [00:38:12]:
I am on LinkedIn. I have not come back to X since some of the heyday there.

Erin May [00:38:20]:
Awesome. Well, thank you so much for joining us today. This has been really interesting. I learned a lot, and I'm sure our listeners did as well.

Victoria Sakal [00:38:26]:
I'm happy to hear it. Thanks and have a wonderful rest of your day.

Erin May [00:38:31]:
Yeah, you too.

Thanks for listening to Awkward Silences, brought to you by User Interviews. Theme music by Fragile Gang.

Hi there, Awkward Silences listener. Thanks for listening.

Victoria Sakal [00:38:57]:
If you like what you heard, we

Erin May [00:38:59]:
always appreciate a rating or review on your podcast app of choice.

Carol Guest [00:39:03]:
We'd also love to hear from you with feedback, guest topics, or ideas so that we can improve your podcast listening experience. We're running a quick survey so you can share your thoughts on what you like about the show, which episodes you like best, which subjects you'd like to hear more about, which stuff you're sick of, and more. It's just about you, the fans who have kept us on the air for the past five years.

Erin May [00:39:22]:
We know surveys usually suck. See episode 21 with Erika Hall for more on that. But this one's quick and useful, we promise. Thanks for helping us make this the best podcast it can be. You can find the survey link in the episode description of any episode, or head on over to userinterviews.com awkward survey.

Creators and Guests

Erin May
Senior VP of Marketing & Growth at User Interviews