#32 - 7 Reasons Not To Do User Research with Michele Ronsen
Michele 00:00
If the stakeholders are involved from the onset, they may not be believers to begin with, but if I can convert them, and if my stakeholders are active and engaged, I'm, like, 90% sure that I'm going to be able to move from insights into action and get that team to act and address the learnings along the way. And that's how I measure my success. It's not just in providing the information that they're looking to learn about. It's about moving the teams to be able to act on that information, you know, quickly and efficiently and effectively.
Erin 00:34
This is Erin May.
JH 00:36
I'm John-Henry Forster. And this is Awkward Silences.
Erin 00:41
Hi everybody, and welcome to Awkward Silences. We are here today with Michele Ronsen. She is a consultant and educator of all things design and UX research. She teaches at General Assembly and UC Berkeley, and has a thriving consulting business as well. Today we're going to talk about something I'm really excited about, which is: look, we love user research around here, a lot. But that doesn't mean you should always do user research, and we're going to talk about when you should not do it. So this should be a fun one. Thanks for joining us, Michele.
Michele 01:19
Thank you for having me here.
Erin 01:22
We've got JH here too.
JH 01:23
Yeah. This topic seems a little blasphemous for us... but I'm nothing if not open-minded.
Erin 01:29
All right. So Michele, get us started. What is one reason one might not want to do user research?
Michele 01:40
That's a great question. First, maybe, to set the context: many people share my delight at the recognition of how critical user research is in the product and service development cycle, and more individuals and organizations are excited to learn about user research and how to do it. I think we are bordering on becoming sort of a feedback-obsessed culture, but that's a totally different topic.
Erin 02:13
Is that a good thing?
Michele 02:15
You know, I think it has its pros and cons.
Erin 02:18
Yeah. I personally have mixed emotions when I read "user obsessed." It's like, well, maybe you should chill out a little bit.
JH 02:27
Yeah. It's a good point. Right? Like, you can only hedge your uncertainty to a certain level, you can't eliminate it fully. And so knowing at what point you're hitting diminishing returns is pretty important.
Michele 02:36
I think that's a great question. Which is also a great segue into this list that I compiled: seven reasons not to conduct user research. And it came about because I field a lot of questions from my students and my clients about what's the best way to approach X, or what's the best method to use for Y, or how we should phrase XYZ because we're trying to learn about 1, 2, 3. And not all of these questions are user research problems. Many, many different questions can be answered by means that don't have to do with interviewing or interacting with customers directly. One of the first things I like to communicate in my classes and with new clients is my personal preference: let's look under our own hoods first. Let's find out what we can learn without talking to people. There's a lot of experience in the room; if you're asking the question, there are some people who know something about the topic. The two people that I like to seek out first in an organization are the data analysts and the customer support people, the people who are handling help tickets. Say you have an existing product, or you have something that exists out there, or a competitor does. There's so much to learn from what's already out there. So I think that if you can answer a question better with analytics, use analytics, or at least start there, and that will help you make a more informed decision about what to do later on.
Erin 04:17
How do you know if you can better answer a question with analytics? And if you can't?
Michele 04:23
I think the first question I would ask myself, and how I like to teach this, is: do you have something that exists? Do you have something that is out there in the wild, in some sort of format? If the answer is yes, then the next question is: is it live somewhere? And when you have some sort of analytics, whether it's just Google Analytics, and you can tell where people are coming from, or where they're clicking, or where they're going, or where they're abandoning, or keyword searches, or something a little bit more robust. If it's a product or service that's been out there for even a couple of months and you have any sort of tagging going on, you can learn a hell of a lot.
JH 05:09
On the customer support side, how do you go about making sense of, or picking up on, trends in what they're hearing? I feel like it's such a rich source because they're on the front lines, but often it's not really organized or prepared in a way that's easy to just dig through. So how do you actually start to find insights there?
Michele 05:29
I should probably temper this conversation by saying that I am not a formally trained researcher. I'm a recovering designer, so my approach is probably somewhat unorthodox. The way I approach it is to introduce myself to the person that oversees the support group and just say, very giddily, "Hey, I'm a researcher, and I'm working on this topic that I understand you help oversee. I'm curious about," let's just make something up, "I'm curious about the onboarding process. What do you think is going on here? What's working, what's not? What sort of suggestions have you made, if any? What sort of improvements have occurred over time? How often are you interacting with your product manager or designer, if ever?" And a couple of times, I've been the first person to make that introduction. These people are such a source of truth, and the same with the analysts. So I go to the analyst: "Hey, I'm new. I don't know anything. I have this beginner's mindset. I want to be your friend. I bet you know a lot about what's going on here. Let's look under the hood. Tell me, what do you see?" Those two people can tell you what's happening based on their records and based on their spikes and drops, but they can't necessarily see why it's happening. And that's where the user research comes in; it's just a terrific complement. But I love having those two. Those are my buddies. Those are the first buddies I want in the company.
Erin 07:15
Okay. All right. You got a nice plug for user research in there. We knew it was going to happen. But let's talk about why user research is the wrong thing to do. If analytics can answer your question, use analytics. What's another reason you maybe shouldn't do user research?
Michele 07:34
So let me complete that thought: analytics can answer questions about what's happening. So for example, where are people dropping off in the onboarding flow, or where's the friction in the onboarding flow? We can answer that question through analytics. Then we use those analytics to inform some sort of user research follow-up, to talk to people about why that particular section in the flow is difficult.
Another example would be if time doesn't permit it. So if we're looking to understand, for example, the college application process, and we're looking to understand what's working and what's not, but the team needs the answers in two days, we're not going to have time to do a proper diary study. We shouldn't force something in because of a time constraint. We shouldn't truncate the appropriate methodology because of a time constraint. If that's a real time constraint, well, let's figure out another way to approach it. And maybe another way to approach it is to do some secondary research and comb through some reviews, or some college application sites, or something like G2 Crowd, and find out what people are saying about the problems or about the tool. So we can learn in other ways, but I'm not a big fan of shoving a square peg into a round hole, if you will.
JH 08:59
Yeah. This is a super interesting one to me, because I agree with you: we shouldn't force it or compromise the research process so heavily that it's not actually valid or useful anymore. My concern about it - and I wonder if you have thoughts on this - is that it also feels like a pretty common excuse people might use for why they're going to skip research and just follow their instincts or go with their gut. So how do you make sure that this is coming up for the right reasons, you know what I mean? And it's not something that people are leaning on as a lazy crutch to get out of having to do research.
Michele 09:32
So if I'm understanding your question correctly, you're asking how we get out of the "we don't have time" sort of excuse.
JH 09:42
Yeah. Like, I'm a bad PM who likes to just do whatever he wants. And so someone's like, "Hey, are you going to do research on this new feature?" And I'm like, "Nope, no time." That feels like a bad cycle to get into.
Michele 09:54
So I don't really like to say no. What I like to do is come back with options and say: I know you think this is the best approach right now, but what we can do is start to put our heads together and do more of a rolling research series of studies, where we're answering maybe specific questions, or we focus on a specific topic and we get deeper and deeper and deeper as we go. So I can feed you information that will help you make more informed decisions, if we work together on a plan that can get you the information that will be helpful for you to progress. But what I don't think we should do, very much to your point, is try to shove something in. Let's right-size. Let's look at the actual time we have, and I'm sure we can find something to work within that timeframe. And better yet, let's be a little bit more strategic about it, step back, and plan something for the next six weeks, or plan something for the next six months.
Erin 10:57
Yeah, that's a good point. I think, 'cause the underlying issue can often be, well, we never have time, or we're just not making research a priority in general. But I think your point here, Michele, is: well, you've made that mistake, don't make it worse by pretending you're doing meaningful research when you haven't given yourself the time. Right? Make the best use of the time you do have, with the right method, which might be analytics, to your point, or something other than a longitudinal study or whatever.
Michele 11:29
Let's look at your roadmap and find out what kinds of questions we'll have and what kinds of people we think we're going to want to talk to at certain phases in the overall process, and then let me plan for that. And I'm happy to do that. It's probably going to be more meaningful to you anyway, because we're a little bit more agile and we have a little bit more improv. I'm a huge believer; to me, user research is part art, part science, and part improv. And if we can all kind of get a little jiggy with it, right? That's my goal. I want to get you the most meaningful input at the right time from the right people. The time thing, though, is interesting to me, and one of the pushbacks that I have is: it's funny, you don't think you have the time to do research right now, but you think you have the time to correct the mistakes that research might have prevented, you know? So you invest the time now, or you invest the time later. But my suggestion is, let's not stop anything. Let me ride sidecar along with you. I want to be your friend, so let's be buddies. I don't want to disrupt.
Erin 12:39
I think we'll call this episode "Michele Ronsen Wants to Be Your Friend."
JH 12:49
What I love about that is that it's a very pragmatic perspective, right? I think sometimes it's easy to be idealistic and be like, stop the whole project, we're going to make more time for research. And to your point, while that might be great or the right solution, it doesn't win you a lot of friends in some cases. And there are other ways to get ahead of future problems in a way that might be better received by the different stakeholders and so forth.
Michele 13:11
Absolutely. I mean, to me, the biggest predictor of success in a study or in a client relationship is how involved the stakeholders are. If the stakeholders are involved from the onset, they may not be believers to begin with, but if I can convert them, and if my stakeholders are active and engaged, I'm, like, 90% sure that I'm going to be able to move from insights into action and get that team to act and address the learnings along the way. And that's how I measure my success. It's not just in providing the information that they're looking to learn about. It's about moving the teams to be able to act on that information quickly and efficiently. So I'm out to make friends. I want to include them, and hey, they know more than I do about the product, and they have different expertise, and I want to learn from them. Everything is just stronger and better if we work in concert. Yeah, I sound like a Hallmark card.
JH 14:12
Totally. Well, another one on your list was that if you don't have stakeholder buy-in, that's not a great situation to do research in, and this kind of feels related to that. How have you seen that one play out?
Michele 14:23
Totally. I mean, there have been times where I've been brought on in stealth mode. You know, the VP of something says, "Hey, I have a hypothesis. I want you to explore it, but you're going to be working in a vacuum because we can't tell anyone about this." But other than those situations, which are few and far between, again, I find that the biggest predictor of success is stakeholder involvement. And I'm a big believer, from an education standpoint; I understand people learn in different ways, and the different roles on a product development team, from your engineer to your product manager to your designer to your content strategist, everybody brings something different to the table, and I want to learn from them. And if we can all come together and get actively engaged, and ideally tie the research goals to their goals, to their individual performance goals or their team goals, that will just increase the chance of success exponentially.
Erin 15:23
You talked about stakeholder involvement, right? And another phrase you used was stakeholder buy-in. So those are potentially different things, buy-in and involvement. Do you ever see a case where... is all stakeholder involvement good? Do you ever get naysayers involved who are maybe not helpful, or beginners who don't really know what's going on with the research and find a way to use it for harm? Or is all stakeholder involvement just good, and you should just be buddies with everybody and get everybody involved, and that's going to be a good thing? Do you have any tips around how to get stakeholders involved?
Michele 15:59
You know, that's really interesting. No, it's not all good. I like to first understand their level of experience with user research. If they don't have that, then I do some quick education right there on the spot, in the moment. So for example, my research plan is not going to be 12 pages. It's not going to include every single question in the guide. For me, the research plan will be successful if we are identifying the umbrella questions and understand very, very clearly how the research will be applied, or what will happen to those learnings and when, and we can go from there. So to me, that's the first sort of level: let's just make sure that we're on the same page. This is what we're trying to learn, this is why we're trying to learn it, and this is how we're going to apply those learnings. That's the first step. The next step is, okay, let's coordinate, dig a little deeper, and explore some components of the plan. And I also want to make sure that each person in the room is tied to, or will benefit from, that exploration in some way. But I author the document; I'll provide you with commenting rights. We have a deadline, so we'll get it to a good-enough point and then move on. These are living, breathing documents. At some point, though, we have to either agree to disagree or agree to just keep progressing, because we've got to keep moving. Let's try it out. If it's not working, that's what pilot sessions are for as well. If the questions aren't clear, if we're not getting the types of responses we hoped for, or the depth, or we have too many questions or too few questions, we do at least one pilot session, if not a series, to test that. Iterate and tinker along the way. I'm a big tinkerer.
JH 18:08
Hmm. Is a pilot session just the first session, or do you try to space it out from the other sessions so there's time to regroup and make changes?
Michele 18:17
Really good question. I think it depends on the culture, and it depends on what we're trying to learn and the maturity and the timeline of the overall project. I definitely find that the longer I work with a client team, the more symbiotic the process is, and we're able to move exponentially faster as that relationship grows, for a couple of reasons. One, they're more familiar with the process. Two, there's more trust that's been developed over time. And three, I've been able to demonstrate progress and results that have helped them make more informed decisions with confidence. So everything just gets shorter. We develop a shorthand relationship, like with your partner or your roommate: you can kind of shoot each other looks about who's doing the dishes. It's a similar relationship, but sort of in a different format.
JH 19:16
All right, quick awkward interruption here. It's fun to talk about user research, but what's really fun is doing user research and we wanna help you with that.
Erin 19:24
We want to help you so much that we have created a special place, userinterviews.com/awkward, for you to get your first three participants free.
JH 19:36
We all know we should be talking to users more. So we've gone ahead and removed as many barriers as possible. It's going to be easy. It's going to be quick. You're going to love it. So get over there.
Erin 19:44
And then when you're done with that, go on over to your favorite podcasting app and leave us a review. All right. When else should we not do research? One of the ones you talked about was obviously something we're going to totally agree with, which is when you don't know who you should be talking to, when you don't know who's going to give you the insights you seek. Tell us a little more about that.
Michele 20:10
So gathering feedback from the right people is really paramount. Here's an example, and this comes up a lot, mostly in discovery calls or new business calls. I had someone approach me and ask if I would be interested in collaborating on a project with my General Assembly students. I asked him to tell me a little bit about the project, and he was redesigning a website for a makeup provider, kind of a "Sephora, but not" type of retailer. In conversation, he thought it would be a good fit because it would give my students exposure to a real-world project, and it would be a good fit for him because he would be able to gather a ton of actual feedback. And I was curious about the request, because to me, those are not at all the people he should be gathering feedback from. First of all, half of my General Assembly students are male, right? And my hypothesis is that most people buying makeup online are female. Secondly, my students are probably more educated and perhaps more naturally tech-savvy because of where we're located in the Bay Area, and that might not fit their core buyer. He was very surprised, and he picked and poked at me and said, "But don't you think this would be a great experience?" And I said, "I think this would be a great experience in what not to do." It turned out to be a very fruitful conversation, and he thanked me very much and said, "I just never really thought about it like that." And I said, gathering the right information from the right people, that's foundational, right? We don't want to ask the wrong people questions, because we're not going to gather meaningful feedback.
Erin 22:22
I say that all the time, you know, human beings are all wonderful flowers and everyone deserves to be heard, but you gotta, you know, be smart about who you listen to for what problems you're trying to solve.
Michele 22:36
Getting feedback from someone who's male and 28, you know, a student who's 28 years old, about buying eye shadow...
Erin 22:46
To be fair, he could be into eye shadow.
Michele 22:49
To be fair! But the majority... I mean, as a...
Erin 22:51
Yes. Right. If you're doing bespoke targeting. Yeah, absolutely.
Michele 22:55
Absolutely not the right people to be asking. It would be a good experience, but a very different experience.
Erin 23:03
This one, if you don't know why you're doing the research, seems obvious. And how and when the learnings will be applied - important part of that. Let's pause right there. Tell us more about that.
Michele 23:12
Yeah. So these three questions are pretty paramount. If you don't know why you're doing the research, I don't think you're going to be able to build a great plan and ask the right questions. Knowing that outcome, or knowing that end goal, will really be informative. And this is also a key reason to get your stakeholders involved, because your stakeholders should all understand why you're doing it as well. Now, if we want to do a general exploration, that's fine. Maybe we're doing it to become a little bit more informed about a product or service, or a new profile or target that we might go after. So maybe we're doing it in a generative way, so we can become a little bit more informed. That's a totally fine answer to why. But if we're doing something a little more tactical, or we're doing something that requires any sort of task-based something, or evaluative something, or, you know, generative something, we want to know why we're doing it. If we can't answer the question why, I would suggest we stop at that point. And there may be more than one why, and that's fine, but if we can't agree on the why, then we're not going to be moving forward in lockstep.
Erin 24:32
This one sounds so obvious, but I'm guessing you've encountered it happening before.
Michele 24:38
Yes. And that's really a clear indicator to me of how mature the organization is in regards to user research. Sometimes this is about finding my buddies and collaborating, getting everyone to kind of sing the same tune. It's like herding a team of feral cats, right? But you can do it, and if you can do it, it is so much more powerful and so much more successful. And again, you're able to move from insights into action that much faster.
JH 25:13
Do you find that, when pressed to state the why, most teams usually can? Like, it's kind of floating around somewhere, they just haven't articulated it, and if they think about it, they can narrow it down and articulate it. Or are there actually some people who truly, even when pressed, just cannot get there, and they're like, "We have no idea"?
Michele 25:33
Most of the time, at least in my experience, it's "we're doing it to find out which one's better," or "we're doing it to find out which ones resonate or which ones are preferred." But then I'll dig deeper: but why do we want to know that? And why do we want to know that? Using the five whys, or laddering, usually gets us there. But the engineer might want to know why for a different reason: he or she might be thinking about how to repurpose some sort of code. And the designer might want to know why because it's going to influence some sort of pattern library that's being developed by her partner team at the same time. And the product manager might want to know for another reason still. So they each might be coming at this from slightly different angles, which is totally fine. I want to understand all those angles, because again, my goal is to make sure that whatever we're learning about is going to be meaningful and impactful to that whole team, so we can move that much more quickly into action.
JH 26:41
Gotcha. Yeah. So don't let it go unsaid; actually get it all out on the table, make sure everyone's why is understood, and then figure it out from there, sort of.
Michele 26:52
Yeah. And by hearing the disparate views of why, and then understanding how it can be helpful, it actually brings us closer together as a team. And then the when is really important too. If you have two months to explore this question, that will open up many, many different doors for how you might explore it, versus if you have two days or two weeks. So "when would it be too late for you to have this information?" is a great question to ask, and "why would it be too late at that date?"
Erin 27:27
Okay. I wonder, when you start uncovering the whys, assuming there's some motivation there, does that ever relate to another one of your reasons not to do user research, which is if you're just trying to sell a design that you've already come up with? Does that come up, where it's, I don't want to say malicious, but let's say not pure intentions of just uncovering the truth? Does that ever come out when you dig into these whys?
Michele 28:01
Not as explicitly with the people that I work with, but I think what you're talking about is research as a weapon, and it can definitely be a weapon. With the toolkit and the access, we can pretty much blow anybody up if we want; the ethics there, you know, prevent us from doing that. But conducting user research, or wanting to conduct user research, masked as a way to prove a point or to validate one direction or another, is just wrong. It's just wrong. As an industry, we work in service to that user, not to ourselves. And you know what? It's going to come back and bite you. So that could be a legitimate reason not to do the research.
JH 28:57
This feels like a really hard one to detect, unlike some of the other ones, right? If we don't have enough time, it's kind of obvious: hey, why are you doing this research, and the answer is pretty obvious. But if someone has bad intentions and they're actually just trying to advance their own idea and sell their design, how do you actually discover that? How do you know? How do you put the brakes on it in that situation?
Michele 29:18
Well, as a researcher, you're kind of the maestro of the organization. So I think what I would say is: are we looking to understand which concept resonates the most? Then when I'm building the script, or the guide, and going through the planning process and conducting the sessions, I'm going to do my absolute best to make sure that any sort of bias is removed.
JH 29:49
Okay, that makes sense. Cool.
Michele 29:51
So that's another great activity to do while we're all kumbaya: let's get our biases and assumptions and hypotheses out on the table beforehand. We know it's not right or wrong to have biases or assumptions or hypotheses; we all have them, so let's just be honest about it. And sometimes it's kind of fun. You know, "I really want concept A because, you know, my mom likes purple" or whatever. There's nothing wrong with that. Let's get it out; again, it's going to bring us closer. "Oh, your mom likes purple? My mom likes purple too."
Erin 30:25
It sounds like you're working with a lot of very evolved people, 'cause, I don't know, to me, I would just think: let's say I want my design to win this research test, because I spent a lot of time on it or it's been my pet idea for two years. Am I going to admit that, to myself or to anyone else? I don't know. It sounds like you're having good luck with kind of extracting that from folks.
Michele 30:58
Well, I think it's about building that trust, right, and laying that framework. And I try to go in with that beginner's mindset. I don't remember if I said this, but I come from a design background, so I know how painful it is to kill your babies. I totally remember that, and I still have to do it now in research; sometimes I can't use the method I really want to use. But I think it's about building that trust and really trying to approach it as a partnership. The more we can build that rapport and that trust with each other, the more we're going to open up.
Erin 31:45
All right. Last one, right? The question is too big or too small, and this gets into the time thing again. Do you have the right amount of time to talk to us about the importance of having the right-sized question?
Michele 32:00
Yeah, this is a great one, and this one comes up a lot in first conversations or discovery calls. So for example, I had a series of conversations with a commercial real estate company last month, and their original question was, "We'd like to learn how to maximize our assets." And that's a great question, right? But that's not necessarily a user research question. There are many ways a commercial real estate company can maximize their assets, and I went through this. I don't know too much about commercial real estate, but I said: I hypothesize you could raise your rents. You could cut your utility costs. You could purchase more space. You could convert space; you know, you could sell ice cream. You could increase your services. There are so many ways to increase your assets. We want to focus; let's right-size that question. Okay, so let's find out: can we shorten the time it takes to apply to rent a property, and therefore shorten the vacancy window? That would be one way. We don't want to boil the ocean, so let's right-size it. Should we look at all your utilities, or should we look at the process to pay your utilities? Should we look at your rental rates and do a comparison? Which, by the way, wouldn't be user research either; that would be more market research. And that's another point we can add here: when is it market research? But that question, "How do we maximize our assets?" is just way too big. It's way too big to explore. I mean, maybe we want to do a generative study to find out the three most viable ways and then dive in, but it's just too broad. And in the same respect, we don't want to ask something that is way too narrow, either. So, for example, we don't want to spend a couple of weeks studying whether somebody can sign on to a financial app.
Erin 34:17
I was going to say button color. That's the go-to: what color should the button be? I want to see the case study of when the button color, like, just changed the business. It comes up so much. Oh, you have one? All right. Tell me about it.
JH 34:32
Okay, yeah. When I was working at Vistaprint, we had all these custom landing pages that were made, and they had product titles on them, and there was a "get started" button or something. And it was very bland and kind of got lost in the page. And somebody ran an A/B test to make it a bright orange, and it increased conversion enough to be worth over, like, a hundred K a month in profit. It was crazy.
Erin 34:52
That's a good one.
Michele 34:53
That is crazy.
JH 34:55
Yeah. She was very junior, and she was like, "Hey, what if we just made this button a lot more obvious so more people click it?" And so they ran the A/B test, and it did very, very well. And we were like, oh, great idea.
Erin 35:05
And now I'm looking at our button color. All right. So, okay, not too big, not too small: Goldilocks questions. Obviously we can turn a lot of these nos into yeses by, you know, inverting them and right-sizing the question. Now we have permission to go and do some user research, right? What do you see most commonly? Where do people get tripped up the most?
Michele 35:34
I would say, in terms of frequency: not leveraging your analytics, not marrying your analytics, and not looking under your own hood first. Last year I was on a year-long consulting engagement at a well-known firm, and on the product team I was working with, the designer had never interacted with the analyst or the person in customer support; didn't even know who it was, actually. And never really thought to make that connection, nor did the researcher they had worked with prior. So it was just never a relationship that they had.
Erin 36:22
A different flavor of a similar thing: what about not looking at user research that's already been done? So not analytics necessarily, but, you know, insights you might already have.
Michele 36:33
Okay, what do we know about this? Have we explored it internally? Have we done some secondary research on it? How did we get here? How did the question get to this point? How did the question get to me? What happened to lead us here? And this ties back to earlier conversations in terms of what we know, what we hypothesize, what we assume. That will hopefully garner the conversation about what other work has been done in this space.
JH 37:07
What I'm starting to think as I look at this list holistically, now having talked about it, is that a lot of the things on here serve as an early warning system, like the canary in the coal mine. If you catch these things early enough, there's a chance to right-size or adjust and actually go on to have successful research, right? And if you can't fix it, pull the plug, to your point and the whole premise of this discussion. And then the thing that comes to mind is the whole Checklist Manifesto idea. It almost feels like this could be a cool thing for teams to go down the list and be like: can we answer this with analytics, yes or no? Do we have enough time, yes or no? And actually kind of use it that way. Have you thought about using it that way at all?
Michele 37:44
Yeah, I have a number of checklists and resources that I've developed. They're all on my website for download, ronsenconsulting.com, in the resources section. And this is another great checklist; I have a couple of checklists up there. One is about how to evaluate bias, and one is, I think, 15 questions you should ask yourself before you launch. And there's a great question-starters kit in there. It breaks down the typical phases in an interview, and then I developed a question bank, if you will, of maybe 15 pre-study or warm-up questions. If you choose three from the pre, and then the digging deeper, and then the wrap-up, you have a pretty good framework for a guide. One thing that's important to discuss, if I can just pivot slightly, is that there's no one single way to gather information, and there's no one right way for a team to go about it. There are better ways, but if there's one thing that I've taken away from consulting for seven years in the business, it's that every culture is different. Every question is different. There are similarities, but there are a lot of differences. And if we get out of the mindset of right or wrong, or the best way to do something, there's more room for improvement, and the better you get. So it's better to just start somewhere than not start at all.
JH 39:31
Yeah, I love that nuance. And to kind of pivot off of that, even: I think what's fun about the whole podcast format is that it actually allows for nuance and some of that fine-tuned stuff that we're just getting into, in a way that a lot of our other online formats don't seem to allow for very well.
Michele 39:48
What do you mean, "fine-tuned," in the format?
JH 39:49
Sorry, just like what you were saying: there's no one way to gather information, right? And so there's a lot of nuance and context and all of this stuff that you have to kind of factor in to know how to go do that in a given situation. And this type of discussion allows us to explore that and kind of appreciate that context and nuance and richness. And I feel like when you see people on Twitter or other common places, it's usually much more absolute stances: "the only way to answer a question is blah," you know what I mean? That's what I really enjoy about these conversations: we actually get into some of that "well, it depends," and there are a lot of factors, and there's not an absolute way to do everything.
Erin 40:29
Right. And reading a long "it depends" article gets unenjoyable really fast, right? It's like, well, it depends, and then sub-point B, and now we're over here, and I'm like, geez. But in a conversation, I think it hopefully feels engaging and natural and organic and dynamic, three-dimensional, things like that.
Michele 40:58
I think it is also authentic, right? One of the fascinating things about our industry and user research is that it's constantly moving. It's constantly changing. It's evolving. It's dynamic, it's living, it's breathing, it's amorphous. But I also think that's where a lot of the improv comes in, right? So the podcast allows for this improv to totally take place, which is so fun. I'm loving it.
JH 41:31
We can't close with me just being like, "How sweet is this podcast, guys?" So maybe we should cut that part.
Erin 41:41
I think in improv we're supposed to say "yes, and." Yeah, no bad ideas.
JH 41:47
I have a friend, actually, who now does some kind of product and design consulting type stuff, and that gets into research as well, but he does a lot of improv. And he actually just did an improv training course with the user research team at a larger company, kind of a think-on-your-feet type of thing. I don't actually know how it went, but it sounded really fun, and I was kind of jealous that he got to do that.
Erin 42:12
Thanks for listening to Awkward Silences, brought to you by User Interviews.
JH 42:17
Theme music by Fragile Gang.