#183 - The Best Ways to Use AI in UX Research with Laura Klein of NN/g
Laura [0:00:00]: A lot of folks right now are selling this idea that here are the five simple prompts that are gonna change your life, and you can just use this prompt, and you will get all of your insights right from your transcripts, and none of that is true.
Laura [0:00:13]: I'm just gonna be very blunt about this.
Laura [0:00:16]: It is terrible. It misunderstands the concept of insights. That is not how insights work. That is not, like...
Laura [0:00:22]: So most of the places that people are using it in analysis and synthesis are...
Laura [0:00:27]: Those are places where I would say use with extreme caution: use that second set of eyes, use it as the thought partner, but still do all the thinking work yourself.
Erin [0:00:40]: Hey.
Erin [0:00:40]: This is Erin May, and this is Carol Guest.
Erin [0:00:42]: And this is Awkward Silences.
Erin [0:00:45]: Awkward Silences is brought to you by User Interviews,
Erin [0:00:49]: the fastest way to recruit targeted, high-quality participants for any kind of research.
Erin [0:00:54]: Hello, everybody and welcome back to Awkward Silences.
Erin [0:01:03]: Today we're here with Laura.
Erin [0:01:05]: Laura Klein, whose books you've probably, maybe certainly, read.
Erin [0:01:08]: Among other things, she hosts a podcast, and she's now the principal experience specialist at Nielsen Norman Group. Laura.
Erin [0:01:15]: Thanks for joining us today.
Laura [0:01:16]: Thanks so much for having me.
Erin [0:01:17]: Got Ben here too.
Ben [0:01:19]: Hey everybody.
Erin [0:01:20]: And today we're gonna talk about, what else, but AI. Specifically, Laura is teaching a class about AI at Nielsen Norman Group.
Erin [0:01:26]: So she's actually really thought about this, about some practical applications.
Erin [0:01:29]: And we're gonna talk about the best ways to use AI in research.
Erin [0:01:34]: Right now.
Erin [0:01:35]: Today is June seventeenth twenty twenty five.
Erin [0:01:38]: So these are the best ways right now, obviously, evolving conversation.
Erin [0:01:42]: But, Laura, let's just start with: why were you interested in teaching a class on AI at NN/g?
Laura [0:01:49]: So I've been doing research since the nineties.
Laura [0:01:53]: And I've been talking about research for probably fifteen, twenty years.
Laura [0:01:57]: And I love research.
Laura [0:01:59]: I love the process of it.
Laura [0:02:01]: I think it's fantastic.
Laura [0:02:02]: I think it's incredibly helpful.
Laura [0:02:03]: But I've been talking about
Laura [0:02:05]: the same kinds of things for a very long time because the research itself doesn't change wildly.
Laura [0:02:11]: Right?
Laura [0:02:12]: It's...
Laura [0:02:12]: We don't get massively new ways of conducting research, and this was an excuse for me to get to play with some of the new toys and see what's out there, so it's entirely selfish on my part. But also, you know, I think research is one of those things where researchers are busy these days.
Laura [0:02:32]: Like, they are super busy.
Laura [0:02:33]: Everybody is doing six different jobs with half the people they used to have, and it's rough, and I get it, and they don't have time to run out and learn everything there is about also being a prompt engineer.
Laura [0:02:48]: And not everybody is lucky enough.
Laura [0:02:50]: I did a bunch of research with folks who are sort of doing this at scale.
Laura [0:02:54]: And not everybody is lucky enough to have somebody, say a research ops department, that is focused on making things better with AI and, you know, here's how to use it, and have them teach it.
Laura [0:03:05]: So I think it's helpful to do that.
Laura [0:03:08]: Also, one last thing: there's a ton of hype out there.
Laura [0:03:12]: A lot of products are promising folks that you never have to talk to another human again, and I'm like, well, that doesn't seem right. And it turns out it's not right.
Laura [0:03:20]: It turns out that's very, very much wrong as we all sort of expected it would be.
Laura [0:03:24]: I wanted to dig into it.
Laura [0:03:26]: I wanted to learn, hype aside, what's real, and I wanted to be able to share that honestly with other people, so they could know what they could use, what they probably shouldn't use, what they might wanna use if it gets better, and look at it from the lens of safety and efficiency and ethics and all of that.
Erin [0:03:45]: Yeah.
Erin [0:03:45]: Speaking of safety, sort of like a green light, yellow light, red light.
Erin [0:03:47]: Right?
Laura [0:03:48]: Exactly.
Laura [0:03:48]: Yeah.
Erin [0:03:50]: One thing I've learned in my years hosting this podcast is that pretty much the correct answer to every research question is "it depends."
Laura [0:03:56]: I should have worn my "it depends" shirt.
Laura [0:03:58]: I literally have one.
Laura [0:03:59]: It says "it depends" on it.
Erin [0:04:01]: There you go.
Erin [0:04:02]: And so I think this will be the perfect episode for that where a lot of the answers are gonna be It depends, and we can kind of get into what does it depend on.
Erin [0:04:09]: So there's...
Erin [0:04:10]: Oh, maybe some frameworks folks can take when they think about what's working right now.
Erin [0:04:13]: Again, I wanna emphasize right now,
Erin [0:04:15]: Because obviously, this is an evolving thing.
Laura [0:04:18]: It'll change tomorrow.
Erin [0:04:19]: It can all change tomorrow.
Ben [0:04:21]: Last year, Laura, we surveyed some user researchers and some research ops folks about how they're thinking about, or currently using, AI in their practice.
Ben [0:04:28]: And one of the questions that I wanted to suss out was how AI is being used across the life cycle of research.
Ben [0:04:36]: And we found that Folks are using it primarily on the tails.
Ben [0:04:39]: They're using it in a lot of study design, like, help me think through what I need to do or how I tackle this, and then they're setting about doing the recruiting and the fielding.
Ben [0:04:50]: Then they're bringing AI back in a bit in the analysis and some readout.
Ben [0:04:53]: When you were designing this class, did you want to explore how AI could work across all of the phases, or did you go in thinking, like, okay,
Ben [0:05:01]: I'm hearing that it's being used in this phase or this step?
Ben [0:05:04]: How did you approach designing this class?
Laura [0:05:07]: I went in totally clean. I tried not to have any...
Laura [0:05:12]: Well, my bias originally was a little skeptical. Not a little skeptical, extremely skeptical.
Laura [0:05:16]: I mean, I've...
Laura [0:05:17]: We've all read the stories about the places where it fails.
Laura [0:05:20]: So I went into it very much tell me how you are using it.
Laura [0:05:24]: And I found similar things to you.
Laura [0:05:28]: A lot of people... things that we heard over and over: definitely using it as a thought partner.
Laura [0:05:33]: A lot of...
Laura [0:05:34]: Oh, I'm a user experience team of one, and I need somebody to talk to about this stuff.
Laura [0:05:39]: I want somebody to check over my stuff and see if it's right.
Laura [0:05:41]: I would need somebody...
Laura [0:05:42]: Like, I'd love somebody to read through my discussion guide and tell me what I'm missing, or tell me, did I ask a leading question?
Laura [0:05:47]: Even, like, I just want somebody else to look at this stuff.
Laura [0:05:50]: And, well, ideally we would all have our coworkers do that and do co-working sessions, and it's really important that we're still doing those kinds of things.
Laura [0:05:58]: It's also nice to have that sort of second set of virtual eyes on it.
Laura [0:06:03]: So that we have that sort of extra check.
Laura [0:06:06]: Right?
Laura [0:06:07]: And it's that eager intern who isn't mad about having to read through the transcripts for the seventh time.
Laura [0:06:12]: But it is true: we did find, when we were doing the research, that it does much better at very specific tasks within the life cycle.
Laura [0:06:23]: And while a lot of things are sort of promising, like, you just tell us what you wanna learn and we'll do the whole thing for you and spit it out at the end, and here's the sizzle reel and blah, blah.
Laura [0:06:36]: We have not found that to be the case in our experience.
Laura [0:06:38]: What we have found is that you use it for, like, the one or two things that you need it for, and you have a particular tool that will work for you to do it.
Laura [0:06:50]: And not everybody uses it in the same places.
Laura [0:06:52]: And there are definitely some places where we recommend it: yes, great, use it here.
Laura [0:06:56]: And there are some places where we're like, yeah.
Laura [0:06:58]: I would use this with the utmost of caution.
Erin [0:07:01]: Maybe if you could talk about some of those places.
Erin [0:07:02]: So where have you found it tends to work well, and maybe not so well?
Laura [0:07:07]: All the transcripts and note-taking? Yes, a hundred percent.
Laura [0:07:14]: As somebody who spent a lot of years of my life, boy, in my twenties, doing transcripts and making sure that I had all of those notes taken and sitting in rooms and just taking notes extensively.
Laura [0:07:24]: Boy, do I love having a machine do that for me.
Laura [0:07:26]: So things like that.
Laura [0:07:28]: Yes.
Laura [0:07:28]: Absolutely.
Laura [0:07:28]: A lot of folks who are doing stuff a little bit more, sort of... they're setting up things like custom GPTs to help them do things like write discussion guides.
Laura [0:07:38]: If that is a thing that they do a lot.
Laura [0:07:40]: They're setting up custom Gems in Gemini to create research plans, again, if that is a thing.
Laura [0:07:47]: If you do that, if you are part of this democratization of research, or as I like to call it, doing more with fewer people,
Laura [0:07:53]: then you might want to have something that you can set up ahead of time that will help people with less experience than you do things safely.
Laura [0:08:02]: All of those things still have to be checked.
Laura [0:08:05]: A lot of folks right now are selling this idea that here are the five simple prompts that are gonna change your life, and you can just use this prompt and you will get all of your insights right from your transcripts, and none of that is true.
Laura [0:08:19]: I'm just gonna be very blunt about this.
Laura [0:08:21]: It is terrible. It misunderstands the concept of insights. That is not how insights work. That is not, like...
Laura [0:08:27]: So most of the places that people are using it in analysis and synthesis are...
Laura [0:08:33]: Those are places where I would say use with extreme caution.
Laura [0:08:35]: Use it as that second set of eyes, use it as the thought partner, but still do all the thinking work yourself.
Laura [0:08:42]: That's...
Laura [0:08:43]: It is not great at thinking.
Laura [0:08:44]: But one of the things that we have found it is good at is meta-analysis.
Laura [0:08:48]: So folks who have and we've got a bunch of examples of this in the class.
Laura [0:08:53]: Maybe you have a whole bunch of old research that is just sitting there.
Laura [0:08:58]: Nobody does anything with it.
Laura [0:09:00]: Maybe you have other systems that you just never look at, because who has the time to go through all the customer service tickets other than customer service?
Laura [0:09:07]: Right?
Laura [0:09:08]: And suddenly you have, again, a very eager, very fast intern who can go through all of this.
Laura [0:09:14]: Is it gonna be perfect?
Laura [0:09:15]: No.
Laura [0:09:15]: Is it gonna be better than ignoring years of useful data?
Laura [0:09:19]: Yeah.
Laura [0:09:20]: It might be.
Ben [0:09:22]: So I think about...
Ben [0:09:22]: That's really helpful context that tracks with some of the things that Erin and I have heard, and certainly that we've been hearing from our internal researcher, Morgan, and some of the folks on the team who are using AI.
Ben [0:09:31]: So then, kind of putting my pedagogy hat on, what are the learning outcomes that you have?
Ben [0:09:36]: Are you...
Ben [0:09:37]: What does it sound like a researcher of today should be?
Ben [0:09:41]: You mentioned prompt engineering.
Ben [0:09:42]: It sounds like it's more "corral your interns" so that it can scale and move quickly without breaking too much.
Ben [0:09:49]: So would you walk us through some of the learning outcomes that you had?
Ben [0:09:52]: What are some of the things that you wanted students to sharpen and get better at with this technology that's just out there waiting to be used?
Laura [0:09:59]: I want people to be able to look at their process and make good decisions about where this technology might help them, might not help them, or would definitely help them.
Laura [0:10:14]: And also, I wanted to present a bit of a counterargument, again, to the...
Laura [0:10:21]: Here's a big old list of prompts, which people just get so much of, and they hear so much.
Laura [0:10:26]: It's gonna change everything.
Laura [0:10:27]: You can talk to fake users.
Laura [0:10:29]: You can do this.
Laura [0:10:30]: You can run a focus group.
Laura [0:10:31]: You can do all of that.
Laura [0:10:32]: And, like, I just wanted to kinda give them a counterfactual of: actually, here's where it's safe to use it.
Laura [0:10:38]: Here's where, again, use with caution.
Laura [0:10:40]: Here's where maybe just leave it alone for a little while and see if it gets better, so that they can make the best decisions about it.
Laura [0:10:46]: I also have a section on how to choose tools as you said.
Laura [0:10:50]: This stuff changes every day, and every one of them promises everything.
Laura [0:10:53]: So we do not say, yes, use this, or, no, don't use this.
Laura [0:10:58]: We say, here's how to figure out if a tool might be helpful to you.
Laura [0:11:02]: And I think it's those kinds of things: just giving them the tools to figure these things out as they go.
Laura [0:11:09]: It's definitely not, like, a prompt engineering class.
Laura [0:11:12]: I am not the one to teach that, but here's sort of the things that you need to know about prompts for research.
Laura [0:11:19]: Right?
Laura [0:11:20]: Give it way more context than you think, all those kinds of things.
Erin [0:11:25]: And how does someone think about when they should use AI? In the sense of...
Erin [0:11:29]: Right?
Erin [0:11:29]: You talked about a couple of use cases where, like, AI is no good for this, or transcription, yes.
Erin [0:11:34]: Yes.
Erin [0:11:34]: But I imagine two things are kind of happening in parallel.
Erin [0:11:37]: One, right?
Erin [0:11:38]: Researchers themselves are getting better at using AI, at different speeds.
Erin [0:11:43]: It's gonna vary by researcher right?
Erin [0:11:45]: And then the AI itself is getting better too.
Erin [0:11:47]: And so how do researchers think about, like, is this a good match for me right now, or is this a not-now thing, or a maybe-later thing?
Erin [0:11:57]: Do you have a framework for that?
Laura [0:11:59]: Well, it's not so much a framework as sort of understanding what it's good at and what it isn't good at.
Erin [0:12:04]: Uh-huh.
Erin [0:12:04]: Yeah.
Laura [0:12:04]: So for example, on the moderation side of things.
Laura [0:12:07]: Right?
Laura [0:12:08]: One of the things that people don't really think about is all of the things that you do as a human moderator of studies that you don't even think about.
Laura [0:12:16]: Right?
Laura [0:12:17]: Beyond just asking questions what else are you doing?
Laura [0:12:20]: You're asking questions.
Laura [0:12:21]: You're asking follow-up questions.
Laura [0:12:22]: You're also noticing things.
Laura [0:12:23]: Right?
Laura [0:12:23]: You're...
Laura [0:12:24]: You can kinda tell by the physical appearance of the person: are they feeling uncomfortable?
Laura [0:12:29]: Did they answer that sort of hesitantly, or were they very excited about it?
Laura [0:12:33]: What's going on here?
Laura [0:12:34]: All these sorts of things that happen.
Laura [0:12:37]: AI doesn't do any of that.
Laura [0:12:38]: Right?
Laura [0:12:39]: AI currently... again, there are certainly AI tools that claim that they can read body language and tone of voice and all of that.
Laura [0:12:47]: None of them seem to be looped into any of the tools that people are using currently.
Laura [0:12:52]: So thinking in terms of what can it do and what can't it do?
Laura [0:12:58]: And what does it do, like, not as well as a human, but maybe good enough?
Laura [0:13:02]: Right?
Laura [0:13:03]: If you sort of think about that and then say, okay, well, where can I apply this?
Laura [0:13:06]: Because the answer might be, this is a contextual inquiry.
Laura [0:13:10]: I need a human.
Laura [0:13:12]: Right?
Laura [0:13:12]: This is actually a usability study.
Laura [0:13:14]: I need somebody who can look at the screen and understand that the person who is telling me that they just succeeded at a task very much did not succeed at that task.
Laura [0:13:22]: That's just a little thing that has happened to absolutely every single one of us at some point.
Laura [0:13:27]: So you really need that.
Laura [0:13:29]: But this is just a basic customer interview where we're just gonna kinda ask the questions and some follow-up questions and just kinda...
Laura [0:13:36]: It's very much more of a "we don't need to see your face, we could even do it on the phone" kind of thing.
Laura [0:13:41]: Okay.
Laura [0:13:42]: Well, one of those sounds like it's gonna be a lot more of a good situation to give to AI.
Laura [0:13:51]: So you have to know what is it good at?
Laura [0:13:54]: What are humans good at, and what are you willing to give up?
Erin [0:13:59]: Right. Control.
Laura [0:14:01]: Right.
Laura [0:14:01]: Exactly.
Laura [0:14:01]: You also have to know very much.
Laura [0:14:03]: Wow.
Laura [0:14:03]: Do I have the kind of users who are going to block talking to a machine?
Laura [0:14:08]: Or do I have the kind of users who are gonna be more comfortable talking to a machine because they don't necessarily feel comfortable talking to a human about these topics.
Laura [0:14:14]: All of these questions, again, these are...
Laura [0:14:17]: They're very contextual questions for you and your research project.
Laura [0:14:22]: Right?
Laura [0:14:23]: Where does it fit?
Laura [0:14:24]: Also just we talk a lot about making these custom bots that can help you with research plans.
Laura [0:14:30]: But if you're like me, and you maybe...
Laura [0:14:33]: I don't do a lot of research anymore.
Laura [0:14:34]: But when I used to.
Laura [0:14:35]: Right?
Laura [0:14:36]: I would make a research plan and then I would do all this work, you know.
Laura [0:14:38]: Like, how often do you make one?
Laura [0:14:41]: Right?
Laura [0:14:41]: Do I need a good template?
Laura [0:14:42]: Or do I need a machine that's gonna give me the research plan?
Laura [0:14:46]: Well, but again, I talked to a couple of people who are doing this across large organizations.
Laura [0:14:52]: Very helpful for them.
Ben [0:14:55]: Laura, it sounds like: inventory, if you're a card-carrying researcher. Inventory a step, like kicking off a research project, asking yourself, okay,
Ben [0:15:03]: what are some of the steps that I would typically do?
Ben [0:15:05]: What might I give ten percent to, something like desk research, where you can
Ben [0:15:08]: do that a little more fully, or maybe have it do it all?
Ben [0:15:11]: A lot of our listeners, and I'm sure a lot of your future students, will be people who are not researchers by trade, but who see AI as a way for them to kinda dabble in research activities.
Ben [0:15:21]: What is your advice to them, or how might you teach a group of, say, designers?
Ben [0:15:27]: Because your background is in design and you led design teams?
Ben [0:15:30]: If you were unleashing an imaginary AI tool onto a group of designers, how would your advice shift for folks who are newer to research as a set of practices?
Ben [0:15:40]: And then they're given this thing, maybe baked into their Figma, or it's a plugin for something else they're using?
Ben [0:15:45]: How does your advice change?
Ben [0:15:46]: How did your education change?
Laura [0:15:48]: It changes in a sort of a surprising way.
Laura [0:15:50]: I would say, make sure that you get an actual researcher to at least look it over first.
Laura [0:15:56]: Maybe that's not surprising.
Laura [0:15:57]: But wow, these things can really give you some bad advice.
Laura [0:16:01]: And one of the things that I just want people to remember: it's fine to use these things.
Laura [0:16:07]: It can absolutely speed you up.
Laura [0:16:09]: It can absolutely make you more powerful.
Laura [0:16:12]: But it can also lead you entirely in the wrong direction, and unless you are enough of an expert in the thing that you are asking it to do to at least check it over, you don't know if it did a good job or not.
Laura [0:16:27]: There's no way for you to know.
Laura [0:16:28]: If you have never run a usability study, and you don't know what a good task looks like, and you ask it for tasks.
Laura [0:16:36]: Right?
Laura [0:16:37]: And it gives you very leading tasks.
Laura [0:16:40]: Will you know that?
Laura [0:16:42]: No.
Laura [0:16:43]: But would somebody who has done even a half dozen usability studies be able to look at it and kinda go Oh, wow.
Laura [0:16:49]: No.
Laura [0:16:50]: That's...
Laura [0:16:50]: You just told them to click on that button.
Laura [0:16:52]: They're absolutely gonna be able to click on that button, and that will tell you nothing.
Laura [0:16:55]: You need to ask...
Laura [0:16:57]: You need to have somebody who knows what they're doing to just at least check the output.
Laura [0:17:02]: That's why I recommend it for very senior people more than I recommend it for very junior people.
Laura [0:17:08]: This is extra true of desk research.
Laura [0:17:11]: For desk research, we saw a lot of people using it, and what I will say is trust but verify.
Laura [0:17:17]: Actually, don't trust.
Laura [0:17:19]: Use it as a starting point.
Laura [0:17:20]: Right?
Laura [0:17:21]: It is absolutely fine to use it as a start.
Laura [0:17:23]: It's absolutely fine to say, tell me what kinds of questions I should ask here, or what do you think this kind of job does and then go out and actually learn the truth, but you cannot take anything it says as truth because it doesn't have a concept of truth.
Erin [0:17:42]: Maybe a little aside.
Erin [0:17:43]: I'm curious what you think about this, Laura, which is: for our younger folks trying to break into research, or newer in their research careers, they're finding themselves in this moment where maybe their mentors, or would-be mentors, are picking up AI and using it more to scale themselves, things like that.
Erin [0:18:01]: How do they get that mentorship to level up in the craft of research itself, so that they then can use AI and know what it looks like?
Erin [0:18:09]: How is this younger generation gonna learn what good looks like?
Laura [0:18:13]: Yeah.
Laura [0:18:13]: A great question.
Laura [0:18:14]: Okay.
Erin [0:18:14]: Great.
Erin [0:18:14]: Glad I don't have to answer it.
Laura [0:18:16]: This is...
Laura [0:18:16]: We're seeing the same thing actually with engineering.
Laura [0:18:18]: I I used to be an engineering and I still follow pretty closely all the engineering stuff And we're running into this where it's sort, like, it's going to make senior engineers incredibly powerful.
Laura [0:18:26]: I think in in in some cases, It's going to make junior engineers worse than you can possibly imagine in many cases.
Laura [0:18:33]: And it's going to make no more senior engineers, which is a problem when we're gonna see the same thing with user research, the way that you get better at moderating a study is moderating studies.
Laura [0:18:43]: There are a few things that you can use it for as a junior that can help you.
Laura [0:18:47]: You can have it, for example, again, I wrote this discussion guide, check the discussion guide for leading questions.
Laura [0:18:54]: And will it find all of them?
Laura [0:18:56]: Maybe not. But will it find some of them?
Laura [0:18:58]: Yeah.
Laura [0:18:58]: Probably. Will it make it a little better?
Laura [0:18:59]: Yeah.
Laura [0:18:59]: You can use it to run pilot studies, not paying attention to any of the results, but just to get you to feel comfortable, you know, talking to a person and having them give you feedback. And you can ask it to grade you on how you did and give you feedback on that.
Laura [0:19:15]: Again, will it be perfect?
Laura [0:19:16]: No.
Laura [0:19:17]: Might it make you more comfortable with doing the process of interviewing?
Laura [0:19:21]: People have reported that it has...
Laura [0:19:23]: I have not tested this out.
Laura [0:19:24]: I will be perfectly honest.
Laura [0:19:25]: I have interviewed people before.
Laura [0:19:28]: So I feel like I'm okay at that.
Laura [0:19:31]: But people have definitely said, no.
Laura [0:19:33]: That made me just feel much more comfortable doing it.
Laura [0:19:34]: So there are things you can do.
Laura [0:19:36]: But, yeah, we're gonna see folks not pick up some of these skills like good analysis and synthesis if they're just plugging the transcripts in and saying, give me all the insights.
Laura [0:19:50]: Like, Yeah.
Ben [0:19:52]: Yeah.
Ben [0:19:52]: Last week, Laura, I was chatting with a senior...
Ben [0:19:54]: I guess you might say this person was a principal user researcher, sort of hands-on, who reports to a senior product person, leads and mentors junior researchers, but is really sort of focused on their own workflows.
Ben [0:20:06]: And was given this "challenge," I'm using air quotes for our listeners, to bring AI into the organization.
Ben [0:20:12]: And that is really what I'm interested to hear from you: the things that are outside of the control of the average research team or product team and design team, and that is, namely, executives getting really hyped, like, excited and anxious that they're not using AI, whatever that means.
Ben [0:20:27]: And so I'm wondering... well, this researcher said that they believed the future role of a researcher is not going to be to do research. It's gonna be to, like, wrangle how AI is used across the organization and to do what you said there.
Ben [0:20:39]: Make sure that the designers understand how to do correct interpretation of results, or that the data science team maybe doesn't need as much coaching.
Ben [0:20:47]: But that these various stakeholder teams are thinking about how AI can be used, sort of being a validation check.
Ben [0:20:54]: How do you see AI, and I'm asking you to speculate,
Ben [0:20:57]: how do you see AI changing the role of the researcher moving forward?
Ben [0:21:01]: Is it going to be in this operational sort of organizer role or something else?
Laura [0:21:06]: I mean, I think we've been promised this a lot of times with all sorts of new tools, and I think it will help some of the folks who actually are on good research operations teams
Laura [0:21:17]: to make some of the things that they have to do on a regular basis easier.
Laura [0:21:23]: Right?
Laura [0:21:24]: So that is good.
Laura [0:21:25]: I think that is obviously, absolutely a good thing.
Laura [0:21:27]: Is this going to be used as an excuse to have fewer researchers? Yeah.
Laura [0:21:33]: Almost certainly.
Laura [0:21:34]: Is that a good thing for the products?
Laura [0:21:36]: No.
Laura [0:21:37]: Not in my experience, nor do I think it will be.
Laura [0:21:40]: And I don't know that the folks who are gonna be making these decisions are necessarily going to have this kind of nuanced insight into: we can replace these things, we can't replace these things.
Laura [0:21:49]: These are things that we still really want humans to do.
Laura [0:21:52]: If you use it to kinda do the repetitive, easy stuff,
Laura [0:21:57]: if you use it in that way, that's great.
Laura [0:22:00]: Sure.
Laura [0:22:01]: Absolutely.
Laura [0:22:02]: We'll see some productivity gains, but I'm worried that it is being sold to people as, again, you never need to talk to another human.
Laura [0:22:10]: You can just ask the magical machine.
Laura [0:22:12]: And we've done benchmarking.
Laura [0:22:14]: Like, we've looked at what it says versus what humans say and it is not good.
Ben [0:22:19]: And so is that how researchers make that case?
Ben [0:22:21]: Because, again, we're hearing that, like... so is that looking at benchmarks?
Ben [0:22:25]: What are some other ways that researchers can combat that?
Ben [0:22:27]: The idea that they're gonna be educating or sharpening a model that might someday replace the moderated sessions, the focus groups, the usability tests that really help them move a company forward.
Laura [0:22:39]: I think you really wanna double down on the stuff like I said that humans are very, very good at.
Laura [0:22:43]: I think the other place that we really need to double down is one of the things that we have seen over the twenty-five years that I've been watching and doing research and talking about it and all of that.
Laura [0:22:53]: One of the things that we really have seen is that things like analysis and synthesis
Laura [0:22:57]: are better for the team and for the company and the product when we do them together.
Laura [0:23:03]: Not necessarily because we come up with better insights.
Laura [0:23:06]: Not because I couldn't go off into a room and come up with all those insights on my own as a genius.
Laura [0:23:10]: Come on.
Laura [0:23:11]: You know?
Laura [0:23:11]: We could.
Laura [0:23:12]: You and I could.
Laura [0:23:12]: But we could go and get all the best insights.
Erin [0:23:14]: Hashtag Laura's got it.
Laura [0:23:16]: Hashtag I got this.
Laura [0:23:17]: It's fine.
Laura [0:23:18]: You don't have to do it.
Laura [0:23:19]: No.
Laura [0:23:19]: I'm a hundred percent joking.
Laura [0:23:21]: God.
Laura [0:23:21]: I hope everybody knows.
Laura [0:23:22]: But we do come up with better insights, but more importantly, when we all come together and look at the data and meet the users and watch the videos, we understand the insights and we internalize that and we come up with the insights together and we believe them, and we move forward and those become part of the sort of lore of the product and the company.
Laura [0:23:50]: I can't tell you how often I've had, you know, even engineers come in and sit in research sessions.
Laura [0:23:55]: And afterwards, I'll be talking to them months later, and they'll be like,
Laura [0:23:57]: well, remember when that person said X, Y, Z? And I'll be like, yes, I do remember that.
Laura [0:24:03]: And I think it's a great application of that in this feature that we're currently building.
Laura [0:24:07]: I mean, how great is that? Whereas if I were to go off into my little cave and come up with all of my genius insights and send them a deck of those insights.
Laura [0:24:17]: We've all seen what happens to those decks.
Laura [0:24:19]: Nothing.
Laura [0:24:19]: So those don't become part of the conversation.
Laura [0:24:25]: So if you really look at the places where... could a machine come up with these insights? As it turns out, no.
Laura [0:24:32]: But even if it could.
Laura [0:24:33]: Even if we got to the point where it was every bit as good as a human at doing this, and much faster, what it would give you is a list of insights, which are then very, very easy to ignore, because nobody went through the process of coming to them and agreeing about them and seeing where they came from.
Laura [0:24:55]: So there's all sorts of things with moderation, like I said, I don't think it's gonna get good enough to notice.
Laura [0:25:02]: Oh, well, I was in a usability or not a usability.
Laura [0:25:04]: So I in a contextual inquiry and I was asking somebody about something, and they pulled a piece of paper out of their desk drawer and started looking at it, and oh, that's not a thing they ever mentioned.
Laura [0:25:15]: But you noticed that, and the machine doesn't.
Erin [0:25:20]: Awkward interruption.
Erin [0:25:21]: This episode of Awkward Silenceslike every episode of Awkward Silencesis brought to you by user interviews.
Erin [0:25:27]: We know that finding participants for research is hard.
Erin [0:25:30]: User interviews is the fastest way to recruit targeted high quality participants for any kind of research.
Erin [0:25:36]: We're not a testing platform.
Erin [0:25:37]: Instead, we're fully focused on making sure you can get just in time insights for your product development, business strategy, marketing, and more.
Erin [0:25:45]: Go to user interviews dot com slash awkward to get your first three participants free.
Erin [0:25:50]: Don't thinking about this in my own when I've gone by myself and hashtag Y got it with chat or something.
Erin [0:25:57]: And in some very good at analysis right and and you'll get a lot of things that are true that you're putting through the filter of, I know what I know when is this true.
Erin [0:26:05]: Right?
Erin [0:26:05]: But what you miss is the pain of discovery.
Erin [0:26:08]: The right, like, beating your head on the, we right?
Erin [0:26:11]: And they're...
Erin [0:26:12]: Not to glorify the pain or the inefficiency, but there is something to be said too about working to find those aha and nuggets that I I think some appreciation comes from that too.
Erin [0:26:23]: And if you're choosing between, say, like, three insights that just came so easily or one that you're really had to work but what's she's gonna get used more.
Erin [0:26:33]: I wonder if there's something
Laura [0:26:35]: to that too.
Laura [0:26:35]: There's absolutely under that.
Laura [0:26:36]: And also, like I said just having it be a group.
Laura [0:26:39]: Thing where we all come to it together very much.
Erin [0:26:41]: Yeah.
Erin [0:26:41]: Yeah.
Laura [0:26:42]: But the other thing to I think about is often what it gives you is not insights.
Laura [0:26:45]: Often what it gives you a summary.
Laura [0:26:46]: Right?
Laura [0:26:47]: This is, oh,
Erin [0:26:48]: sure.
Erin [0:26:48]: Sure.
Laura [0:26:49]: Five people it's terrible accounting.
Laura [0:26:50]: So some number of people.
Laura [0:26:53]: We don't really know.
Laura [0:26:54]: Said, x y z.
Laura [0:26:55]: Okay.
Laura [0:26:55]: That's useful.
Laura [0:26:56]: That's data.
Laura [0:26:57]: Right?
Laura [0:26:57]: That's a tidbit.
Laura [0:26:58]: It's a little piece of data.
Laura [0:27:00]: The insights come when you think about how this applies to what we are trying to learn when you apply it to, how does this relate to the research questions that we had when you apply to how does this apply to the product as we know it and the team and what we're trying to do and all of those things taken together And what we have to remember is we know so much about our team, our company, our needs, the stakeholders.
Laura [0:27:31]: Everything.
Laura [0:27:32]: That information is just swimming around in our heads, and we don't even think about it because you can't ask a fish about water.
Laura [0:27:41]: Right?
Laura [0:27:41]: That's just our environment.
Laura [0:27:42]: We know so much about all of these things.
Laura [0:27:44]: The machine knows about the transcript that you just gave it and whatever prompt you gave it.
Laura [0:27:48]: So just to get anything useful out of it, you have to start to give it so much more information than you think you too.
Laura [0:27:58]: To even start to get those kinds of insights.
Laura [0:28:01]: And so is it gonna be as good as a person?
Laura [0:28:04]: The stuff that we've seen that turnout has been fairly high level.
Laura [0:28:09]: Like, patterns?
Laura [0:28:10]: Sure.
Laura [0:28:10]: Yeah.
Laura [0:28:11]: But not useful in the context of what we're trying to actually do and build.
Ben [0:28:16]: And I feel as though, or I believe that the joint idea radiation sessions.
Ben [0:28:21]: We've had many folks come on this podcast and talk about ways that you a researcher or the user experience thinker can get your stakeholders or your partners or your clients together for what you just described an interview or walking through a usability session, not just because it it emphasizes on the process and the sweat equity that goes into delivering a piece of research finding.
Ben [0:28:41]: But also, it it creates a shared understanding.
Ben [0:28:44]: If you're a a human factor engineer and you're designing a new unboxing boxing experience everyone together watching how an unboxing goes to hell.
Ben [0:28:53]: I mean, that...
Ben [0:28:53]: That's really empathize for folks so that the next time you're designing something, as you said.
Ben [0:28:58]: You have a point of reference.
Ben [0:28:59]: And I...
Ben [0:28:59]: Guarantee to your point, I do wonder about these silos of everyone sort of doing their own research.
Ben [0:29:05]: I'm using air quotes again.
Ben [0:29:07]: And then coming together and having these a contextual summaries as you call Laura, not so much an insight.
Ben [0:29:13]: Right?
Ben [0:29:13]: What what is it an insight as a a takeaway plus an opinion or sort of like, a a data point with an angle on it and that angle is really probably what's gonna make Ai efficient or not for your business.
Ben [0:29:23]: It's It's a tool, the tool can't function as well without that context.
Ben [0:29:27]: And you, the researcher or the designer or the product person you bring that context or you should be advocating for that context.
Laura [0:29:33]: Yeah.
Laura [0:29:33]: Absolutely.
Laura [0:29:34]: And and I don't want to come away with this thinking.
Laura [0:29:36]: Well, the class is just how not to use it.
Laura [0:29:39]: Although, I will be honest.
Laura [0:29:40]: There is quite a bit of that.
Laura [0:29:41]: There's a lot of, like, don't use it in this particular situation.
Laura [0:29:45]: It's not great, but there are lots of situations where it really does, like, it can make certain...
Laura [0:29:49]: Like I said, it can make certain things faster.
Laura [0:29:51]: That meta analysis.
Laura [0:29:52]: That project that you would have given to the intern that would have taken them three months over the entire summer to get one thing out of that you're, like, just go and find out how people in these fifteen studies felt about x that we didn't code for because it wasn't important to us.
Laura [0:30:09]: The machine could do that very quickly.
Laura [0:30:11]: And boys is that useful in pointing you toward new research that you might wanna be doing and also preventing you from having to do a bunch of extra research that maybe that's already covered.
Laura [0:30:23]: I mean, we all know you've run a user.
Laura [0:30:25]: You run a usability study or a user research session.
Laura [0:30:28]: You learned so much that you didn't even think you were gonna learn there's so much other stuff people just tell you.
Laura [0:30:35]: I don't know maybe.
Laura [0:30:36]: That's just me.
Laura [0:30:36]: I have people come up and tell stuff all the time.
Laura [0:30:38]: Yeah.
Laura [0:30:39]: I've one of those spaces pillars Like, here's my life story.
Laura [0:30:42]: Great.
Laura [0:30:42]: I wanted to buy a sandwich.
Laura [0:30:44]: But you do, You get all of this information from folks and you don't code for all of that because the study was about how's this feature doing.
Laura [0:30:53]: But that information still exists and you still have it, and boy is it useful and boy can it be useful later, if you're like, I think a bunch of people talked about, this, go find me all the places that they did, and you can kinda do that that overview.
Laura [0:31:08]: Again, wildly totally impossible to do at scale with the kinds of research repo that we used to have and becoming trivial and just outstanding as a tool to use.
Laura [0:31:23]: So there are all kinds of things where it's making things that were formerly impossible possible, and I would like us to focus on those things and not on the things that it kind of sucks.
Erin [0:31:34]: Yeah.
Erin [0:31:34]: Well, let's do that because I think we talked a little bit about the general kinds of things that it can help with, but maybe we could focus on tools and skills.
Erin [0:31:41]: Right?
Erin [0:31:42]: So without getting into specific tools because I know you're not here to show for any tools.
Erin [0:31:46]: Specifically, but categories of tools, kinds of tools, whether Ai has been added into them or a dedicated Ai native tools.
Erin [0:31:53]: What are you seeing being sort of, like that essential stack for research right now, maybe some optional kinds of tools, that really can be helpful for augmenting research.
Erin [0:32:04]: I mean,
Laura [0:32:05]: a lot of what we're seeing right now because we are so early is we are seeing a lot of folks experimenting more with the base models and doing...
Laura [0:32:13]: Like I said, we're seeing some of the more advanced teams create their own custom stuff.
Laura [0:32:19]: And so that's great.
Laura [0:32:21]: We're also seeing some of the stuff that's built into the big repo that those can be really helpful.
Laura [0:32:27]: If all of your information is already in one place, that's the place to go and have the good natural language processing, semantic search just work for you and be able to just hook all that up.
Laura [0:32:40]: We're seeing a lot of the big tools now.
Laura [0:32:42]: Doing the thing where it's not just videos of research.
Laura [0:32:45]: It's also, we're gonna hook it up to your Salesforce instance.
Laura [0:32:49]: We're gonna hook it up to your customer service tickets.
Laura [0:32:52]: Like, we're gonna hook up all that stuff, and then you're gonna be able to search across all that.
Laura [0:32:55]: And that's that's useful we're seeing some of the better analysis tools tend to be the ones that are custom built for research analysis because there is more of an understanding of sort of what matters and what doesn't.
Laura [0:33:11]: But all of everything I'm saying could be different tomorrow, and I just want folks.
Laura [0:33:16]: This is what I want for folks.
Laura [0:33:18]: I'm for tools.
Laura [0:33:19]: I'm gonna just kinda put my foot down and say, this is how you should approach tools.
Laura [0:33:22]: Great stop looking at every goddamn thing that shows up in your inbox and saying, should we use this or should we not use this?
Laura [0:33:28]: Is this gonna solve all of my problems?
Laura [0:33:30]: It will not.
Laura [0:33:31]: I will tell you right now it will not solve all of your problems.
Laura [0:33:34]: None of them do.
Laura [0:33:36]: They all have flaws, they all break at some point.
Laura [0:33:40]: They all have corner cases just like every other product that was uses.
Laura [0:33:43]: So think strongly about your team, think about your process, think about where the bottlenecks are.
Laura [0:33:52]: Do you need to go faster?
Laura [0:33:53]: Do you need to be more thorough?
Laura [0:33:55]: Do you have a more junior team that you need to upscale.
Laura [0:33:59]: What does better and I I did write a whole book called Build Better products where I spend a lot of time talking about What is better and what is better is better just faster because you can cut a lot of corners here and get faster if you're willing to sacrifice some quality in some areas.
Laura [0:34:20]: And on some studies, that's fine.
Laura [0:34:22]: That's perfectly reasonable.
Laura [0:34:24]: Right?
Laura [0:34:25]: But you have to sort of ask yourself, What am I trying to get out of this and then go out and look for the tools that will improve things and we'll actually solve specific problems.
Ben [0:34:35]: And we talked a bit about skills like inventory and you've given some questions.
Ben [0:34:39]: Are there other skills that, let's say in week to and on of the class or pardon instance to because you've just went through your first cohort.
Ben [0:34:47]: You're about to get your second cohort are there skills that you are gonna prioritize in this second cohort going on or things that you didn't think about like, oh, man.
Ben [0:34:55]: I really need to review this skill with folks who are looking to adopt and use Ai for research.
Ben [0:35:00]: What should be...
Ben [0:35:01]: We be on the lookout for are we thinking about re skills and Ai?
Laura [0:35:04]: One of the things that we ended up talking about and that is going into the official class because I think it is so so important is one of the things that you already asked me about, which is how do I get leadership to stealth passing me.
Laura [0:35:17]: About including Ai absolutely everywhere and replacing the entire team and blah blah.
Laura [0:35:22]: So we absolutely need to have some kind of...
Laura [0:35:26]: Like I said, we talked to about a little, but I wanna do a little bit more of a deep dive for folks so that they are comfortable saying this is what to actually say to somebody who says, well, why can't we just do a synthetic user focus group.
Erin [0:35:39]: Mh.
Laura [0:35:40]: Other than just rolling your eyes and, you know, storing out, which is always my first impulse.
Laura [0:35:44]: These are really important things that we need to provide people?
Laura [0:35:48]: We also need to provide people with the ability to say, what are we trying to get out of it?
Laura [0:35:52]: Are we trying to go faster?
Laura [0:35:53]: Or are we trying to have fewer researchers?
Laura [0:35:56]: Are we trying to do a better job?
Laura [0:35:58]: What is the goal and to advocate for the tools that work and to advocate for not the tools that don't work.
Erin [0:36:05]: Yeah.
Erin [0:36:05]: You mentioned it depends what your your goals are.
Erin [0:36:07]: Let's say, I know a lot of research teams are trying to move faster or...
Erin [0:36:12]: Maybe to put a finer point on it to move at the speed of product development.
Erin [0:36:15]: Right?
Erin [0:36:15]: So sometimes research is discovery and it's strategic and it is slow and it's meant to cover a long period of time for now, but a lot of research, there's pressure to move faster to move with the speed of product development.
Erin [0:36:27]: Are there categories of tools or use cases that you have found to be particularly effective toward?
Laura [0:36:35]: Yeah.
Laura [0:36:35]: I mean, I think we're recruiting people is getting much much much faster than it I used to be.
Laura [0:36:40]: I remembered fifteen years ago, the recruiting process could take two, three weeks and we're just not seeing that anymore, making sure that people who are writing their sort of, like, custom bots to do things it can turn out a draft research plan after asking you a few questions and you answer a few questions, and it can give you a draft that you can then edit and that can take you an hour versus maybe a day.
Laura [0:37:06]: And so we're seeing kind of places like that.
Laura [0:37:10]: There are a couple of places that we do actually recommend things like Ai moderators, where you just can't get these people scheduled because they're a possible to schedule.
Laura [0:37:20]: But if they can do it on their own time, they can do it on their own time or in cases where, like, well, we just need to interview forty people because that's what somebody said we had to do Like, okay.
Laura [0:37:32]: Great.
Laura [0:37:32]: Get your initial insights and then let the machine make interview the other...
Erin [0:37:37]: The other thirty two people.
Erin [0:37:37]: Because sometimes you need to check a box.
Laura [0:37:39]: Yeah.
Laura [0:37:39]: Exactly.
Laura [0:37:39]: Some of it is going to reduce quality.
Laura [0:37:43]: Some of it.
Laura [0:37:44]: Not all of it.
Laura [0:37:44]: Some of it.
Laura [0:37:45]: Like I said, makes things possible.
Laura [0:37:46]: Some of it is going to reduce certain kinds of quality.
Laura [0:37:49]: Sometimes as that's okay.
Laura [0:37:51]: As long as you are making that decision intentionally and not thinking my big list of Linkedin influencer prompts are going to make everything at the level of a super senior researcher.
Laura [0:38:02]: Great.
Laura [0:38:03]: None of that is true, obviously.
Laura [0:38:05]: And as long as we all admit that, we can kinda go, well, but we don't need that on this one, we kind of know what we're gonna hear and we need to do a quick check.
Laura [0:38:15]: Okay.
Laura [0:38:16]: Do what you need?
Erin [0:38:17]: And because there is so much, like, is it better than nothing?
Erin [0:38:20]: Is it and there's these, like, infinite sort of trade offs Right?
Erin [0:38:22]: In spectrum of different...
Erin [0:38:23]: There's always been different methods you could use.
Erin [0:38:26]: Has always been trade offs between speed and rigor and research Ai just adds a new vector to that in a lot of ways.
Laura [0:38:32]: Right?
Laura [0:38:32]: Fast cheap and good.
Laura [0:38:33]: Pick two.
Erin [0:38:34]: Yeah.
Erin [0:38:34]: Which do you want?
Erin [0:38:35]: Yeah.
Erin [0:38:35]: And and so I don't know.
Erin [0:38:37]: Are there things maybe to look out for on the back end of, like, have I gone too far when way or the other right fruit.
Erin [0:38:42]: So are people still using my research.
Erin [0:38:45]: Right?
Erin [0:38:46]: Have I lost credibility I made too many compromises.
Erin [0:38:49]: Right?
Erin [0:38:49]: Are there are things to kinda look out for to gauge?
Erin [0:38:52]: Am I making the right balance of of these things?
Laura [0:38:56]: Yeah.
Laura [0:38:56]: I mean, at the point that you are sort of just turning over this list of insights or whatever very quickly, but they're just getting ignored.
Laura [0:39:03]: Maybe that's a signal that we all need to do that together and learn things together.
Laura [0:39:09]: As opposed to, is the team moving on without me because I'm taking seven weeks to do a simple prototype study.
Laura [0:39:17]: Well, okay, that's also not getting used.
Laura [0:39:20]: Right?
Laura [0:39:21]: Those are both dev research, I think in a in different way.
Laura [0:39:25]: And so looking at...
Laura [0:39:28]: I mean, and I agree with you a hundred.
Laura [0:39:29]: Right?
Laura [0:39:30]: Like, we've always sort of said, can we do this faster and more scrappy, and the answer is sometimes absolutely, and sometimes really not here.
Laura [0:39:37]: Like, this this is actually gonna slow things down long term if we don't get this fundamental stuff right.
Laura [0:39:43]: So, like, let's not let people go off and build things based on vibes because as they are want to do.
Laura [0:39:50]: But that doesn't mean that we always have to do things in two days.
Laura [0:39:55]: I don't want something useless in two days?
Laura [0:39:58]: But can I get something pretty good in a week?
Ben [0:40:02]: So it sounds like Laura at the end of this first go around with the class because you wouldn't teach a class if you didn't think Ai.
Ben [0:40:07]: Was here to stay in the research process.
Ben [0:40:09]: It sounds at least to me is reflecting on this conversation today.
Ben [0:40:13]: I've hearing a lot about inventory your processes, getting comfortable with your business that's advice we've heard on this pod a lot about impact on Roi.
Ben [0:40:21]: It sounds like Ai is best thought of as another tool granted a one with a lot more ep around it and it's got this ubi that can sometimes make it feel like, oh my gosh.
Ben [0:40:30]: I have to use it or else.
Ben [0:40:31]: But just to gut check the advice generally that you're sharing.
Ben [0:40:35]: Know your business, know the research questions that you want to answer, know the insights that you need to provide and then find a way where Ai won't break too much.
Ben [0:40:43]: Imagine the researcher you're listening today who is maybe hesitant about adding Ai.
Ben [0:40:48]: What's your advice to them about maybe next week.
Ben [0:40:51]: What might they do if they wanna add Ai or think about Ai or just report with their skip level.
Ben [0:40:56]: Here's what I'm doing re Ai?
Ben [0:40:58]: To their senior leader.
Laura [0:41:01]: Yeah.
Laura [0:41:01]: There are a lot of different directions that people can go here.
Laura [0:41:04]: I would say some of it is, you know, sort of just getting comfortable with the concepts behind Ai, understanding what things they do need to look out for.
Laura [0:41:13]: Read things that are both very pro and very cons so that they know what to look at.
Laura [0:41:19]: It always makes me nervous when I hear anybody who's only reading the stuff that's very negative or very positive.
Laura [0:41:24]: Because I just...
Laura [0:41:26]: I want them to have a more balanced view of this is a place again where you could use it, and this is a place where you need to be careful and maybe use it differently or better, do figure out that, but here...
Laura [0:41:40]: But you you were spot on.
Laura [0:41:41]: Right?
Laura [0:41:42]: It's a tool.
Laura [0:41:43]: It's a set of tools.
Laura [0:41:44]: It's a technology, so it's more than just like one product, but it is a tool and we're all kind of figuring out what it's good at and what is bad.
Laura [0:41:54]: Now I will say the one thing that I just want everybody to understand about this particular set of tools, the one thing, the most dangerous thing that it is good at is it makes things in the form of other things.
Laura [0:42:07]: An Imp.
Laura [0:42:08]: Yeah.
Laura [0:42:09]: Yeah.
Laura [0:42:10]: It is very good at making a thing that looks like an insight.
Laura [0:42:14]: It is very good and making a thing that looks like desk research.
Laura [0:42:16]: Do only think it's bad at making is slides, which frustrate me to know in because I hate.
Laura [0:42:22]: Oh, my god.
Laura [0:42:23]: I can't maybe smoke so much.
Laura [0:42:25]: That is the one thing that I would pay a million dollars for.
Ben [0:42:28]: Oh, man.
Erin [0:42:30]: I have seen so many talk about those five prompts that'll change your life.
Erin [0:42:33]: I've seen so many.
Erin [0:42:34]: Like, this is how you're gonna make a great slide deck with Ai and they're trash.
Laura [0:42:38]: Man.
Laura [0:42:38]: Right.
Laura [0:42:39]: They're just just hot garbage.
Laura [0:42:40]: I'm sorry.
Laura [0:42:41]: And if anybody out there listing is like, no No.
Laura [0:42:43]: I have the one for you.
Erin [0:42:44]: So.
Erin [0:42:44]: Yeah.
Erin [0:42:45]: Let's see.
Erin [0:42:45]: And you send it
Laura [0:42:46]: to me, I will send you a box of chocolate.
Laura [0:42:49]: Like, this...
Laura [0:42:50]: Oh my god.
Laura [0:42:50]: Like, I absolutely would love it.
Laura [0:42:52]: So anyway, but with the one exception of the thing that I want.
Laura [0:42:55]: It is very, very good at making things that look like other things, which is why we have to be so careful about saying, okay.
Laura [0:43:02]: This looks like an insight?
Laura [0:43:03]: Is it an insight Is it a useful insight?
Laura [0:43:06]: Is it a true insight?
Laura [0:43:07]: Is it the same or better insight than what I would have come up with on my own?
Laura [0:43:13]: Right?
Laura [0:43:14]: So just making sure whatever you're doing, whatever you go and try it with?
Laura [0:43:20]: It's very easy to look at it and go, look at this...
Laura [0:43:22]: Look at all of this information it's given me.
Laura [0:43:24]: That's great.
Laura [0:43:25]: Just check it.
Laura [0:43:26]: Just make sure it's not hall hall or just wrong or just pulling data that you don't necessarily agree with.
Laura [0:43:33]: And in that case, Yeah.
Laura [0:43:34]: Like I said, you really have to sort of know what you're expert in and know what you need to bring an expert in on.
Laura [0:43:39]: Right?
Laura [0:43:40]: If you're learning a whole lot of stuff about, I don't know physics, maybe.
Laura [0:43:43]: I if I'm learning anything at all about physics, need to bring somebody in who maybe could spot check that.
Erin [0:43:51]: Yeah.
Erin [0:43:51]: Same thing.
Erin [0:43:51]: Believe whatever it
Laura [0:43:52]: tells me.
Laura [0:43:53]: I don't know.
Laura [0:43:53]: It could all it's all just magic.
Erin [0:43:57]: Awesome.
Erin [0:43:57]: I think that's a great place to move to our concluding the rapid fire section here.
Erin [0:44:03]: So we've talked about research questions, what is your favorite research interview question as in to ask a participant or research responded?
Laura [0:44:13]: I mean, the one that I that I used the most is just tell me more about that.
Erin [0:44:17]: Yeah.
Erin [0:44:17]: Yeah.
Laura [0:44:18]: Not to be incredibly boring there.
Laura [0:44:20]: Yeah but honestly, like, again, research questions are so incredibly contextual to whatever I'm trying to learn, but the one that I keep coming back on is just like...
Laura [0:44:28]: Tell me more about that.
Laura [0:44:30]: Right?
Laura [0:44:30]: Or my other favorite is, you know, what's x about it.
Laura [0:44:33]: Right?
Laura [0:44:33]: And they're like, oh, that's so cool.
Laura [0:44:35]: That's so interesting.
Laura [0:44:35]: That's so, what's cool about that.
Laura [0:44:38]: What's useful about that?
Laura [0:44:39]: What's interesting.
Laura [0:44:39]: And then suddenly, they're telling you things that are actually actionable and useful whereas...
Laura [0:44:43]: Oh, I love that.
Laura [0:44:44]: Why?
Laura [0:44:45]: Yeah.
Laura [0:44:47]: Then that's when you get to the interesting stuff.
Erin [0:44:50]: Yeah.
Erin [0:44:50]: Yeah.
Erin [0:44:51]: Two or three resources, you most recommend to others.
Erin [0:44:54]: I feel very strongly that I'm gonna
Laura [0:44:57]: have to recommend all of the energy stuff.
Laura [0:44:58]: One of the reasons that I joined Nielsen Army group is I like the material.
Laura [0:45:02]: I am extremely picky.
Laura [0:45:03]: I very rarely read anything by anybody with a few exceptions, but where I'm not kinda like, that's mostly true, but let me tell you all the places where I disagree.
Laura [0:45:10]: And I just felt like, they put out very solid, very useful stuff for kind folks at all the different levels.
Laura [0:45:17]: So absolutely that.
Laura [0:45:19]: I'm actually not gonna recommend my two books because they're old.
Laura [0:45:22]: You to write another one.
Ben [0:45:24]: That sounds like it's not up updated.
Laura [0:45:25]: Wow.
Laura [0:45:25]: Wow.
Laura [0:45:26]: Oh, You chose violence.
Laura [0:45:27]: No.
Laura [0:45:28]: No.
Laura [0:45:29]: Absolutely not.
Laura [0:45:30]: Thank you though.
Laura [0:45:31]: But, yeah.
Erin [0:45:31]: We're making Ai to till I don't know.
Erin [0:45:33]: What is the twenty twenty five version of a a book.
Ben [0:45:36]: Oh, jeez.
Erin [0:45:37]: I'll do the bin thing.
Erin [0:45:37]: A book book.
Ben [0:45:39]: Yeah.
Ben [0:45:39]: I'm using quotes about a book.
Ben [0:45:40]: A book is a paper based or.
Erin [0:45:44]: Well, where can folks?
Erin [0:45:45]: Follow you find you.
Laura [0:45:47]: Well, I'm on Linkedin.
Laura [0:45:47]: I'm also blue.
Laura [0:45:49]: I don't talk about design stuff there.
Laura [0:45:51]: So there's really no reason to.
Laura [0:45:52]: But, yeah, I'm on Linkedin, and I am teaching courses.
Laura [0:45:55]: I will be teaching several more of the cohorts of this over the summer, and I believe I have another class that I am not going to mention yet because it is not set in stone.
Laura [0:46:04]: It was another class that will be coming up sometime in the fall on a totally different topic Wonderful.
Laura [0:46:09]: I'm just teaching these days.
Laura [0:46:11]: That's that's all
Erin [0:46:12]: I do.
Erin [0:46:12]: Awesome.
Erin [0:46:13]: Well, we can add that new course to the show notes when it has been revealed, which we will definitely do.
Erin [0:46:18]: I think this has gonna mark, like, twenty episodes before since someone has mentioned follow me on Twitter.
Erin [0:46:23]: I think it's done.
Erin [0:46:24]: Mh.
Erin [0:46:24]: Over.
Erin [0:46:25]: Yeah.
Erin [0:46:25]: Yeah Yeah.
Erin [0:46:26]: It's Yeah.
Laura [0:46:27]: I've been off it for a couple of years.
Laura [0:46:28]: So it's
Erin [0:46:29]: it's my be the first blue sky.
Erin [0:46:29]: But I think we're...
Erin [0:46:30]: I think Twitter is officially...
Laura [0:46:31]: I, I I kinda use that for.
Laura [0:46:33]: Yeah.
Laura [0:46:33]: It's not...
Laura [0:46:34]: Which is super sad because I really liked it.
Erin [0:46:37]: At the heyday, it was
Laura [0:46:39]: super good.
Laura [0:46:39]: Was so good.
Laura [0:46:40]: Design Twitter was so great.
Laura [0:46:42]: There it
Erin [0:46:42]: was.
Erin [0:46:42]: It was.
Erin [0:46:43]: Rest and peace.
Erin [0:46:45]: Alright.
Erin [0:46:46]: Well, Laura, Thanks so much.
Erin [0:46:47]: I pleasure and excited for your future and Ng classes and the Ai one and all the great things to come.
Erin [0:46:54]: We fun to check on web.
Erin [0:46:56]: Thank you so much.
Erin [0:46:56]: It was lovely.
Erin [0:46:57]: Alright.
Erin [0:46:57]: Bye bye.
Erin [0:46:58]: Bye.
Erin [0:46:58]: Thanks for listening to Awkward Silencesbrought to you by user interviews.
Erin [0:47:08]: Be music by fragile gang.
Erin [0:47:10]: Hi there, Awkward Silencesis listener.
Erin [0:47:23]: Thanks for listening.
Erin [0:47:24]: If you like what you heard, we always appreciate a rating or review on your podcast app choice.
Erin [0:47:30]: We'd also love to hear from you with feedback, guest topics or ideas so that we can improve your podcast listening experience.
Erin [0:47:36]: We're running a quick survey so you can share your thoughts on what you like about the show, which episodes you like best, which subjects you'd like to hear more about Which stuff you're sick of and more just about you.
Erin [0:47:46]: The fans have kept us on the air for the past five years.
Erin [0:47:48]: We know surveys usually suck.
Erin [0:47:51]: See episode twenty one with Erica alcohol for more on that.
Erin [0:47:54]: But this one's quick and useful.
Erin [0:47:55]: We've promise.
Erin [0:47:56]: Thanks for helping us make this the best podcast it can be.
Erin [0:48:00]: You can find the survey link in the episode description of any episode or head on over to user interviews dot com slash awkward survey.