
#173 - Prototyping AI in Live Research with Bold Insight & Panasonic Well
Gavin [0:00:00]: Wrote a book not too long ago called AI and UX, and it's context, interaction, and trust.
Gavin [0:00:04]: That's what we've discussed here.
Gavin [0:00:06]: People are revealing things.
Gavin [0:00:08]: That is the context.
Gavin [0:00:09]: How you interact or...
Gavin [0:00:11]: That's gonna be a lot that goes into whether I see this as good or bad or successful, but it's all building towards trust.
Gavin [0:00:17]: And if we have that mindset of MVP, let's just get it out there and see what happens, that trust will not be assured.
Erin [0:00:29]: Hey.
Erin [0:00:29]: This is Erin May, and this is Carol Guest.
Erin [0:00:31]: And this is Awkward Silences.
Erin [0:00:33]: Silence is awkward.
Erin [0:00:37]: Brought to you by User Interviews, the fastest way to recruit targeted, high quality participants for any kind of research.
Erin [0:00:43]: Hello, everybody, and welcome back to Awkward Silences.
Erin [0:00:52]: Today we're here with not one, not two, but three guests. Very exciting.
Erin [0:00:56]: We've got Larry Becker, director at Bold Insight.
Erin [0:01:00]: We've got Gavin Lew managing partner at Bold Insight, and we also have Katie Johnson, Global Head of Research at Panasonic Well.
Erin [0:01:07]: So excited to have you all here today.
Erin [0:01:10]: We're gonna talk about prototyping AI experiences within live research.
Erin [0:01:15]: Maybe to get started, could you tell us a little...
Erin [0:01:18]: How do you know each other?
Erin [0:01:18]: How do you work together?
Larry [0:01:20]: Well, many, many years ago, right before Web 2.0, I was running e-commerce for a large consumer electronics company, and they brought a consultant in to teach us about something called user experience and usability.
Larry [0:01:36]: So this is since the late nineties, early aughts.
Larry [0:01:39]: Gavin was that consultant.
Larry [0:01:42]: And I learned so much in that engagement Gavin and I, our careers evolved.
Larry [0:01:49]: We did different things.
Larry [0:01:50]: And then about five years ago, I read a book that had a great impact on me.
Larry [0:01:57]: It was a book by someone you may have had on this podcast, Indi Young, and Indi was talking about practical empathy and the kind of empathy researchers need, distinguishing affective empathy from cognitive empathy.
Larry [0:02:13]: At the time I was freelancing, and I had great clients, but many of them were from retail.
Larry [0:02:18]: And retail folks can be really transactional.
Larry [0:02:21]: So I would be like, hey.
Larry [0:02:23]: Let's talk about empathy, and they'd be like, hey.
Larry [0:02:25]: Can't we just do free shipping again?
Larry [0:02:26]: So I knew I needed to talk to somebody, so I called up Gavin, and we reconnected, and I joined his team, and my very first client was Katie Johnson, and we started prototyping experiences from day one.
Larry [0:02:41]: So that's my origin story.
Larry [0:02:43]: At least for this trio, but I welcome other perspectives on that.
Erin [0:02:47]: Can anyone corroborate this story?
Gavin [0:02:50]: It happened, yeah.
Gavin [0:02:51]: About twenty five years ago.
Gavin [0:02:52]: Yeah. So I will say that it is the truth.
Larry [0:02:55]: I was seven Gavin was five.
Gavin [0:02:58]: Exactly. We were little kids.
Erin [0:02:59]: We all were.
Gavin [0:03:00]: So, I mean, I started a company way back in the nineties called User Centric.
Gavin [0:03:03]: Back then the concept of UX wasn't really a thing. It was more usability, maybe.
Gavin [0:03:09]: It was more human factors.
Gavin [0:03:11]: And, you know, along that way, I sold the company to a big market research firm, so I learned a little bit about market research.
Gavin [0:03:18]: What goes into a business case for new features.
Gavin [0:03:21]: And then I started bold insight, which is really primarily a research team.
Gavin [0:03:26]: So we took away the UX design part.
Gavin [0:03:28]: We said, you know, there's a lot of product teams that have that in house.
Gavin [0:03:31]: How do we really focus on understanding what are the real problem statements?
Gavin [0:03:38]: How do we explore?
Gavin [0:03:40]: How do we do formative research to make the thing that they want better and hit the mark?
Gavin [0:03:46]: And along the way, Katie Johnson called us up when she was working in a really big company and said, I have a really strange question to ask and it's all about how people can adopt new technology that doesn't yet exist.
Gavin [0:04:01]: And what are the different research techniques to do so?
Gavin [0:04:04]: And with that, over to Katie.
Katie [0:04:06]: Yeah.
Katie [0:04:06]: From my point of view.
Katie [0:04:07]: So, you know, I'm a little bit younger than both of these guys.
Katie [0:04:11]: So twenty five years ago, I was not there.
Katie [0:04:13]: Just a little bit.
Katie [0:04:14]: Just a little bit.
Katie [0:04:15]: But, yeah, you have to picture it from my point of view. I also come from human factors; I'm technically actually a dual degree engineer that's been masquerading as a consumer insights kid for the last fifteen years.
Katie [0:04:27]: But in twenty twenty, I found myself at a really weird kind of moment in time as many of us did.
Katie [0:04:35]: Right, back then.
Katie [0:04:36]: I was a new mom.
Katie [0:04:37]: I was a month into working at Google, which was wild, at what felt like, you know, the pinnacle of my career, and I realized that I needed some serious help. We were building zero to one products on the Assistant.
Katie [0:04:50]: And Google, of course, had pre-vetted vendors.
Katie [0:04:53]: Right?
Katie [0:04:54]: So you couldn't talk to anyone that wasn't vetted already at Google, which is great.
Katie [0:04:58]: I mean, they have absolutely world class vendors that they deal with.
Katie [0:05:03]: But, of course, like, every person that hires a vendor has their own style.
Katie [0:05:07]: Right?
Katie [0:05:07]: So I interviewed three.
Katie [0:05:09]: I hadn't worked at Google before.
Katie [0:05:10]: So I wanted to make sure I was doing a really good job, and I interviewed three different vendors.
Katie [0:05:14]: And the interview process that I put together was long.
Katie [0:05:17]: I think we did, like, three or four rounds of interviews with each of the three.
Katie [0:05:22]: And Gavin, and I just...
Katie [0:05:24]: It was, like, finding someone that spoke your language.
Katie [0:05:26]: Like, I needed someone who appreciated engineering the way that I do, as, effectively at this point, a rookie engineer; someone who appreciates product strategy the way that I do; and someone who appreciates, you know, users the way that I do.
Katie [0:05:38]: And so meeting Gavin, it was really, like, this is it.
Katie [0:05:42]: And so I bid adieu to the other two vendors and hired Gavin and immediately met Larry.
Katie [0:05:48]: And Larry has been my...
Katie [0:05:50]: You know, he hates the word coworker.
Katie [0:05:52]: My colleague, effectively, despite not having worked at the same company.
Katie [0:05:56]: He's been my colleague for the last five years at two companies.
Katie [0:05:59]: We have been the closest collaborators of anyone I have the privilege of working with.
Katie [0:06:03]: So it's been a really wonderful ride to get to work with Gavin and Larry, specifically, Larry, you know, in the in the trenches, at two different companies over the last five years.
Erin [0:06:13]: Well, this is wonderful.
Erin [0:06:13]: I'm so glad I asked.
Erin [0:06:14]: So that's a lot of great background for everybody as we get into the nuts and bolts of this conversation, you know, on who we're talking to and how you all know each other, and have for a long time.
Erin [0:06:23]: So alright.
Erin [0:06:24]: Let's jump into things.
Erin [0:06:25]: Alright.
Erin [0:06:25]: So you talk about bringing a prototyped-experience mindset to your research.
Erin [0:06:30]: What do you mean by that?
Erin [0:06:31]: And why is it important for AI experiences in particular?
Larry [0:06:36]: I'll take a crack at the first part.
Larry [0:06:38]: So, as a recovering marketing and e-commerce VP, I don't mind being very on the nose at first.
Larry [0:06:45]: And nothing against marketing, to be clear.
Larry [0:06:49]: So if you think about, like, from a business standpoint, how do you justify an investment in UX research?
Larry [0:06:57]: Why would you do that anyway?
Larry [0:06:59]: For me, I have always thought about it as, well...
Larry [0:07:04]: I want to reduce risk in whatever I'm building, and I want to surface opportunities in whatever I'm building, and I want to do so as cost effectively as possible, back when I was on the client side.
Larry [0:07:18]: So in cases where the technology will be built later, in cases where the app is still under development, you start sort of stepping backwards into more of a low-fi mindset.
Larry [0:07:32]: And in early, early days, like, you know, again, back when Gavin and I were five and seven, you would do paper prototyping.
Larry [0:07:39]: Right?
Larry [0:07:40]: Because you were prototyping screens.
Larry [0:07:42]: And as the surfaces that we're interested in have evolved, sure, paper prototyping is probably archaic in some ways, but that mindset of, what is the level of fidelity at which I can get maximum ROI on my research efforts, moved towards lower fidelity. And it's not just an app screen.
Larry [0:08:04]: It's not just the controls on a phone or a smart home device. There are experiences that are being imagined, that we're going to have and have begun having, with AIs, with LLMs, that are increasingly part of how we get things done and how we talk to each other.
Larry [0:08:24]: So really, on one level, and Katie is much more articulate about this part than I am, but staying in the past for a minute.
Larry [0:08:31]: What we've been doing for, like, five years in different settings is really creating these experiences with various degrees of simulation and real product, and always real participants, to understand how you optimize an experience, ideally before the most costly resources are deployed.
Katie [0:08:55]: I think that we're living in a very...
Katie [0:08:56]: I'll use not the official term, but an experiment-and-find-out season of life.
Katie [0:09:02]: There are other phrases that folks use when they...
Katie [0:09:05]: When I talk about that.
Katie [0:09:06]: Right?
Katie [0:09:07]: But we're living in a season of life where there's a lot of ability to just try things.
Katie [0:09:13]: Right?
Katie [0:09:13]: Which is really exciting. Back to paper prototyping.
Katie [0:09:15]: I think vibe coding and coding with LLMs really democratized access to the ability to build things, and that is really, really exciting.
Katie [0:09:25]: Even for folks like me who know, or learned a long time ago, how to code and are very, very bad at it, it's a beautiful time to be alive because you don't have to debug as much as you had to back then.
Katie [0:09:35]: Right?
Katie [0:09:35]: And I think that yields kind of a desire to just say, oh, ship it, and let's just see what happens.
Katie [0:09:40]: And at the same time, what's happening when we say that is we're living in a world that is no longer the world we were living in when we first said that.
Katie [0:09:47]: We grew up, all three of us...
Katie [0:09:49]: All four of us grew up in a world where product development was all about building deterministic products. It was about pushing pixels, and those pixels were gonna be the same when Erin opened the app and when Gavin opened the app and when Larry opened the app and when Katie opened the app.
Katie [0:10:03]: And so the principles, the heuristics, of product design, and specifically in UX research and usability, come from a world in which we know exactly what we're building, and we can control it.
Katie [0:10:13]: And the reality is, now we live in a world where we have ceded control.
Katie [0:10:17]: And in a way that's very scary, and in a way that's very exciting, because now, not only is the app that Katie opens different from the app that Gavin opens, because Gavin and Katie are using it differently, but the app Katie opens today and the app Katie opens tomorrow aren't the same app either.
Katie [0:10:33]: Because that app knows me differently tomorrow than it did today.
Katie [0:10:36]: Right?
Katie [0:10:37]: And so we're living in a world where, although we don't do paper prototyping, as Larry suggests, anymore.
Katie [0:10:43]: We really are doing paper prototyping in simulation. That is going to give us the answers about what stochastic product experiences feel like to folks over time, which is a completely new discipline, and we are at the dawn of being able to invent a new discipline to build products and, more importantly, understand how people use those products over time.
Katie [0:11:04]: So it's a really exciting time for the discipline if we are willing to shed the orthodoxies of deterministic product development that no longer serve us, which I think this team really, really is.
Gavin [0:11:15]: And I would add, what Katie just described, it's not the dot com era that Larry talked about, Web 2.0, or rather Web 1.0.
Gavin [0:11:24]: It's not about doing that MVP just to get acquisition numbers.
Gavin [0:11:28]: What Katie's referring to is a technology that's evolving rapidly, but it's dealing with humans and people.
Gavin [0:11:36]: So if I ask it to do something, it says, sorry, I don't know how to do that.
Gavin [0:11:40]: I'm never gonna ask again.
Gavin [0:11:42]: Even though it may figure it out over time.
Gavin [0:11:45]: So that first impression, when we think about, oh, let's just get it out there and have Mvp, you can completely destroy the market.
Gavin [0:11:53]: And you can have a lot of negative impact.
Gavin [0:11:57]: I mean, just think about siri, how that really was wonderful, but how many people use Siri today, but what's more scary is how many people use Cor ton?
Gavin [0:12:09]: For Microsoft or Bixby from Samsung.
Gavin [0:12:13]: The experience that people have on new technologies, when it doesn't always work the way you expect it, yet it evolves, can actually really ruin the situation for other companies that may have incredible technologies.
Gavin [0:12:27]: And so that's where prototyping is really important, not just to see if it works,
Gavin [0:12:33]: or if it's something people comprehend.
Gavin [0:12:35]: But can you anticipate those problem statements so you can actually have a product that's differentiated?
Larry [0:12:42]: I think that's right.
Larry [0:12:42]: And you guys are making me think of another dimension particular to the work Katie and I focused on in the last couple of years.
Larry [0:12:50]: So everything that Gavin just talked about the stakes of early interaction.
Larry [0:12:54]: Certainly, Katie and I saw that when we were working together on her work.
Larry [0:12:58]: You know, it was the Assistant.
Larry [0:12:59]: So, you know, you're gonna ask this thing to help you get things done; if it doesn't understand it or it screws up, you're probably not gonna ask it to again.
Larry [0:13:06]: However, all these things about, you know, stochastic, deterministic versus non-deterministic products, that Katie was describing so beautifully.
Larry [0:13:14]: When the stakes are what they have been in our recent work.
Larry [0:13:19]: Wellness. And forget the Gwyneth Paltrow sort of connotation that word may have, but really, like: Erin, how are you feeling?
Larry [0:13:29]: How have your last couple of weeks been, you know? What's the highlight?
Larry [0:13:34]: What's a low?
Larry [0:13:35]: How's work-family balance been?
Larry [0:13:37]: Those are the questions that Katie and I have been asking families, literally, for years now. And go back to Gavin's point about one false move.
Larry [0:13:47]: Yeah.
Larry [0:13:49]: They might not engage with the product anymore.
Larry [0:13:52]: But when the stakes are emotional well being, and maybe even physical well being, there's even more of an incentive to get it right, which is why we obsess so much over method design.
Larry [0:14:08]: And, like, as Katie and I described to you, the method we used, you know, most recently: on one level it sounds like, ha, you guys had a couple of Dixie cups and a piece of string, because it is kinda low-fi, but it is very, very carefully thought out to yield maximum insight while protecting human subjects. And, you know, spoiler alert, that is what we are going to talk about; we already spoke about it at the EPIC conference.
Larry [0:14:35]: We've all collaborated on a paper about this particular work.
Larry [0:14:38]: Especially what Katie described.
Erin [0:14:41]: You've all hit on so many of these dualities, these paradoxes, that I think about a lot when I think about AI. Right?
Erin [0:14:46]: So, like, for example, bringing it back to basics with the paper prototype, even as this technology is becoming more advanced and less deterministic all the time.
Erin [0:14:57]: Right?
Erin [0:14:57]: Like, some of those sort of first principles of humans and basic prototypes are more relevant than ever. And as this technology moves faster and faster and faster, getting it right
Erin [0:15:08]: the first time is more and more important.
Erin [0:15:10]: Right?
Erin [0:15:10]: You can't just throw stuff at the wall and see what sticks.
Erin [0:15:12]: These humans, customers, are gonna form an impression, and you want it to be a really good one.
Erin [0:15:18]: Right?
Erin [0:15:18]: Because there's this arms race of Ai happening, and that's important.
Erin [0:15:22]: So lots of good insights to get us kicked off here.
Erin [0:15:25]: I think it'd be fun to dig into some examples.
Erin [0:15:27]: So you're bringing the low-fi prototype mental model into a lot of your current research, I know.
Erin [0:15:33]: Where would you like to jump in?
Erin [0:15:34]: What stories do you wanna tell first?
Gavin [0:15:37]: Back when I met
Gavin [0:15:38]: Larry, or probably a few years after that, we were working with Verizon Wireless.
Gavin [0:15:42]: And for those who are listening, this has all been published. There was a lawsuit against Verizon Wireless, because a blind woman named Bonnie
Gavin [0:15:51]: sued and said, you don't have a phone that really works for blind people.
Gavin [0:15:57]: And of course, Verizon went to court and said, but we don't make phones.
Gavin [0:16:01]: She won the lawsuit because the jury felt that if Verizon really wanted to make a phone,
Gavin [0:16:08]: manufacturers would.
Gavin [0:16:10]: So we'd been working with Verizon on a lot of products, both in store, on-device instructions, all these kinds of things.
Gavin [0:16:16]: And the sponsors asked me to do some of this research.
Gavin [0:16:20]: We ended up doing a program of research with over two hundred people who are blind and low vision, to understand how to create an experience that didn't yet exist, for voice only.
Gavin [0:16:33]: But one of their challenges, and think about what we're talking about today, AI.
Gavin [0:16:36]: Well, back in this period, it was speaker recognition, which is a precursor of a lot of the things we're talking about.
Gavin [0:16:44]: And that is the engine itself might make mistakes.
Gavin [0:16:48]: We're testing whether or not somebody can go through the product flows, the call flows. Add John to your phone book, for example.
Gavin [0:16:56]: How do you do that without using the screen, but only voice.
Gavin [0:17:00]: So we actually would do kind of a Wizard of Oz, where researchers had a prototype; they would press buttons based on what the participant said. And it would say, please say a command.
Gavin [0:17:11]: And the person would say, call John.
Gavin [0:17:13]: And then they would press the right buttons and they would speak back out.
Gavin [0:17:16]: So we had a Wizard of Oz prototype.
Gavin [0:17:18]: But the point was, the human controlling it stood in for the speaker recognition engine.
Gavin [0:17:24]: We wanted to take that out of the loop, because if people got confused because it didn't recognize them, we wouldn't know if it was because of the engine or because of the call flow.
Gavin [0:17:35]: So at the end
Gavin [0:17:35]: of all this research, we did things where we could even see participants kind of cover themselves up, because there was something that we created, like a siren.
Gavin [0:17:43]: All these real environmental cues, to see how they would respond to it.
Gavin [0:17:47]: And at the end, about a year later, after the research was done,
Gavin [0:17:50]: I was asked to present it.
Gavin [0:17:51]: So I did all the program of research, I flew out to DC, and I asked, over lunch,
Gavin [0:17:56]: Who am I speaking to?
Gavin [0:17:57]: And they said, oh, it's just gonna be you, myself, the president of the American Foundation for the Blind, and Bonnie herself.
Gavin [0:18:04]: At the end of the talk, my sponsor handed the phone to bonnie, and she said, why don't you add someone to your phone book?
Gavin [0:18:13]: And at the end of that thirty second exercise, the phone said contact saved.
Gavin [0:18:19]: And Bonnie handed the phone back and said, and this is why I sued you.
Gavin [0:18:24]: Because it took less than a year to build that experience.
Gavin [0:18:28]: And for me, it was all about a prototype, rapidly understanding what are the things that people would say, not see and press, because it was a whole new world.
Gavin [0:18:41]: And think about what a new world AI has opened up.
Gavin [0:18:46]: There are ways to prototype, and as part of the EPIC talk, we're gonna really talk about different ways we can insert prototyping into an AI world to understand what the human interaction would be.
Gavin [0:19:01]: And are we helping to shape that design in a way that actually improves success?
Gavin [0:19:06]: Or is it about confusion?
Erin [0:19:10]: Awkward interruption.
Erin [0:19:11]: This episode of Awkward Silences, like every episode of Awkward Silences, is brought to you by User Interviews.
Speaker_4 [0:19:17]: We know that finding participants for research is hard.
Speaker_4 [0:19:20]: User Interviews is the fastest way to recruit targeted, high quality participants for any kind of research.
Speaker_4 [0:19:25]: We're not a testing platform.
Speaker_4 [0:19:27]: Instead, we're fully focused on making sure you can get just in time insights for your product development, business strategy, marketing and more.
Erin [0:19:35]: Go to user interviews dot com slash awkward to get your first three participants free.
Erin [0:19:40]: Yeah.
Erin [0:19:41]: And I'm so glad we actually went back, because now we're gonna go forward. But at the time, and even just listening to it now...
Erin [0:19:47]: You don't really think about researchers being innovative, and yet here we are with: what is the right prototype for this experience when it doesn't exist yet? And that's never been more true than right now, of course, as you think about what's the right prototype for an experience that doesn't exist yet, for these AI experiences.
Erin [0:20:06]: So Yeah.
Erin [0:20:07]: That's great.
Erin [0:20:08]: That's great.
Erin [0:20:09]: Well well, where should we go now?
Erin [0:20:11]: Where should we go in the future?
Gavin [0:20:12]: Maybe the present.
Larry [0:20:14]: Yeah...
Larry [0:20:14]: Yeah.
Larry [0:20:14]: The present.
Larry [0:20:14]: I think, Katie, what to me is really interesting, and I was kinda, like, eyes wide watching this a few weeks ago when we were doing it: think about the study we wrote the EPIC paper from last summer, and the nature of the simulation then, the nature of the human role playing, and the use of AI to feed the human role playing.
Larry [0:20:37]: And then take it to a few weeks ago, where you, like, IM me on the side and you're like, oh my god.
Larry [0:20:44]: Are we doing multiplayer?
Larry [0:20:46]: Are we doing multiplayer LLM here?
Larry [0:20:48]: We just decided. So I think that is an interesting methodology nugget, how we got there.
Katie [0:20:53]: Yeah.
Katie [0:20:53]: I think there's only so much of that we can really talk about here, but we can definitely allude to some of that stuff for sure.
Katie [0:21:00]: So, yeah.
Katie [0:21:02]: I think to me, one of the things that I keep revisiting is that we are at the dawn of a new era when it comes to how these things evolve over time and how people feel about them.
Katie [0:21:12]: Right?
Katie [0:21:12]: I mean, that's why you're seeing so many articles pop up, like, minute by minute in the Times, of people, like, having experiences that are either causing them to feel really alone, or acting on feeling really alone. Like, we're seeing consequences of personal use of these kinds of bots.
Katie [0:21:30]: We are just scratching the surface.
Katie [0:21:32]: Right?
Katie [0:21:33]: I mean, when people are using it to write their emails or things like that,
Katie [0:21:36]: it's a lot less risk.
Katie [0:21:37]: When you're using it for personal use cases, stuff changes.
Katie [0:21:39]: And a lot of the work and domain that Larry and I have spent our time in is in wellness and families and personal use cases.
Katie [0:21:46]: Right?
Katie [0:21:47]: So, like, we are talking about putting a bot right in the middle of the melee of family life.
Katie [0:21:52]: Right?
Katie [0:21:52]: Which is nuts as we know.
Katie [0:21:54]: It's nuts.
Katie [0:21:55]: There's a lot going on.
Katie [0:21:56]: And so I think one of the things that we are acutely aware of in the work that we're doing is the fact that, even though we could literally publish things and put them in the hands of users and just step away,
Katie [0:22:07]: and in some cases we do do that, really the thing that we wanna unpack, to Gavin's point about the Verizon story, is: where are we designing the experiment to give us the dependent variables that we're looking for?
Katie [0:22:20]: Right?
Katie [0:22:20]: Like, we're designing experiments with independent variables.
Katie [0:22:23]: And the nature of stochastic products means there are a lot more independent variables than there were in the past.
Katie [0:22:29]: And so, where can we make those variables dependent, or at least kind of dampen the effects of the independent variables, so that we can still get really strong signal on the dependent variables that we're actually testing?
Katie [0:22:42]: So one example that Larry was kind of alluding to before is that we know that people build relationships with LLM products.
Katie [0:22:51]: We know it.
Katie [0:22:51]: I mean, it's just a fact, and you can see it in these articles whether or not we wanna talk about it as a society or deal with it yet is a completely different question, but we know that there are relationships there.
Katie [0:23:02]: And we know that relationships are...
Katie [0:23:04]: They can be formed instantly or tainted instantly.
Katie [0:23:06]: And so there's moments like that or independent variables like that that are worth examining and thinking about, okay.
Katie [0:23:13]: We definitely wanna preserve the stochastic nature of a model, and the randomness and the excitement of interacting with this novel technology, but we don't wanna take the risk of it saying something inappropriate that would tank a budding relationship, particularly with multiple people.
Katie [0:23:28]: Right?
Katie [0:23:29]: Because if you think building a relationship with one person at a time is hard, imagine building it with multiple people. And now imagine building it with multiple people who are in relationships with each other, and you are the interloper, as the LLM, coming in. Right?
Katie [0:23:42]: To a family that has established behavior patterns.
Katie [0:23:45]: And so the nature of introducing a novel new technology that's meant to help into a rat's nest of, like, evolved family mechanics is already messy.
Katie [0:23:57]: And if that thing... you know, if Larry and I are partners raising kids together, and we have some deep-seated issues, and all of a sudden, imagine the LLM takes my side in something. We have a problem.
Katie [0:24:08]: Right?
Katie [0:24:09]: So even in the work where we could ship something and give it to people and study that behavioral data, without the context of navigating the nuances of those relationships, we're not really understanding how LLMs should not only show up, but learn from the family, observe, and show up over time.
Katie [0:24:26]: In a way, the LLMs have to be almost as sensitive as we researchers are when we are running interviews.
Katie [0:24:31]: Right?
Katie [0:24:32]: And navigating the complex dynamics of a family.
Katie [0:24:35]: And so, in the world where we were simulating that a year ago, we were managing kind of ongoing relationships with LLMs that exist out in the public sphere.
Katie [0:24:46]: So in other words, we are building a relationship between a family and, say, GPT.
Katie [0:24:50]: Okay.
Katie [0:24:51]: And we are managing and shuttling the responses back and forth, as simulators of what GPT spits out for that family and then what the family actually sees.
Katie [0:25:01]: And that helps us curate exactly what we want to go over the wire to the family.
Katie [0:25:05]: Now, a brief aside about that part of the story: what was so incredible to me in that study is that when we switched out the researcher who was shuttling the responses back and forth, they noticed.
Katie [0:25:17]: The families noticed. They said things like, who broke the LLM? Something changed.
Katie [0:25:23]: They noticed.
Katie [0:25:24]: And we're talking about copy paste and curate a handful of text.
Katie [0:25:29]: Right?
Katie [0:25:30]: And people noticed.
Katie [0:25:31]: So this is how sensitive relationship building is, and it is wildly sensitive.
Katie [0:25:36]: And Larry and I have seen that before in other experiments that we've done.
Katie [0:25:39]: People perceived that we had made changes to the model when we didn't, and they missed it when we did.
Katie [0:25:43]: So, like, there's just so much going on, and there's sensitivity in the human brain to detect these things, but it's not always one to one.
Katie [0:25:51]: Right?
Katie [0:25:52]: So we're dealing with that.
Katie [0:25:53]: In the year since we did that experiment that we're gonna talk about at EPIC,
Katie [0:25:57]: literally, the tools themselves have evolved such that they can now take in multiple comments from multiple people at once, and discern who's speaking, and spit something out that's much more holistic as a response.
Katie [0:26:09]: Like, the actual tools themselves have evolved, and therefore the nature of our simulation has evolved.
Katie [0:26:14]: And when I say that the tools have evolved... no one announced that. We found that out because we literally experimented on the fly in a session, to see if we could make it more effective, lower latency as a session, and literally discovered that the tools had evolved enough to support this new design of experiments that we were running this year.
Katie [0:26:34]: So even our experimentation method isn't rigid at all.
Katie [0:26:37]: Like, in fact, we have to consistently experiment on our own experiment to make sure that we are being thoughtful about the best, most effective way to tweak those independent variables in the most precise, meaningful way that we can, given where the technology is today.
Katie [0:26:52]: It's a wild time to be conducting experiments for sure.
Larry [0:26:56]: And my guess is, after that, a lot of listeners, like me, were going, wow.
Larry [0:27:01]: And somebody might be saying, what?
Larry [0:27:02]: So to decode just some of that, respectfully. And I think we can talk about this because this is in the published paper.
Larry [0:27:09]: Right?
Larry [0:27:09]: That's what we're talking about, at EPIC Helsinki.
Larry [0:27:11]: The method, when Katie talks about the researcher sort of acting as this filter for what we're pulling from the existing LLM, from GPT or whatever, and then interacting with a family.
Larry [0:27:22]: The family, the participants, who are consented, of course, and everything.
Larry [0:27:26]: They believe that they are interacting with a prototype product.
Larry [0:27:30]: Let's say the prototype product is named Coffee Cup or whatever.
Larry [0:27:35]: So Coffee Cup is my LLM, but really what it is, is someone like me or Katie or Gavin, supported by three people hitting, you know, GPT and everything else.
Larry [0:27:44]: Crafting responses, and this is where it gets interesting year to year.
Larry [0:27:48]: We're kind of, as human researchers, going, oh, I get where this family is at.
Larry [0:27:52]: I've been able to spend some time with them.
Larry [0:27:54]: So I'm gonna take this GPT response, and I'm gonna add, you know, my own emotional intelligence and my read of the room, and filter that through the other variables. All of that's sort of interesting.
Larry [0:28:04]: This is a lot of the work that Katie and I did at web companies.
Larry [0:28:07]: We've always done stuff like this, and this is Katie's advocacy, you know, sort of to save our race from the dark AI overlords.
Larry [0:28:15]: Katie has always maintained that, you know, humans are gonna continue to add value, and not just so you and I can have jobs, but that this is real.
Larry [0:28:23]: You know, we have a stake in that you need a human in the loop when you discover the edge of the AI technology's competency.
Larry [0:28:30]: So in the experiments that we did last year, we were like, well, what kind of human?
Larry [0:28:36]: So then we had other humans come in, and are they playing the role of an empathetic...
Larry [0:28:41]: An empathetic peer?
Larry [0:28:43]: Or are they a credentialed expert?
Larry [0:28:45]: Do we want the human-in-the-loop help to be proactively sought?
Larry [0:28:49]: Should it be introduced?
Larry [0:28:50]: So those are some of the variables that Katie was alluding to, you know, as independent variables.
Larry [0:28:55]: And now this past year, I think what we started to see is that when we were conducting these multiplayer conversations with the family members, you know, parents, teens, whatever, and we were managing conversations with the technology, we were just getting the whiff that the technology, the LLMs, were starting to keep up.
Larry [0:29:20]: Like, let's say the three of you were family, and I'm channeling responses going back and forth. GPT now seems to know, oh, well, tell Erin this, tell Gavin this, and so on, and we're...
Larry [0:29:34]: like, whoa, you can do that?
Larry [0:29:37]: So it's been an amazing moment-to-moment...
Larry [0:29:41]: Discovery experience.
Katie [0:29:43]: Well, and just to build on one last thing that you said, Erin: earlier, you talked about the idea that, like, sometimes research gets this kind of reputation of being a little bit reactive, or, like, let's just observe and quietly study and, like, take copious notes.
Katie [0:29:58]: And, I mean, we are literally with research sitting on the cutting edge of product design because we are studying the future right now and in so doing are literally inventing the products that need to be there.
Katie [0:30:12]: And not only the products, but even just, like, interaction paradigms, and it's such a cool time for the practice.
Katie [0:30:19]: And I think there are a lot of researchers who feel very scared...
Katie [0:30:22]: And not just researchers, people in general who are very scared of this technology coming for our jobs.
Katie [0:30:27]: And the reality is we all have to be curious enough to imagine that there's a version of our job that lives beyond the world in which these junior LLM bots that are junior researchers or junior accountants or junior recruiters or whatever are doing some of these more menial tasks.
Katie [0:30:43]: Right?
Katie [0:30:44]: Like, there's just so much to every one of our jobs that's actually so creative, and acquired expertise and lived experience and all of these beautiful things that will continue to be very, very human.
Katie [0:30:54]: And in fact, more of what I do now is human than what I was doing two years ago, five years ago.
Katie [0:31:00]: And so it's really exciting if you can be curious about your job changing and have a growth mindset.
Katie [0:31:06]: I think it's a really cool time to kinda think about the practice actually being much more involved in product design product development than it has been in the past.
Erin [0:31:15]: Yeah.
Erin [0:31:15]: Absolutely.
Erin [0:31:15]: I keep having this macro question as you talk, and tell me if this is something you've thought about, but you talked about this a bit before too, Gavin, of...
Erin [0:31:22]: Is this experience I'm interacting with a human?
Erin [0:31:25]: Or is it technology?
Erin [0:31:26]: Does it have a name?
Erin [0:31:27]: Is the name coffee cup?
Erin [0:31:29]: Or is it Larry?
Erin [0:31:29]: I don't know.
Erin [0:31:30]: Is this something you all have thought about or observed?
Erin [0:31:32]: Is that changing as obviously, the technology is changing, but so is our relationship to it?
Erin [0:31:38]: Yeah.
Larry [0:31:39]: I think that was one of the earliest things.
Larry [0:31:40]: You know, when Katie and I first started.
Larry [0:31:42]: And Gavin deserves some credit here because, in classic Gavin style, he was like, are you gonna do this thing with Katie, and, you know?
Larry [0:31:48]: It might just be, like, go design it.
Larry [0:31:52]: He's one of those guys, like, like they said Duke Ellington could lead an orchestra raising one finger.
Larry [0:31:58]: That felt like Gavin.
Larry [0:32:00]: And one of our other colleagues will sometimes, like, say one sentence to me, and I'm like, oh, well, now I can write a ten-page description.
Larry [0:32:05]: Thanks for that.
Larry [0:32:06]: So what Gavin set me up to discover was, you said they're not gonna know, when you do this Wizard of Oz thing, whether they're interacting with a human or a bot.
Larry [0:32:16]: And that became one of Katie's research questions.
Larry [0:32:18]: So you see...
Larry [0:32:19]: And that was, like, a year before, you know, the Thanksgiving of ChatGPT and everything.
Larry [0:32:24]: So it was fascinating: people would be convinced it was a bot for all these reasons, or convinced it was a human for all these reasons. So just something in response to...
Larry [0:32:35]: I think was part of your question.
Erin [0:32:36]: Yeah.
Erin [0:32:36]: No.
Larry [0:32:37]: For sure.
Erin [0:32:38]: For sure.
Gavin [0:32:38]: And I think we have to watch out for which one's winning.
Larry [0:32:42]: Like, that's a lot of nuance.
Gavin [0:32:44]: and it's it's honestly competition.
Gavin [0:32:46]: It's market dynamics.
Gavin [0:32:47]: People don't care.
Gavin [0:32:49]: People want, as things evolve to be more agentic...
Gavin [0:32:53]: Does it do what I want it to do?
Gavin [0:32:56]: Right.
Erin [0:32:56]: You want utility?
Gavin [0:32:58]: Utility.
Gavin [0:32:58]: It's...
Gavin [0:32:58]: We talked about the invisible computer, my god.
Gavin [0:33:01]: This thing in a perfect world would be invisible.
Gavin [0:33:03]: But it would be good enough.
Gavin [0:33:05]: So that it doesn't matter what engine is behind it.
Gavin [0:33:10]: Nobody really cares about how much, you know, CPU it costs or, what have you, what the token limit might be.
Gavin [0:33:16]: Because the reality is what gets output is what's consumed.
Erin [0:33:22]: Mh.
Gavin [0:33:23]: And we know the output, it's got incredible grammar.
Gavin [0:33:26]: It's evolved to incredible style and personality.
Gavin [0:33:29]: And in honesty, even when it has reached a token limit, you wouldn't know, because it gives an output that is so strong.
Gavin [0:33:38]: You believe it to be true even if it's at the fringe of its token limit, and it's kinda making a guess right now, but it doesn't, like a human would, give you that "I'm guessing."
Gavin [0:33:51]: It's...
Gavin [0:33:52]: It's as confident as it was when we first prompted it.
Gavin [0:33:55]: So I think people ultimately, in the next five to ten years, I really hope they don't care.
Gavin [0:34:01]: Because I don't wanna be in a world where I go, which engine am I gonna toggle to?
Gavin [0:34:06]: Because this one is really good for these things, but it's not good for...
Gavin [0:34:10]: I just don't think people are that aware.
Gavin [0:34:13]: We're not engineers.
Gavin [0:34:15]: We're just, we're kinda lazy, to be honest.
Gavin [0:34:18]: And we just want it to work.
Katie [0:34:20]: I mean, I think when you're talking about the technology, I agree with you.
Katie [0:34:23]: When you're talking about the edge case where they are going to want a human, I would say everything we have studied in the last five years would tell me that there is still going to be this wall where it's, like...
Katie [0:34:35]: And we've seen it in, like, movies, you know, the future-casting type movies, even something like Westworld, where, you know, someone gets rejected from a job in Westworld and he says, I'm sorry to ask this, but are you human?
Katie [0:34:45]: And it's obviously not.
Katie [0:34:47]: And I think there are still gonna be...
Katie [0:34:49]: If it's...
Katie [0:34:49]: When we get to the edge cases where a human does need to intervene, where someone feels like they're being lied to about being communicated with by a human,
Katie [0:34:57]: we're still gonna see that betrayal, distrust, anger, frustration.
Katie [0:35:01]: I think to your point, Gavin, the beauty of where we will end up is a place where we can naturally move between the places where it is truly more efficient, more effective, more universally supportive to have a bot help you and where you've reached the edge and you actually do need a person.
Katie [0:35:18]: I mean, if you're dealing with some kind of bank crisis, and you're trapped in a doom loop with a bunch of robots that can't answer your question.
Katie [0:35:26]: You do really need a human.
Katie [0:35:28]: Right?
Katie [0:35:28]: And getting the identification that that is the problem, the appropriate triage, and the escalation correct in something like call centers is gonna be really, really important, because in the work that even Larry and I have done, a lot of the time it's literally going to be faster, more effective, and more, you know, meaningful for folks to have a bot do it.
Katie [0:35:48]: And then there are those edge cases where, if you pretend that you are a human telling me that you understand what it's like to check your mom into hospice, and you're a bot? That's it.
Katie [0:35:59]: I'm out.
Katie [0:35:59]: I don't want anything to do with you ever again.
Katie [0:36:01]: You're not a human.
Katie [0:36:02]: You don't know what it's like to have a mom.
Katie [0:36:04]: You don't know what it's like to lose a parent, and I don't wanna hear it from you.
Katie [0:36:07]: I wanna hear it from another human.
Katie [0:36:08]: So, like, that part, I think we're still gonna see, but I agree the almost-invisible machine is the endgame that I think everyone is excited about.
Katie [0:36:18]: Because the fact of the matter, and Larry and I have seen this at company after company and study after study, is that delegation sucks and that people don't want to have to delegate.
Katie [0:36:27]: People don't want to have to delegate to a human either.
Katie [0:36:29]: You hear that when people say, like, why don't you hire an assistant or get a babysitter or get a house cleaner, and they're like, it just takes too much work to tell them how to do it.
Katie [0:36:36]: Right?
Katie [0:36:37]: Right?
Katie [0:36:37]: Like, as we get to the point where we don't need as much delegation, the machine is going to be more effective, more efficient in most cases.
Erin [0:36:44]: What's the paradigm for, like, past assistant? It's a what?
Erin [0:36:48]: It's air.
Erin [0:36:49]: How are we getting AI assistance and help when it's omnipresent and not delegated?
Larry [0:36:54]: I don't know.
Larry [0:36:54]: Is it gonna be, like, there's something on your skin doing the detecting?
Erin [0:36:58]: We don't know what the dominant UI is going to be.
Erin [0:37:01]: Right?
Larry [0:37:02]: There's something in Katie's response to Gavin just now that made me think, too.
Larry [0:37:07]: And I think the example you gave, Katie, alludes to it.
Larry [0:37:11]: Some of this is driven by, well, two things. One is stakes.
Larry [0:37:17]: You know, the level of stakes of the task at hand and also domain because...
Larry [0:37:23]: And again, as soon as you get closer to anything involving someone's well-being, they're going to pay more attention to the texture of the interaction.
Larry [0:37:37]: If I'm purely in an assistant model, let's say lower-stakes assistance, so we're not talking about anyone's product in particular, if we're in that...
Larry [0:37:45]: Yeah.
Larry [0:37:45]: You know, fulfillment means a lot.
Larry [0:37:47]: We saw that a lot in our research.
Larry [0:37:48]: I don't really care.
Larry [0:37:50]: Can it do it or not?
Larry [0:37:52]: What? The bot can't pick up my dry cleaning?
Larry [0:37:54]: I am not interested in you, and things like that.
Larry [0:37:56]: But then, with the work in the domain we're working in now, you see the nature of the interaction, the qualitative ability to hold a conversation, you know, that goes beyond the superficial or the sycophantic.
Larry [0:38:14]: That's it.
Larry [0:38:15]: Mh.
Larry [0:38:15]: I was reading something about the sycophantic nature of AI, and it's true, but, you know, this sycophancy might get you through some business card exchanges, and that's what you're about, but it's not gonna help you form a real relationship with a family.
Larry [0:38:29]: So then I think we're also gonna see how, once we're in a more relational, less transactional realm, there may be a little more forgiveness, too, for the quality of the execution, when people's health is really beginning to depend on the relationships they develop with these AIs.
Gavin [0:38:52]: And I think what we're all hearing about is not just an evolution of the technology. Just think about what we've just talked about: we heard a lot about transactions, but if the ask and the impact are higher, then we start to get to the fringe cases.
Gavin [0:39:07]: All of this is describing a vast ecosystem that is based on transactions, the ones we talk about getting right in a Wizard of Oz.
Gavin [0:39:17]: Getting those right, but not just because we get to check the box.
Gavin [0:39:21]: We're building trust.
Gavin [0:39:23]: And humans filter.
Gavin [0:39:25]: We will filter what we think is gonna make us more successful because we're kinda lazy.
Gavin [0:39:31]: We're not gonna ask something we don't think it can do because I can't get that five seconds back.
Gavin [0:39:35]: So it's gonna be gradual: transactions that start to push the envelope, that build trust.
Gavin [0:39:43]: And then, while we're doing this, the technology underneath it is changing.
Gavin [0:39:49]: And as you get closer to those fringe cases, how do we understand our own technology without assuming it? And that's research.
Gavin [0:40:00]: That is understanding where those fringe boundaries are so that we can anticipate, and then start to work with engineering to say, look at what we're seeing; here are some bad examples.
Gavin [0:40:11]: These are really good examples that pushed and supported that transaction, because it's all input-output, but it's really not.
Gavin [0:40:18]: It's really trust.
Gavin [0:40:19]: We wrote a book not long ago called AI and UX, and it's context, interaction, and trust.
Gavin [0:40:25]: That's what we've discussed here.
Gavin [0:40:27]: People are revealing things.
Gavin [0:40:28]: That is the context.
Gavin [0:40:29]: How you interact or...
Gavin [0:40:31]: That's gonna be a lot that goes into whether I see this as good or bad or successful, but it's all building towards trust.
Gavin [0:40:38]: And if we have that mindset of MVP, let's just get it out there and see what happens, that trust will not be assured.
Gavin [0:40:46]: And I think there's a lot of money being spent
Gavin [0:40:48]: that could be wasted, when you might have been really close,
Gavin [0:40:51]: had you done research to understand those edge cases so you can build a better experience.
Erin [0:40:58]: Yeah.
Erin [0:40:58]: I think that's a great place to transition into maybe some practical advice for anyone listening.
Erin [0:41:03]: We've talked about, you know, some specific stories and examples of lo-fi prototypes, but sort of parting thoughts for folks looking to bring this concept of a lo-fi prototype,
Erin [0:41:14]: maybe new kinds of prototypes,
Erin [0:41:15]: to new technologies that don't exist yet. What's your best advice for how folks can think about that in their own work?
Katie [0:41:23]: From my standpoint, I think it's just get curious about the data.
Katie [0:41:26]: I mean, if all you have... let's assume you're living in a world, because most people are right now, where people are just like, let's just ship it and see what people do with it.
Katie [0:41:35]: Right?
Katie [0:41:35]: Like, I think getting curious.
Katie [0:41:36]: I can't tell you how much of my day is spent just getting curious about the data.
Katie [0:41:40]: Like, oh, why did we see such low engagement on this particular thing that happened on this day?
Katie [0:41:47]: Instead of saying, like, oh, we had low engagement.
Katie [0:41:50]: That's it.
Katie [0:41:51]: Right?
Katie [0:41:51]: It's getting curious about the data.
Katie [0:41:53]: And then figuring out how to make the research fit that question.
Katie [0:41:56]: So in that case, let's say you are in the world where you're shipping and you don't have the time, investment, money to do, like, a big, true longitudinal study, which is, of course, how I would recommend doing this if you had infinite time and energy and money and buy-in and all these things.
Katie [0:42:11]: But, like, even in the world where you don't, and all you have is this data coming in, get curious.
Katie [0:42:15]: What you can do is intercepts; what you can do is really surgical survey instruments.
Katie [0:42:21]: And, like, you can do straw pulls.
Katie [0:42:23]: You can get on the phone, like, you can go get the chat logs and figure out how to get the chat logs with privacy consent and all those kinds of things.
Katie [0:42:29]: Right?
Katie [0:42:30]: Like, in an organization, so that you can actually start to unpack why people are doing the things that they're doing.
Katie [0:42:34]: In fact, I just had a meeting with some researchers this morning where we talked about, like, okay.
Katie [0:42:39]: If you're looking at just the data, there's gonna be some things that you can factually, actually know, and there's gonna be a lot of stuff that becomes the new hypotheses
Katie [0:42:47]: around, like, oh, why did people do that thing?
Katie [0:42:50]: It looks like all of these conversations about this topic end very quickly.
Katie [0:42:54]: Why?
Katie [0:42:54]: Why is that?
Katie [0:42:55]: It's weird.
Katie [0:42:56]: It could be because the LLM is not responding the right way.
Katie [0:43:00]: It could be because we have a transactional instead of a relational model.
Katie [0:43:02]: It could be because we have too many people chatting in the same space, whatever.
Katie [0:43:06]: But these hypotheses then become ways for us to get really curious about how to do the quickest, dirtiest, you know, zero-to-one experiment to get in there, and surgical, to justify, to understand.
Katie [0:43:18]: Larry and I...
Katie [0:43:19]: One other thing I'll just say before I'm sure Larry will build on tons of this.
Katie [0:43:22]: But even as recently as two weeks ago, Larry and I were having a conversation about the subjectivity of the practice.
Katie [0:43:29]: And I think we, as researchers, perhaps did ourselves a disservice all this time making all these comments about ourselves being objective witnesses and all this kind of stuff.
Katie [0:43:37]: Right?
Katie [0:43:37]: There's a great deal of subjectivity in the practice.
Katie [0:43:39]: And in fact, that subjectivity is what makes it human.
Katie [0:43:42]: And what's going to endure is a lot of the subjectivity in the design of experiments, in the storytelling, and all these kinds of things.
Katie [0:43:49]: And so one thing Larry and I were even talking about two weeks ago was, as we design experiments, what's the most biased version of this experiment design
Katie [0:43:57]: that we could possibly write?
Katie [0:43:59]: So we were writing a mod guide, specifically for interviews.
Katie [0:44:01]: And we were writing the very, very biased versions of the questions.
Katie [0:44:04]: The most leading, obscene questions that anyone would ever write.
Katie [0:44:09]: And then we backed that into, like, how could we do this as objectively as possible?
Katie [0:44:13]: Right?
Katie [0:44:14]: And so even just bringing subjectivity to the practice at all points, for the purpose of designing the experiment in the most meaningful way in a world where we have too many independent variables: let's get really surgical about what it is we're trying to learn, so that we can design the experiment to be as precise as possible and get an answer that we can at least trust enough to design the next experiment.
Gavin [0:44:37]: Brilliant.
Gavin [0:44:37]: Totally brilliant.
Gavin [0:44:39]: I love it.
Gavin [0:44:40]: Brilliant, mostly because it is surgical, and every diagnosis has a different treatment plan.
Gavin [0:44:46]: And let's be smart about it.
Gavin [0:44:49]: It's too easy to fall prey to, hey.
Gavin [0:44:51]: I wanna know, if I ask this prompt, does it get what I want?
Gavin [0:44:55]: You can fall into that trap and be very transactional, but there are a couple of techniques Katie talked about that have a little bit of a longer tail, a little bit more of a relationship side. And suddenly you start to get that feedback where the participant says, wait a second.
Gavin [0:45:09]: And that wait a second, that insight, gives you a signal that there's something deeper behind that one transaction when taken as a whole.
Gavin [0:45:19]: Here's how they're thinking.
Gavin [0:45:21]: Because remember, it's not just the technology.
Gavin [0:45:24]: It's the people who are using it.
Gavin [0:45:27]: They're also adapting and they're getting smarter.
Gavin [0:45:29]: What are they thinking? That's the next-order thinking that you need to think about for your product.
Gavin [0:45:36]: It will only come with a couple really tried and true techniques.
Gavin [0:45:40]: It may require surgical targeting based on what your hypotheses are. And, I don't know, you can talk about dependent measures and all these things, but she's trying to bring a lot more precision into this black box that is AI.
Gavin [0:45:54]: And the more you can do that, the more assured we'll be that this is the right direction.
Gavin [0:46:00]: And it's not just transactions A, B, and C; it's where those participants are, to see the trend that you might not be seeing because we're so focused.
Larry [0:46:09]: I think that's right.
Larry [0:46:10]: And coming back to Erin's question of what techniques we would give people.
Larry [0:46:14]: With what Katie described and what Gavin responded to, I think the real techniques are meta-techniques; they're modes of critical thinking.
Larry [0:46:23]: Because when I was listening to Katie and thinking yeah.
Larry [0:46:27]: That's what we do.
Larry [0:46:29]: And I came from an e-commerce and marketing background that was deeply influenced by early work I did with Gavin, and I just made myself the user advocate in every organization I was in.
Larry [0:46:40]: But the first CEO I ever worked for, you know, said it's all about asking the right questions, and look what I ended up doing in my current job.
Larry [0:46:48]: You're asking people questions.
Larry [0:46:49]: And what I heard in what Katie was saying is you have to think about that laddering up and down with why and how, because product managers are gonna bring you some very, very specific questions.
Larry [0:47:03]: And that could... well, it depends what questions.
Larry [0:47:05]: Because if they want to know all about a feature, I think a lot of researchers could do well, with their best soft skills, practicing high diplomacy to find out why that product manager wants to know about that feature.
Larry [0:47:20]: And then iterating why will get you to the root question. And Katie also talked about precision of what you're looking for; to me, that's having very clear research objectives for every engagement.
Larry [0:47:34]: And Katie and I fight, like, crazy people about that.
Larry [0:47:38]: On every project when we scope: like, what can we really learn here?
Larry [0:47:43]: But then we agree: here's what we're trying to learn here.
Larry [0:47:47]: Because the really deep and encompassing precision that Gavin was describing is arrived at incrementally, through asking questions methodically and seeing where it leads us next.
Larry [0:48:01]: The other thing that I heard in what Katie said is something that I learned from a book called The Inner Game of Music thirty-five years ago. It comes from The Inner Game of Tennis.
Larry [0:48:11]: And Katie was talking about how do we make the most subjective mod guide in the world so we know that's wrong.
Larry [0:48:17]: You know, a lot of musicians, when they're performing... and Katie's a musician too.
Larry [0:48:22]: I don't wanna put her, you know, on the spot fully.
Larry [0:48:25]: You're playing and you think, oh my god.
Larry [0:48:27]: I really sound like crap tonight.
Larry [0:48:29]: Not productive.
Larry [0:48:30]: Right?
Larry [0:48:31]: Just But if you know, okay.
Larry [0:48:33]: What's going on?
Larry [0:48:34]: Am I too loud?
Larry [0:48:35]: Am I too soft?
Larry [0:48:37]: Am I too sharp? Am I too flat?
Larry [0:48:39]: And just sort of breaking it down.
Larry [0:48:42]: And if you look at the complexity of the topics that we discussed, at least my little technique has been asking those very simple questions and working with people. You have to have a team that you really learn to trust and that will let you do things.
Larry [0:49:03]: Because, honestly, without blowing any smoke, the only reason that Bold Insight, and that Katie and I, have been able to do this for five years is because Gavin has always created this environment for us. He's like, yeah.
Larry [0:49:16]: You know, I trust you, you got it.
Larry [0:49:18]: You guys are doing really interesting things here.
Larry [0:49:21]: And Katie and I have been able to do that for each other.
Larry [0:49:24]: So it may sound really kumbaya, but to do groundbreaking research, or whatever it is we're talking about, you need that sense of community.
Larry [0:49:33]: I find it really important, and I think it's what's contributed to our success, for the three of us.
Katie [0:49:39]: I will add one last thing about working with Bold Insight from the client perspective. One of the beautiful things about my job, and the job that I get to play with Larry in our day-to-day, is that although I really enjoy being a researcher, and I'm a good researcher, a pretty good researcher,
Katie [0:49:55]: I would say, like, you know, I get to watch Larry actually do research now, and, you know, I very rarely get to moderate anymore because he's so good at moderating.
Katie [0:50:03]: But the reality is that it allows me, and I think this is good advice for the researchers too to step into the shoes of the product manager.
Katie [0:50:10]: Right, which is a really fun place to be without all of the heartache of actually taking a job in real life.
Katie [0:50:18]: But to be able to be thinking about designing the experiment to get the answer, to make the best decision we possibly can with the certainty level that we possibly can. And to be working with an agency in Bold Insight... people ask me a lot, like, why do I go back to Bold Insight?
Katie [0:50:31]: And the answer is that I get to play the product manager because I believe that they will hold the line on the research for me.
Katie [0:50:37]: If I push them to draw a conclusion, that I want to draw because we gotta make a call.
Katie [0:50:43]: Right?
Katie [0:50:43]: We gotta make a call.
Katie [0:50:44]: We gotta ship a change, and that's the job.
Katie [0:50:46]: Right?
Katie [0:50:47]: I trust that Larry will let me walk right up to the edge of that conclusion.
Katie [0:50:50]: And if it is inaccurate or not substantiated, he will not let me do that.
Katie [0:50:54]: And it is a really beautiful place to be able to be a researcher and play the research game with Larry when I'm able to step into those shoes.
Katie [0:51:01]: And then on other days come back and, like, really fight to make a report, like the one we're writing right now, actually land and have actual impact in a world where everybody is looking for places to cut.
Katie [0:51:13]: Right?
Katie [0:51:13]: Where research is delivering impact, because I can step into the shoes of the product manager all the way up to our leadership, and Larry can hold the line on the integrity of the research, which is a really beautiful kind of game for us to play too.
Katie [0:51:26]: Yeah.
Erin [0:51:27]: I think the unintended theme of this episode has been trust, which is such a beautiful thing as we talk about Ai and trust among you all and the importance of trust between these technologies and the people on how that's ultimately gonna make it all successful.
Erin [0:51:41]: So that's been a a fun surprise for me and all of this.
Erin [0:51:45]: So thank you all for joining.
Erin [0:51:47]: Where can folks find you all online?
Erin [0:51:49]: Or not online, or wherever you want to be found or not found.
Larry [0:51:54]: Websites, as well: bold insight dot com, certainly, for us, and we have LinkedIn profiles under our individual names, of course.
Larry [0:52:03]: Katie, any other sort of channels for finding us?
Katie [0:52:08]: I have been quitting everything online.
Katie [0:52:10]: I'm only on Linkedin.
Katie [0:52:12]: You know, I used to be very active on, like, Twitter and Instagram, and I really just... my motto for twenty twenty five was no more notifications.
Katie [0:52:19]: So I deleted all notifications from my phone for twenty twenty five, which has been a huge, like, freeing experience.
Katie [0:52:24]: So Linkedin is the primary way to get a hold of me these days.
Erin [0:52:28]: Awesome.
Erin [0:52:28]: Get her at the right time because she's not checking those notifications.
Erin [0:52:31]: That's right.
Katie [0:52:33]: Honestly, it's been the single best gift I've ever given myself, because I am one of those people... I'm an inbox-zero kid, and the red dots on my phone
Katie [0:52:41]: were controlling my life.
Katie [0:52:42]: And so now there are no red dots on my phone.
Katie [0:52:44]: And the world hasn't collapsed.
Katie [0:52:46]: So it's been a really great gift.
Gavin [0:52:49]: But feel free to reach out.
Gavin [0:52:50]: Put some more red dots on her little inbox.
Gavin [0:52:53]: Because we love to geek out, and we love talking about applying what we learned in different areas.
Gavin [0:52:59]: Happy to have people reach out.
Larry [0:53:01]: Hundred percent.
Gavin [0:53:02]: Again, bold insight dot com, or LinkedIn individually; reach out.
Gavin [0:53:05]: I'm sure we can have a really interesting conversation.
Larry [0:53:08]: I think the email is hello at bold insight dot com.
Larry [0:53:11]: Right?
Larry [0:53:11]: Yep...
Gavin [0:53:15]: And they can see our page and email us from there as well.
Erin [0:53:18]: Great.
Erin [0:53:18]: Yep.
Erin [0:53:18]: Thanks everybody.
Erin [0:53:19]: This was a lot of fun.
Erin [0:53:20]: Thanks for listening to Awkward Silences, brought to you by User Interviews.
Erin [0:53:30]: Theme music by Fragile Gang.
Erin [0:53:32]: Hi there, Awkward Silences listener.
Erin [0:53:44]: Thanks for listening.
Erin [0:53:45]: If you like what you heard, we always appreciate a rating or review on your podcast app of choice.
Speaker_4 [0:53:52]: We'd also love to hear from you with feedback, guest topics or ideas so that we can improve your podcast listening experience.
Speaker_4 [0:53:58]: We're running a quick survey so you can share your thoughts on what you like about the show, which episodes you like best,
Speaker_4 [0:54:03]: which subjects you'd like to hear more about, which stuff you're sick of, and more, just about you.
Speaker_4 [0:54:07]: The fans have kept us on the air for the past five years.
Erin [0:54:11]: We know surveys usually suck.
Erin [0:54:13]: See episode twenty one with Erika Hall for more on that.
Erin [0:54:15]: But this one's quick and useful, we promise.
Erin [0:54:18]: Thanks for helping us make this the best podcast it can be.
Erin [0:54:21]: You can find the survey link in the episode description of any episode or head on over to user interviews dot com slash awkward survey.