#156 - Change Management with Graham Gardner of U.S. Bank
Graham Gardner [00:00:00]:
Research ops, in my opinion, is never going to be out of a job, because as soon as you put everything in place, something else changes. The right tool or ecosystem or way of doing things is really dependent on not just the researchers or the leaders. It's the political landscape, the legal landscape. What are customer needs and how do they change? There are so many factors. You're kind of on a boat in the ocean trying to sail somewhere, and I think it's always going to be dynamic.
Erin May [00:00:29]:
Hey, this is Erin May.
Carol Guest [00:00:30]:
And this is Carol Guest.
Erin May [00:00:32]:
And this is Awkward Silences. Awkward Silences is brought to you by User Interviews, the fastest way to recruit targeted, high-quality participants for any kind of research. Hello everybody, and welcome back to Awkward Silences. Today we're here with Graham Gardner. Graham is the VP of UX Design Research Operations at U.S. Bank. We're going to talk about change management, leading quite a large scope at a very large organization, what that's like, and how he's thinking about building this out at U.S. Bank. So Graham, thanks so much for joining us live from Chicago.
Graham Gardner [00:01:12]:
Yeah, thanks for having me everyone.
Erin May [00:01:14]:
Got Carol here too.
Carol Guest [00:01:15]:
Hey everyone. Yeah, really excited to talk about research and research ops at a big organization.
Graham Gardner [00:01:19]:
Yes, a very big organization.
Erin May [00:01:21]:
Very big. Unfathomable to us startup people. We're at like 130 now, so you probably have a few more than that.
Graham Gardner [00:01:27]:
Well, I think the trick with big orgs is that it's a lot of small and medium orgs just kind of combined in some ways.
Erin May [00:01:33]:
Yeah, for sure.
Graham Gardner [00:01:34]:
Enough pizzas to go around, right?
Erin May [00:01:37]:
Awesome. Okay, so you're the VP of UX Design Research Operations, as we established. People are always really interested, I think, in org structures. No right or wrong way to do it; everyone's doing it a little bit differently. But maybe you could start by telling us, given your title and scope, how design, research, and research ops all sort of sit together in your org at U.S. Bank.
Graham Gardner [00:01:56]:
Yeah, before getting to where I sit, just thinking about the org structure. And you're right, every organization is structured in different ways, and there are really pros and cons, and you kind of have to experiment to figure out what works for us. We have a combined marketing, analytics, and customer experience, or CX, org. So analytics and CX kind of combined, and UX research is also part of the CX and market research world. So lots of research all combined in one, and then separate from that but still very collaborative. It's more of a matrix kind of structure, in the sense that it's not so much about hierarchy, but about who you collaborate with and who you partner with.
Graham Gardner [00:02:35]:
There's the digital and experience design side of the org, and they have the same structure in terms of pillars of business lines. So at a bank we have a simplified version: our consumer or business banking, and then corporate, commercial, and institutional banking. We both have those structures and collaborate from a research and design standpoint. Complicated, but you also get the pros and cons of both structures; matrix just makes it a little more complicated. Research operations is kind of interesting. I actually started about a year ago under one research group, so I was in a digital research experience group. And we learned pretty quickly that when you're thinking about research operations strategically, it's hard to do it in a silo.
Graham Gardner [00:03:21]:
Even if there's a lot that you can do in a silo, there are a lot of interesting levers you can pull kind of worldwide when you're thinking about strategic research operations. So I ended up shifting from being attached to one team to being part of the larger marketing analytics CX group, but in a strategy side group. And so I can think about strategic programs and projects across that entire group. Even though I specialize in research and research operations, the benefit is I also can learn about how other people are doing strategic organizational transformation stuff in the entire org, or at least the entire marketing analytics org, and apply that to research, and kind of be both broad and narrow at the same time.
Carol Guest [00:04:05]:
So this is unusual. You have a sort of research ops role within the broader strategic organization you're describing. What are some examples of the types of projects that are happening within the organization? Maybe some that are research ops, but some that are different.
Graham Gardner [00:04:19]:
Yeah. So I think about research operations as something that can be super high level. It's what the research leaders are doing. You're making decisions about how you do research, and to me that's research operations. You're translating the strategy and the vision into the actual operations and the structure. So one thing they're thinking about is what is the future of AI and research.
Graham Gardner [00:04:43]:
And so an initiative that I'm thinking about is, how do I learn from what's out in the world, experiment internally around AI, and then kind of hold the ball, or at least bring people together and facilitate some of those conversations, and try to make sure that in a heavily matrixed and complicated organization, people are learning from each other too, and we're not all just going in different directions.
Carol Guest [00:05:07]:
So we'll want to jump more into AI, I'm sure.
Graham Gardner [00:05:09]:
Yeah.
Carol Guest [00:05:09]:
I'm just curious, are there other types of change management initiatives happening within this broader function that you're a part of? Like, tell me more about some of the other types of strategic projects, because I'm curious to hear how you learn from each other.
Graham Gardner [00:05:21]:
Process restructuring is big across every kind of topic area. So marketing process improvement is big. When you're a bank, there are really important privacy concerns, and U.S. Bank is known for being an incredibly ethical bank. So we have a lot of really important, but also more rigid, rules around things. Sometimes you have to think differently about, okay, how can we redo this process in a way that's going to be more efficient, more effective, give people some autonomy, but also make sure that you're not just throwing away all that work and that you're maintaining the same high standards around compliance. So, marketing process improvement, definitely learning from how different people do that. And also, return to office is a big transition for a lot of people. I know many companies, including us, are doing multiple days a week, maybe not always five days a week, but learning how do you think about change management, how do you think about the pros and cons of that, making the best out of it, and doing it as a community.
Erin May [00:06:27]:
So tell us more about how you're thinking about what you want to change, and what is your sort of charge in this new role that you're in at U.S. Bank?
Graham Gardner [00:06:34]:
So I think my overarching mandate, or maybe more of an exciting goal, is thinking about evolving the research vendor and tool landscape, and doing that in a way where it's not just me coming in with my playbook and saying, hey, here's all the best vendors, let's onboard them all, or here's all the best tools. But really trying to be the person that can almost be a project or program lead, facilitating, bringing in different stakeholders internally and externally, so that we understand what the organization's needs are around research. What are our challenges? What are our opportunities? I'm almost a meta researcher, lowercase meta researcher, in that standpoint, where I'm researching the researchers and researching the leaders, and doing that kind of looking-in research, as we called it when I worked at IDEO, where maybe you're doing a project and the goal is to build this thing, but why don't you also learn about your capabilities as an organization to make sure that whatever you end up designing or doing is going to be successful. So I'm really trying to think of co-creating the research landscape with everybody that I work with.
Carol Guest [00:07:41]:
So it does sound very much like...
Graham Gardner [00:07:42]:
A research project, a never-ending, constant research project. It's kind of like continual listening of researchers, in the same way that a lot of companies are doing continual listening of their customers. I'm always constantly doing one-on-ones with researchers, catching up with them, seeing what their problems are, sharing ideas, getting feedback, and...
Carol Guest [00:08:05]:
So how are you structuring it? I imagine a lot of our listeners are thinking about new tools they want to add or tools they don't want to use anymore. How do you think about understanding what your researchers need, what the gaps are, all that stuff?
Graham Gardner [00:08:16]:
Totally. So I think the basic level is just, similar to the User Interviews tools maps, which I love (every year when I get a new one), to map out the environment. Plenty of people are in their team, in their area, and they see what they're using. But it's giving people a view of what else is going on in the organization: tools, vendors, processes, how things work in different verticals or horizontals or diagonals. I don't know if that's a thing yet. But giving people transparency and visibility into the different structures and ways people are doing research, and then using that as a means of starting conversation about, okay, what's working and not working for you, what are your challenges? And playing back what I'm hearing from them and from other people, so that it feels like we're building these insights about our community landscape together. And then taking that and doing my own evaluations of different vendors and tools and experimenting. One thing I did this past year was test out a bunch of tools with a group of researchers, where we actually put dummy data in different tools and tried it out, so that it's not just me saying, oh, here are the insights, here's the tool I chose.
Graham Gardner [00:09:28]:
But actually taking a group of researchers that represent different parts of the bank, bringing them along for the ride, giving them some guidance about what we are and aren't looking for, but also leaving it open-ended enough. Some people bring a 700-row spreadsheet with all the checkboxes they have to check; I'm a little less rigid than that. I'm really open to the idea that there might be opportunities with a tool, or challenges with a tool, that I don't even know we need to assess. So bringing people along and helping them, or encouraging them, to help shape the process too. I'm not the smartest person in every room, and I love that. At the end of the day, I want the design of an organization to feel like everybody's fingerprints are on it. And that's also a change management principle for me: you're going to get better buy-in if people feel like they were a part of the whole process, and it's not just leadership telling you, here's what we're going to do because it's beneficial in these ways.
Erin May [00:10:25]:
Graham, have you had a lot of experience procuring tools in the past, and these are hard-learned lessons? Or is it that this is a good way to manage change, whatever the type of change?
Graham Gardner [00:10:37]:
Yeah, I think a little bit of both. I certainly have been. I kind of call myself an amateur technologist. I love technology, and I think that also comes from just my personal interest in technology. But being in an environment at IDEO where it was less like, here's our small tool set, and more like, we can do anything we want for any project as long as it suits the needs of that client initiative, within reason. So I got exposure to a lot of different things and was constantly evaluating things in that same way, because our procurement process was a lot looser than it is at U.S. Bank.
Graham Gardner [00:11:12]:
So, exposure to lots of evaluation. But also, coming from a human-centered design background, where I think about service design and system design and change management through a human-centered design lens, I'm super influenced by, okay, what are the human needs in a change process? And that's really influenced how I approach procurement too. It's not just about checking boxes; it's about what the end result is, and the people and the process are actually going to make a bigger impact. You can't just put the perfect tool in a bunch of people's hands if they don't like it. It's more important to be good than perfect. I'd rather everybody get their job done and be happy than find the perfect tool.
Erin May [00:11:58]:
Right.
Graham Gardner [00:11:58]:
Yeah.
Erin May [00:11:58]:
And I guess on that point, while we're on this thread: you talked a little bit about the sort of continuous discovery process within this meta research of stakeholders that you're doing. So you've procured a tool. You've got some of your committee that was part of that buying process, hopefully using the tool or helping others in the org use the tool. Is there then a process of continuous monitoring of, I don't know, the implementation, the happiness with the tool? And are we thinking about, do we want to use this tool again, how do we make it more successful? Is that an ongoing process, or is it more an annual process when renewals are up, that kind of thing?
Graham Gardner [00:12:31]:
Yeah, I think there's a really great phrase I've heard in a million corners: everything is a prototype. That applies to research processes, but also to piloting something. You don't onboard something, in my opinion, and say, okay, let's move on to the next thing. It's, this is a prototype; let's see what barriers happen, what new opportunities come up while we're doing this. In some ways, it's a next phase of evaluation. You can test things all you want, but the best test is a live test: actually go and do it and see if it fails or not. And so to me, research ecosystems are going to be continually evolving.
Graham Gardner [00:13:07]:
So you can't just be like, okay, let's move on to the next thing. Research ops, in my opinion, is never going to be out of a job, because as soon as you put everything in place, something else changes. The right tool or ecosystem or way of doing things is really dependent on not just the researchers or the leaders. It's the political landscape, the legal landscape. What are customer needs and how do they change? There are so many factors. You're kind of on a boat in the ocean trying to sail somewhere, and I think it's always going to be dynamic.
Carol Guest [00:13:37]:
I always love talking to researchers about the meta research process they do in their organization. I'm imagining your benchmark survey of how all the tools are going over time, and all of that. This is jumping back a little bit, but just pure curiosity: you mentioned you're always assessing lots of different types of tools and the needs of the team. Are there any trends you're seeing now, sort of tools or needs that are emerging, or, yeah, anything like that around what tools are needed or interesting?
Graham Gardner [00:14:03]:
Yeah. So obviously, again, AI everywhere and in everything, in most things. And I think that's one of those things where it's like, okay, that's a major shift in everything, and it kind of makes you see everything in a new light. Maybe it's the rose-colored glasses; for some people, AI feels like it's going to solve everyone's problems. You can go to the far other extreme and say AI is horrible and we should have human touch in everything. But I think, also from a restructure perspective, I've seen the biggest benefits and improvements I've had in environments be around
Graham Gardner [00:14:37]:
Finding the right balance between automation (and AI is just another flavor of automation, in my opinion) and humans. So to me it's about figuring out, okay, what are humans good at and what are the robots good at, how can we work together and make sure that we have the right guardrails, setting AI up for success or setting software up for success, letting it take away some of the things we don't want to do and give us more time and space for the things that we do want to do, and kind of live together in harmony. So AI is a big trend. Other big trends: I think recruitment is something I've thought a lot about for many, many years and care a lot about. I see lots of different trends there, similar to the AI ones. The synthetic users conversation is really interesting. My gut reaction is I feel kind of skeptical of synthetic users. But from conversations that I've had with people in different industries, it seems like synthetic users can be good for some really generic kinds of research, but it's not going to be the answer to everything.
Graham Gardner [00:15:37]:
So I think it's figuring out, okay, when is it appropriate to use simulated research responses from not-real participants, if that is backed up by real, genuine, good-quality data about real people. But I think the industry as a whole has a data quality and participant quality problem. And we see that across everything; even with super high-level, expert-network-type sourcing, you're still going to have quality control problems. So it's staying ahead of that, and using both humans and AI to screen people out better. It's AI versus AI in some ways.
Erin May [00:16:16]:
I think intuitively with AI, and to your point, a faster, better version of automation, it's figuring out how to use it, how not to use it, where to include the human in the loop. How do we actually do that? How are you doing that? How does that journey work? And I imagine there are particular challenges with, of course, security and compliance when it comes to AI, and how do you test into it given some of those things. But yeah, just curious where you are on that journey, and figuring out where it fits in within your research and where it doesn't, where you want to explore and don't want to explore yet, and those sorts of things.
Graham Gardner [00:16:49]:
Yeah, absolutely. I think there's plenty of low-risk, low-hanging-fruit things like AI in automated transcription, a super no-brainer. As long as you're not putting the most sensitive data in the world out in an open area, you should be experimenting with that type of stuff. I think the far end of that spectrum, more on the synthetic users and generative AI for research side, is a little bit more like, hey, let's see other people fail first, maybe, especially if you're coming from highly regulated and more sensitive industries. I think financial services, health, any of the industries where there's potential harm to be done if you're doing things wrong with real customer data and customer experiences, you have to be really thoughtful about that, and maybe let some of the consumer, maybe retail, people fail in that area before we do.
Erin May [00:17:42]:
It's a fun public service idea for researchers. It's like where you have the regulated industries and the non-regulated ones, just sort of, could you guys fail for us and share your insights? I appreciate that.
Graham Gardner [00:17:54]:
Just pay attention to what other people are doing. It's so easy to get caught up in your day-to-day work, especially in a complex organization where there's a million stakeholders. Zooming out and being a part of the larger research and research ops communities, you learn from other people. I don't have to make my own mistakes.
Erin May [00:18:10]:
Awkward interruption. This episode of Awkward Silences, like every episode of Awkward Silences, is brought to you by User Interviews.
Carol Guest [00:18:17]:
We know that finding participants for research is hard. User Interviews is the fastest way to recruit targeted, high-quality participants for any kind of research. We're not a testing platform. Instead, we're fully focused on making sure you can get just-in-time insights for your product development, business strategy, marketing, and more.
Erin May [00:18:35]:
Go to userinterviews.com/awkward to get your first three participants free. Well, on that note, what are some of the communities that you're part of, or that you find valuable, or folks that you work with too?
Graham Gardner [00:18:46]:
Yeah, so the Research Ops community in Slack is a big one. I even had a small stint on the board there. But it's such a gigantic community, so many different needs. There's also, I don't know if it's a secret club or not, the Cha Cha Club, which is a slightly smaller, more exclusive research ops community where you can be a little more candid about what you think about this and that and the other thing. So that's been really great. There's also DesignOps Assembly, which is really great.
Graham Gardner [00:19:12]:
I like to think about, okay, beyond research ops, there's a lot of ops-ification of different disciplines that you can learn from. And I'd argue kind of the DevOps, sales ops, design ops worlds are in some ways more, quote unquote, mature than research ops. So learning from them is my analogous research: if you learn from what other people are doing and apply it to your area, you're going to have a new perspective and it'll be beneficial. There are also really cool communities, like Where Are the Black Designers?, which is great.
Graham Gardner [00:19:38]:
It's not exclusive to people who are Black, but it's just a great community that has more of an emphasis on people of color in design and research roles.
Carol Guest [00:19:46]:
While we're on the topic of research communities, you've been involved in research ops communities for some time, including spending some time on the board of the Research Ops community. Any thoughts on how this has evolved over the years that you've been involved in these communities?
Graham Gardner [00:19:58]:
Yeah, totally. I actually think back to that. There was an article, I want to say it was 2018, by Lucy Walsh, I think, when she was working at Spotify. She wrote about her experience initially being in a participant recruitment type role and realizing, okay, there are so many things that I'm doing while recruiting participants that are repeatable and scalable, and I can learn from one project and apply it to another. And it really hit me at a place where that's what I was doing at IDEO many years ago, when we were doing research more project by project. And I felt like, wow, this is what I'm doing. The benefit of working on a million things at once is that I can see, oh, let me take what they're doing and kind of restart our way of doing it using that knowledge.
Graham Gardner [00:20:44]:
And so the really, really broad knowledge across projects made me think, okay, I need to think about the operations side of this. And I shifted from, okay, I'm going to work on projects, to, hey, I'm going to scale a team, and some people are going to work mostly on projects. And the deeper you go, the more you're going to work on the internal stuff of, how can we pull levers in our systems and our processes and tools that are going to make a wider impact, rather than just help one person at a time. I see that as kind of the best transition: from design research being about serving people and helping them, to, okay, this is a service design kind of shift. And I also feel like the next level there is kind of servant leadership in some ways. I love Shirley Chisholm's great quote that service is the price you pay, or the rent you pay, for your time here on Earth. And in some ways, I'm always like, I'm a helper, but I don't want to just give you what you want.
Graham Gardner [00:21:45]:
And I also don't want to just teach you to fish; I want to reinvent how fishing happens. And I think that's the trend of research ops as a community: thinking about how you can do all of those things. There are plenty of people that do research ops and it's hardcore project and program management, and that's awesome. And there are other people that do research operations more like, hey, I'm a service and systems designer. I think that comes out really well in Kate Towsey's recent book Research That Scales: how research ops is showing up in different organizations. I think there's also another big trend around who's driving the car of product.
Graham Gardner [00:22:21]:
Is it the product manager doing the research and taking people along for the ride? Is it research or designers leading? Companies like Airbnb are super design-led and that works for them. Other companies are super product-led and research is kind of following along. I think the question is, okay, you're designing a table, who's going to sit there? It's going to be different for every company. And so how is research showing up? There's no one right way. But there's lots of great debate on LinkedIn and elsewhere about, should researchers exist? Are product managers able to do research?
Erin May [00:22:55]:
It'll all shake out, but we'll see, to your point. What I'm hearing from you, Graham, is that across all these roles, and even the different roles within your current role, you're spending a lot of time thinking about systems and how they work, with U.S. Bank being an example of one very complicated system in a very complicated world. And not just how do we make the work we're doing now better, but how do we actually change the nature of our work, get more done, and, from that first-principles perspective, how do we help the organization learn better? Right, to your point of what should be the seat of researchers at the table. I don't know what's going to get the organization learning and shipping better products and building better systems.
Graham Gardner [00:23:37]:
Maybe we don't need a table. Maybe we need...
Erin May [00:23:38]:
Yeah, like, what's a table? Totally. What else should we cover that's exciting happening at U.S. Bank or that's top of mind for you, Graham?
Graham Gardner [00:23:46]:
Yeah, I mean, there are so many fun things. It's funny you mentioned earlier a complicated organization, and yes, I'd say U.S. Bank is really complicated. But I also think small startups are complicated. You can have seven people and have really intricate, interesting personal dynamics and different goals. And part of the meta research that I like to do, and encourage all researchers to do as they start to become more senior, is to really do the meta research, really understand all the stakeholders that are in your organization. What does success look like for them? Because at the end of the day, for research and researchers and research people, your end goal isn't just to create research insights. Yeah, that's a large portion of it, but it's to make sure that your insights are a really valuable and influential input into a decision process that helps your company or your product be successful. And some people have this metric about, oh, research needs to have this much impact on the end product. But one thing we always have to remember is that if you look at that human-centered design Venn diagram around what's desirable, what's feasible, what's viable, the insight about what's desirable and what people's challenges or opportunities are is one input. And it actually might not be the most important input at the end of the day.
Graham Gardner [00:25:13]:
It might not be the determining factor in the direction of the product. And I think people can be like, oh, nobody's listening to me as a researcher, and that can feel really demoralizing. But the more people get into the process, and the more they become not just researchers of the end user (or people, as I like to call them sometimes), the more you're going to see, okay, this is part of a bigger picture and I'm part of a collaborative team or community. We're talking about the company's goals, not my goal as a researcher to just say, here's my insight, I hope it shows up on the front page of our company website. At the end of the day, you're trying to make people's lives better, hopefully in some big or small way. And there's a lot of different ways to do that.
Graham Gardner [00:26:00]:
And the more you understand how your company delivers products and services, the better you're going to be able to be part of that process.
Carol Guest [00:26:09]:
I have one more question. It's a bit of a jump back before we get into our fun rapid fire section, but one thing I'm really fascinated by that you've said is some of the parallels between a research ops function and sales ops, marketing ops, all these functions that have been separate, right, because they've evolved from separate functions, but they can learn from each other both ways. One, I imagine there's a lot that these functions can learn from research, right? The way you're approaching it, the human-centeredness, et cetera.
Carol Guest [00:26:33]:
I'm curious also, what are some of the things that you've noticed that we could learn from sales ops, marketing ops? And for someone who has those functions at their company, I don't know if you have thoughts on how to go learn that.
Graham Gardner [00:26:45]:
Yeah, totally. It's funny, because in a previous role we had an opportunity, since we were already using HubSpot for marketing (it's this whole CRM with plenty of tools), to skip trying to go out and procure a bespoke research CRM or participant database, and we ended up building our own within HubSpot. And so that really helped me immerse in this world of marketing best practices. There's so much to learn about the differences between sales and marketing and communications, and how you do outreach to people, what's successful, and then applying that to participant recruitment. When you think about the funnels of participant recruitment compared to marketing and sales, it's so, so, so close.
Graham Gardner [00:27:30]:
All you're doing is understanding what people want. How do I find a match between what I need and what they need? And salespeople are so good at that. That's one of the things I love about being in conversations with vendors, when we're just chatting about, what are your needs? They're doing research. They're doing really good research. And in some ways they understand some of the emotional keys. I'm always laughing, like, oh, these people are laughing at my jokes. They have to. They want me to like them.
Graham Gardner [00:27:58]:
Yeah. I mean, it's silly. But the human connection you make as a salesperson, in trying to understand who people are as human beings and the context of what drives their decision making. There's lots we can learn from many different areas.
Erin May [00:28:14]:
It's interesting, a little bit of a tangent, but we talk about how when researchers are gathering voice of customer from right across the org passively, part of that's getting insights from sales, and careful, careful, they're biased on the sales side. But in a lot of ways everyone's biased, of course. And also, sales is maybe the most incentivized to truly understand customers, in a way. Right?
Erin May [00:28:38]:
So maybe it's a good bias. I don't know.
Graham Gardner [00:28:40]:
Yeah, totally. And I think especially within our organization, where we have different levels of customer relationships. If you're an institutional client of ours, you're going to have a personal relationship with somebody and they're going to be your point of contact, and there are actually some interesting complexities there. How do you do research with that client? Do you create a separate relationship? Do you do research in combination with that relationship? What should be the actual primary relationship or customer experience, and how does research fit into that? I think that's also a challenge that I see: researchers will think, okay, let me create this whole entire separate infrastructure and way of communicating, and they don't think about all the other ways that other parts of the organization are communicating with customers or prospective customers. So I think that's actually one of the benefits of our org structure. You can combine more of the people that are talking to customers in one place and try to make sure that you're creating kind of rules across the board. It's never going to be perfect.
Graham Gardner [00:29:45]:
But if you think about a holistic customer experience and how, not just capital-R Research, but how are we learning about our customers' needs and listening to them, and taking all that insight and hopefully putting it together, whether it's cultural structures of sharing knowledge or literal repositories of knowledge. Another cool thing I see happening in different tools, one example is Dovetail. Traditionally it's been just qualitative research: throw your interviews, throw your transcripts in here, maybe some surveys. You see that in Marvin as well. And then a lot of those vendors are turning into, hey, why don't you also take in your voice-of-the-customer stuff here, why don't you bring in your call center feedback, let's analyze it all in the same place, because we realize, okay, those things in combination are a whole different color of understanding what people are thinking and feeling.
Erin May [00:30:41]:
Yeah, absolutely. It's going to be really exciting to see how AI helps us and what it gets wrong. But ultimately, I think there are already positive signs of it being very helpful and making some sense of all that. Awesome. Well, I think we could probably talk to you for a very long time, Graham, but we'll move on to our rapid fire to wind things down. So what is your favorite research question? And not, you know, the question for which the research is trying to get a business answer, but a question to ask in a user interview.
Graham Gardner [00:31:09]:
Yes, I love that question. And whether it's remote or in person, it's: is there something in your space that you want to just grab that you think tells an interesting story about yourself or your environment? I'm a video-on person, mostly because I'm nosy, so the more I can learn about your environment. I think it opens people up to telling interesting stories about themselves.
Erin May [00:31:30]:
I like that a lot.
Carol Guest [00:31:31]:
Okay, I have to ask a follow-up, though. What do you tend to learn from that? What do you... I don't know, I'm curious about this question.
Graham Gardner [00:31:37]:
Yeah, it's funny. I think you learn about people's sensibilities, or maybe their environment's constraints or opportunities. I can tell a lot from somebody's background. I might go a little FBI stalker on people in some ways, but I can tell, I could probably tell you your income, your age from your environment. And so the more inputs you have, the more you can learn about maybe their background, but also what's important to them, what their values are, the choice they're making of highlighting this. Sometimes somebody just grabs the stapler from their desk and they're like, this is my stapler. And what you're really learning is that they don't want to engage with that type of question, or they're shy, or maybe they're just not invested in this conversation.
Erin May [00:32:21]:
Right.
Graham Gardner [00:32:22]:
No information is information too.
Erin May [00:32:23]:
Well, Graham, now I just want to spend a little while hearing you tear me down. That would be fun. But I think we have to move on.
Graham Gardner [00:32:32]:
My secret fun fact at a party now is I applied to be an FBI special agent and I got pretty far in the process. And I was like, wow, a lot of my skills are kind of applicable, but maybe not the best fit for me. I ended up failing the physical fitness test because I have tiny arms. But don't tell anyone.
Erin May [00:32:49]:
I love it.
Carol Guest [00:32:50]:
Wow, we're learning a lot in this last section. I love it. What are two to three books or resources that you recommend to others?
Graham Gardner [00:32:56]:
Yeah, the book that I recommend to everybody, even people I randomly meet on the street, is Erika Hall's Just Enough Research. It answers all of the questions around looking in, doing the meta research, and why you shouldn't go a hundred layers deep before you realize where you're going. So yeah, that's my number one.
Erin May [00:33:16]:
Nice. We should do some meta analysis on our questions at some point. It's a good idea, logging a note. That's probably the most recommended book, I think. And it's good for beginners too, right? But it's good for everybody.
Graham Gardner [00:33:29]:
Yeah, absolutely. And one speaker and writer that I've been really interested in lately around AI, and I'm going to get her name wrong, is it Mary Gray? I think she's at Harvard and/or Microsoft. Really, really insightful around that relationship between humans and AI. I highly recommend doing a Google and listening to her talks.
Erin May [00:33:51]:
Yeah, that seems to be the person. Yes, yes, Mary Gray. We'll link it in the show notes. Last one: where can folks follow you? Are you active on LinkedIn or elsewhere, engaging with folks?
Graham Gardner [00:34:02]:
Yeah, definitely LinkedIn. Join any or all of those research ops communities and chat with me on there. I'm definitely very Slack-active. Love to just get that little tiny serotonin boost from helping people.
Erin May [00:34:15]:
Yeah, from clearing the red notifications for me. That too.
Graham Gardner [00:34:20]:
You can delete them too.
Erin May [00:34:22]:
Yeah. Awesome. Well, Graham, thanks so much for joining us today and working through the tech hurdles. We always appreciate that and happy holidays and thanks so much.
Graham Gardner [00:34:33]:
Super fun to chat with you all.
Erin May [00:34:40]:
Thanks for listening to Awkward Silences, brought to you by User Interviews. Theme music by Fragile Gang. Hey there, Awkward Silences listener. Thanks for listening. If you like what you heard, we always appreciate a rating or review on your podcast app of choice.
Carol Guest [00:35:05]:
We'd also love to hear from you with feedback, guest topics, or ideas so that we can improve your podcast listening experience. We're running a quick survey so you can share your thoughts on what you like about the show, which episodes you like best, which subjects you'd like to hear more about, which stuff you're sick of, and more, just about you, the fans that have kept us on the air for the past five years.
Erin May [00:35:24]:
We know surveys usually suck. See episode 21 with Erika Hall for more on that. But this one's quick and useful, we promise. Thanks for helping us make this the best podcast it can be. You can find the survey link in the episode description of any episode, or head on over to userinterviews.com/awkwardsurvey.