#142 - Quantifying Research Impact with Ruby Pryor of Rex
Erin May [00:00:03]:
Hey, this is Erin May.
Carol Guest [00:00:04]:
And this is Carol Guest.
Erin May [00:00:05]:
And this is Awkward Silences. Awkward Silences is brought to you by User Interviews, the fastest way to recruit targeted, high-quality participants for any kind of research. All right, we've got a much-anticipated, long-awaited episode here with Ruby Pryor, all about research impact. I just know everyone's thinking about this right now, so I think folks are going to get a lot out of this episode at any level of research, any kind of organization. Just some great practical tips to think about research impact qualitatively and, of course, quantitatively, too.
Carol Guest [00:00:46]:
Yeah, absolutely loved hearing about it. You'll hear all about impact at different levels of the organization, qual and quant, working with stakeholders. So definitely top of mind. And I learned a ton. Check it out.
Erin May [00:00:56]:
Hello, everybody, and welcome back to Awkward Silences. Today we're here with Ruby Pryor. Ruby is the founder at Rex. Rex is a UXR and strategic design consulting firm. Today we're going to talk about something that I think is on just about everybody's mind. It's certainly on mine. I know it's on yours. We're going to be talking about impact and how do you measure the impact of research? So giddy up.
Erin May [00:01:22]:
This is going to be chock full of lots of interesting insights for everybody. So really excited to dig in. Thanks for being here with us.
Ruby Pryor [00:01:29]:
Really happy to be here.
Erin May [00:01:30]:
We've got Carol here, too. Hello, everyone.
Carol Guest [00:01:33]:
Very excited to dig into this topic. Definitely top of mind in UXR, of course, and then as a product manager in the world, it's always close to my heart.
Erin May [00:01:41]:
Awesome. All right, so 2023 was a big year for research. We saw a lot of change. We saw some layoffs, and we saw a lot happening in the market. The market wasn't going so great in a lot of industries. And so I think that resulted in a lot of people starting to ask important questions that maybe we should have been asking before, and certainly many people were, but really came to the forefront. So you're an expert on this, on how to think about the impact of research, and I think maybe a good place to start would be to think about what questions should researchers be asking to think about their impact qualitatively. So we'll get to the quant, but how do you think about the impact of research in your organization?
Ruby Pryor [00:02:28]:
Yeah, I think this is a really great question, and I just totally echo your comments around it. It's never been more important to have this discussion than it is right now. The market is still going through so much flux; I don't think 2024 has really calmed down yet. So I think this conversation is still really critical. For me, I think we need some kind of a framework to tackle this when we think about the impact that UX research makes. And when I started looking around online for these types of frameworks, a lot of them, I would say, were more focused on UX rather than UX research specifically. And so I thought about how we can conceptualize the different levels of impact that I've seen my work make over the course of my career. And I came up with this kind of four-level framework to think through this impact.
Ruby Pryor [00:03:14]:
So I think the most foundational layer for any UX researcher is at that insights layer. Like, are you producing new and critical insights about your users for your organization? That's absolutely table stakes, because if we're not doing that, it's very difficult for us to have any other kind of impact. We are really the custodians, in a lot of cases, of creating and producing these new insights about our users. So I think that's the first level of impact we should seek to make: are we moving forward the organizational understanding of our users? One level up from that, I think we can consider optimization. So the products that we already have, how are we making those better? What are the ways in which we're tweaking those products, improving them, to see that what's already there is performing better than it was before? One level up from that, I think, is about prioritization at that product and feature level. So are we helping to shape decisions around what an organization chooses to build or not build, or a feature to prioritize or not prioritize? And this layer, I think, is really important to start having more of that strategic impact, because if you're influencing the choices an organization is making at that product level, that's a pretty strategic impact from your research. But I think there's even one level greater than that.
Ruby Pryor [00:04:32]:
So I think the apex of the kind of influence you can have as a UX researcher, or the impact you can have as a UX researcher, is at that strategy level. And by strategy, I tend to mean things outside of product strategy. So business strategy, corporate strategy, these kinds of strategic questions that are not just about this single product or this single flow, but more about this organization. Who are we? Where are we headed? What markets do we play in? What problems do we solve? Where are we credible? What does our brand mean? These are the really big and meaty strategic questions that UX research can absolutely have a role to play in, and I think that's almost the highest level of impact that we can have.
Erin May [00:05:12]:
Awesome. And in these pyramids of sort of maturity, moving up the pyramid, strategy is often at the top. Trying to think of what goes above strategy; somebody will think of something. So that's really helpful. So we've got this four-piece pyramid. We've got insights: are you generating insights? I always call insights sort of like the currency of research. Like, this is the thing: you actually produce insights.
Erin May [00:05:34]:
So did you produce any of them? Just output. What's your output? Then optimization. Have you made something that exists better? Not something new, but we have a product, we have stuff, we've made it better, and there are ways you can probably measure some of those. Then we talk about actually informing the product roadmap: actually creating new things and influencing what gets created, what we're spending our time on, our investments. And then we get to that strategic layer, so not just product, but the entire business.
Ruby Pryor [00:06:01]:
Yeah, I think that's right. And I think if there was anything higher than strategy, it would be like vision, which is about this real big-picture stuff of where the organization is going, which UX research can also play into.
Carol Guest [00:06:12]:
And it sounds like you're saying as you move up this pyramid, you have more and more impact. That certainly resonates. I assume that doesn't mean that every UX researcher should be assuming that they'll work on strategy; I think there's probably only so many strategy projects. So I guess I wonder, one, is that right? Like, do we assume there's still a distribution? Or should UX researchers really be advocating to focus on strategy, even if they have previously maybe been focused on optimization within their organization?
Ruby Pryor [00:06:38]:
I think that's a really great question. I think the answer for me is, as a team, I would really love to see UX research present at all of these levels, but that doesn't necessarily mean that every individual inside of that team needs to be present at every level. The reality is strategy in particular, if we're actually talking about things like business strategy and corporate strategy, is very much revered across an organization. This is where I spent probably the majority of my career; I'm an ex management consultant. I spent two years at BCG as well. And working on strategy is something everybody wants. Your data and analytics folks want it, your sales folks want it, your product managers want it.
Ruby Pryor [00:07:17]:
Everyone wants to get up into that strategy level. So it's a really hotly contested space. And UX research, alongside all of those other disciplines, needs to get a seat at the table, so to speak. And it's not as simple as just saying, we're really valuable, pick us. It's quite a contested space to get into, and there are so many folks in an organization that really want to play at that level. I do think we have a unique perspective and unique insight that we can bring in terms of that deep qualitative understanding of customers and markets. But working at that strategy level is challenging, right, to get up there. And you add in that other element that often organizations will bring in external consultants to do that work as well.
Ruby Pryor [00:07:55]:
And so then if you're internal, how do you collaborate with folks like that? There are challenges to ascending up that pyramid. I would also say that being a highly specialized researcher working at that optimization level is, I think, incredibly valuable. If you are an absolute expert at doing usability testing and eye tracking and these types of disciplines, and really can help teams optimize their products, that's invaluable. And if you can work with data on top of your qualitative expertise, I think teams would be chomping at the bit to have you as a member. So I think picking an area and specializing is also not really a problem in terms of career growth. As long as the organization recognizes that you truly are an expert, I think it's fine to pick and choose what makes sense for your own career.
Carol Guest [00:08:43]:
I think a lot of the stories we hear about the highest immediate impact from UXR tend to be at that optimization level, like a copy change that all of a sudden had users interact in a very different way with a product. So it's very much that quick, quantifiable impact that maybe was relatively cheap to build. I'm curious, going back to the strategy level, how you talked about UXR getting a seat at the table and how many people want seats at that table. Do you have thoughts on how to get UXR at that table?
Ruby Pryor [00:09:14]:
Yeah, I think it's really challenging, and it often comes back to organizational structures, like how many layers away are you from the teams that are working on strategy? The first step is usually to unpack, well, how does strategy actually get made inside of my organization? Who is involved in it? What are the cadences of forming strategy? What teams are forming strategy? In a large enough organization, generally you'll have a corporate strategy team; in smaller organizations, it might be the chief of staff and the CEO working together to form at least the first cut of the strategy before it gets more widely disseminated. Every organization is different. Every organization might have its own quirks where, for some reason, historically, X team has been very involved in strategy, and it's just kind of on you to unpick that and figure out who's who. But I think that's the first thing: you've got to figure out who's involved in strategy development and how they're actually doing it. And then I think if you can form relationships with those folks, explain your discipline and explain the ways that you can be involved, that can be really helpful. And that might look like taking more of a facilitation role rather than taking a really active research role. I think a lot of UX researchers are brilliant facilitators. And if you can say, hey, I know you have your quarterly strategy workshop, here's some ideas for how I would run that, or how I've run that in the past. What do you reckon? Do you think we could collaborate on this new format to think about strategy development? I think ways in like that can be really fruitful.
Ruby Pryor [00:10:45]:
Often folks are happy to have something taken off their plate as long as you set yourself up as a true collaborator. So I think there are some ways that we can start having that conversation.
Erin May [00:10:54]:
So our audience, we've got folks who are early in their careers and folks who are very senior, in large organizations with large research teams and in small ones, so everything in between. So depending on where you are in the research team, in the organization, it sounds like you might want to do kind of an audit on where you are in this pyramid. And if I'm hearing you right, they really all kind of build on each other. So maybe we shouldn't be thinking about strategy if we aren't generating any insights. Is that the right way to think about it, like making sure that we're moving up one step at a time?
Ruby Pryor [00:11:29]:
That's a really good question. I think sort of yes and no. I agree with you that insights is the currency. And I think there have been moments, to be really honest, in my UX career where I've thought I could leave behind the research and just do everything else. I could spend all of my time just talking to PMs and just talking to designers and just doing that implementation, change management, prioritization, all of those other things that I think a lot of UX researchers do. But the reality is, if we're not producing the insights, then who's going to? And the markets change all the time, our products change all the time. There are so many things that we still need to be doing. So I do think insights is evergreen and has to be.
Ruby Pryor [00:12:11]:
Otherwise, what's the unique contribution that a UX researcher is making? So I think that does have to be evergreen. And then I think so much of it depends on what's your team structure and how does it work. If you're embedded in a product team, then I think you do need to be doing optimization, prioritization kind of all at once together. But if you're either in an agency consulting back to clients, or if you have more of an agency set up within your organization where you kind of get deployed on project by project by project, I think then it would be pretty conceivable to have projects that sit just at the strategy level for quite some time, or just more at that prioritization level for quite some time. So I think it depends.
Erin May [00:12:51]:
Yeah, that makes sense. So we talked a little bit about the strategic layer and how to be a researcher, really, and see what's going on, who is setting strategy, how might I become part of that just based on the context of what's happening? What about when it comes to some of these other layers and how to be part of those or make sure those are happening? So, for example, we talked a little bit about informing the product roadmap or what products get built. What are some good ways to be part of that and make sure that's happening?
Ruby Pryor [00:13:20]:
Yeah, I do just want to make one more comment on being involved in strategy, and I think sometimes we can be reticent to go down this road because it's a tough answer. But think about what type of organizations are more likely to facilitate the type of work that you want to do. Almost the majority of my work in UXR has been at that strategy level because I came from organizations that did strategy. So if you're listening to this conversation and think, I really want to only work at that strategic level, then have a look at joining a consulting firm, because you're much more likely to do that kind of work. If you only really want to work at that optimization level, then you might want to go to one of these big organizations that's all about relentless A/B testing and iterating, iterating, iterating, and has these really short feedback cycles. So I think you have to understand the DNA of different organizations that exist out there and think, if I have a really strong sense of how I want to shape my career, that career is going to be much easier to execute in certain organizations rather than others. And that's tough, because I think a lot of people want to think, I could have any career that I want inside of any organization, but that can be an uphill battle sometimes.
Ruby Pryor [00:14:25]:
So I think that's another element to consider. If you really have a strong point of view about the type of work that you want to do, try and find an organization that's much more likely to match that.
Erin May [00:14:35]:
So then going back to, yeah, that's a great point. If you're earlier in your career and you don't know, right, which of these might appeal to you long term, is there a type of organization you might want to seek out that will give you exposure to a larger variety?
Ruby Pryor [00:14:50]:
Good question. My first thought was a big organization, because you'd do lots of types of work, and then my immediate contradictory thought was a startup.
Erin May [00:14:58]:
Right. You can kind of ask as you're interviewing and just get a sense of what kind of research they're doing or want to be doing, and that might give you some insight.
Ruby Pryor [00:15:08]:
Yeah, I think early in your career, having people to learn from in your discipline is really, really helpful. For your very first job in industry, I think it would be tough to be the sole researcher or the sole practitioner in your field. I think that mentorship is really important early on. I think there's a nice in-between period where it might feel exciting, liberating to be thrown in and have free rein to be the sole researcher, or to be the person that sets up all of these things. But I think early on, having that mentorship can be really important for teaching you different elements of the craft.
Carol Guest [00:15:42]:
Just pulling on this thread a little bit more, I wonder, if you're a researcher who's thinking about a new job and maybe assessing different companies for where researchers typically fit in that pyramid, are there questions you might ask or things you might look at as you're assessing an organization to understand where research typically lies?
Ruby Pryor [00:16:03]:
So I think a technique that's underutilized in job interviews is to ask behavioral-based questions back to your interviewer. So often when you have an interview, they ask you these very specific questions where they want an answer anchored around a scenario: tell me about a time when you had to work as a member of a team to achieve a common goal, something like this. And I think what's really interesting is if we can ask these types of specific questions back to an organization. Tell me about a time when you changed your strategy, your product roadmap, or optimized a product off the back of UX research. Share with me as much detail as you can. And I think if an organization can't do this, that can be a bit of an indication that it might not be somewhere that values research in the way that you want. And if they answer with a really strategic kind of answer, follow up and ask for something at the roadmap level or at the optimization level, just to give you a sense of how this organization feels around this area.
Erin May [00:17:04]:
So Ruby, these different layers of impact and influence have a different size and speed of impact, right? So when we talk about optimization, often we're talking about, on a relative basis, smaller impact, but you can stack them because you can do lots of them, and the feedback cycle is quite fast. As opposed to the big strategy level, where we're talking potentially years out; it might be years before you see that impact, but it might be huge, game-changing impact. Right? And so those are kind of the trade-offs there. When you think about the pros and cons of these and where you might fit and want to specialize, how do you think about habits along the way that are going to make you impactful as a researcher, whatever kind of research you're doing? I know you talk a lot about, for example, just building great relationships with stakeholders.
Erin May [00:17:55]:
Right? So are there things like that that can just ensure that you're going to be impactful and that that impact is going to be seen and understood by the organization?
Ruby Pryor [00:18:04]:
Yes, I think this is a fantastic question, and you're completely right. So stakeholder relationships are critical. I think another one is record keeping, which sounds maybe very unexciting. But if we're going into changes off the back of a piece of research, and we think we're going to move a certain set of metrics, we'd better write down what those are and what they are at baseline. So I think keeping good records of how things are before intervention and after intervention is really helpful. And specifically, at the end of a research project, writing out what are the changes that teams have committed to make off the back of your research. That's, I think, the first place we can clearly draw the line between us as researchers and impact: teams actually committing to do something different. And then there's the impact that actually happens once that product is live and in market.
Ruby Pryor [00:18:53]:
But that itself might take some time. Each product has its own development cycles. They could be long, they could be short, they could be days, they could be months. So it can take quite some time to even follow up with that. So I think the first step is to keep a record that the team committed to change X, Y, and Z off the back of the research, and then to really stay in touch with that product team, or whoever is responsible for tracking those metrics, be it data science, the PM, whoever, so that you get a sense of what actually happened once that launched in market, because those are the golden metrics. If you can actually say that our conversion rate increased by 28% or whatever it is, that's irrefutable magic. But I still think that step before, which is the team agreed to adopt every recommendation from my research report, or to change these three things, that's still a really important impact that you can track. So I would document it somewhere and then share it wherever you can.
Ruby Pryor [00:19:46]:
So you should be sharing it in your performance review. You should be sharing it when you meet new stakeholders to explain the value of UX research. You should be sharing it with other members of your team to show the impact that you've been making. We really shouldn't be shy about sharing the amazing impact we're having within our organization.
Carol Guest [00:20:03]:
Going back to these different levels, you've posted about how showcasing impact can look very different based on those levels. So we talked earlier about those little optimization projects that have a clear, measurable outcome right away, whereas maybe those strategic projects have a longer turnaround and it's harder to see the impact. How do you think about showing impact at those different levels?
Ruby Pryor [00:20:22]:
Yeah, I think this is a really important layer to add to this discussion: if we go down this road of creating metrics, particularly quantifiable metrics, off the back of our research, it will tend to push us towards doing more of this optimization work because it's easier to quantify. And the challenge then, for growth of the team or growth of the individual, is that more senior stakeholders can say, that's all well and good, but where's your impact at a strategic level? The flip side is if you go, oh, we've been doing all of this work at a strategic level, then you can hear feedback which sounds like, oh, this is all well and good, but I can't see the impact yet. You go, oh, okay, well, I'll go back down to the optimization level so I can show you this impact. And then you hear, well, that's not strategic enough. And it can be this really frustrating cycle that someone can get stuck in. And I think recording that kind of impact at the strategic level is more difficult. And I think it's more difficult for every discipline; this is not something that's unique to UX research. Because generally, a strategy is a vision and a plan: this is where we want to go, and these are the steps we're going to take to get there.
Ruby Pryor [00:21:33]:
So I don't think we can really come up with a set of measures for strategy in the abstract, because every strategy is going to be really different, and the way that it layers down into impact is going to be really different. So only once that strategy has really been set can we actually develop metrics off the back of it. And those metrics are informed by each of those initiatives. So many things have to go right before we can see the impact from strategy, many of which are going to be outside the control of your team as well. So then how do we show that impact? I think one thing that you can show is, again, the activities that you're involved in. So: facilitated a workshop with ten leaders at the executive level to set the vision for 2026. I think that already shows that you were in the right rooms, having the right discussions, even though you can't yet say "and then the firm entered a new market" or whatever it is, because those kinds of things take time. Another really good indicator that you can show is in any strategy documentation.
Ruby Pryor [00:22:32]:
Where are the fingerprints of UX research? Are there statistics that came out of surveys that we ran? Are there quotations that came out of our qualitative research, the synthesis of our qualitative insights? Those things, I think, can also be really good evidence. So you can say, I was a key contributor to the strategy document, see page seven. And you can be that explicit. Like, look, here is my impact right here. And I've done this in my career: when I've seen, for example, qualitative research referenced in strategy documents or roadmap documents, I've sent a screenshot to my boss and been like, look, here it is. And I think there's nothing wrong with doing that, and we should be really happy when we see our impact at that level. So I think even seeking out these kinds of internal, more activity-based indicators can be a good place to start, given that it does take quite some time for strategy to really be implemented and show those results.
Erin May [00:23:25]:
Awkward interruption! This episode of Awkward Silences, like every episode of Awkward Silences, is brought to you by User Interviews.
Carol Guest [00:23:33]:
We know that finding participants for research is hard. User Interviews is the fastest way to recruit targeted, high-quality participants for any kind of research. We're not a testing platform. Instead, we're fully focused on making sure you can get just-in-time insights for your product development, business strategy, marketing, and more.
Erin May [00:23:51]:
Go to userinterviews.com/awkward to get your first three participants free. I wonder too about the inverse, sort of: so we're doing this research work to inform our big strategy, which might be a couple of years out, but in the meantime, we're seeing positive things happen now from the strategy we set a year or two ago. Now, obviously you would have had to have been at this company for a while, but say research, as a function, has been there for a while, and we can say we hit our revenue targets, or this huge multi-year initiative, this new product line, has been so successful, and research was part of that. Right? So just looking at what's going really well in the business and how research was part of that might be a way to get some signal as well on the impact of research on strategy.
Ruby Pryor [00:24:38]:
Yeah, I love that. And I think keeping that institutional knowledge alive is really important. And if you were a team manager replacing an outgoing manager, I think in your handover these are the types of conversations you'd really want to be having. So: tell me, how was the team operating at a strategy level? And if you have access to the strategy documents, can you tell me how the team influenced this, so that you can keep that alive? Because sometimes, I think, when people leave, we lose a lot of that knowledge, and it might not be super clear the impact that a team had.
Erin May [00:25:06]:
Yeah, you mentioned that the optimization stuff is the easiest to measure, which is great, we can see the impact, but there are trade-offs there. Do you recommend, for folks, when you think about your first 30, 60, 90 days in a job, you often will get the advice to just find an early win, find a quick win. Do you advise that folks do a little bit of optimization work, provided that fits within their job description, just to kind of have those wins in their portfolio? Is that a good tactic in terms of showing your impact, to have some of those moments, because they are so measurable?
Ruby Pryor [00:25:42]:
Yeah, I think it's something that a lot of researchers should be doing a lot of the time. I think the work is incredibly impactful and can be really meaningful. And even if you're a real strategic operator, it can be great to just see that for yourself as well, and go, yes, I am having this really important impact; it is getting out there into the hands of end users. I think it's a great idea to always have your finger on the pulse of that kind of optimization work, if your bandwidth and role allow.
Erin May [00:26:09]:
Let's talk about quant. What are some things that can be measured, whether in dollars or time? What are some things we can think about in terms of the impact of research?
Ruby Pryor [00:26:19]:
Yeah, I'm really happy to talk about this. So I think for me, this quantification idea really started when I was doing a piece of research, and there was a certain moment in the flow that was really confusing to the folks who I was doing the test with. And you always want to talk to people about, okay, so you've come across this, what are you going to do next? And a lot of them said, I would just call customer support; the only way through this would be to get some outside intervention, so I'd call customer support. And it really got me thinking: that's not going to be cheap. If we had launched this product and it had got out there into market and all of these people had called customer support, how much would that actually have cost the organization? And so I started doing some back-of-the-envelope maths, thinking about, okay, what's the market size? What percentage of people would actually call? What's the cost of a call? And you come up pretty quickly with some numbers that are pretty compelling.
Ruby Pryor [00:27:18]:
And I think you can then say, but we did find the issue, right? We tested it with users, we found this issue, we worked with the designers and the PMs to change it, so when it's launched, that issue is no longer present. Therefore, we've actually saved the organization all of this money, because none of these calls actually happen. So that was kind of the genesis of me thinking about this in a real quantified kind of a way, and in a dollar-value impact kind of a way. Because I would say pretty much everything else I've ever read about quantifying the impact of research and UX stops at product metrics. By this, I mean it stops at: we increased conversion rate, we increased task success rate, we increased comprehension. And it stops there. It doesn't go to dollar-value impact. And I think this is a real missed opportunity.
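The back-of-the-envelope sum Ruby describes (market size, share of users who hit the issue, share of those who would call, cost per call) can be sketched in a few lines. Every figure below is a hypothetical placeholder, not a number from the episode:

```python
# A sketch of the avoided-cost estimate: if research catches a confusing step
# before launch, the support calls it would have generated never happen.
# All inputs here are made-up illustrative values.

def avoided_support_cost(monthly_users: float,
                         pct_hitting_issue: float,
                         pct_who_call: float,
                         cost_per_call: float) -> float:
    """Estimated monthly support cost avoided by fixing the issue pre-launch."""
    affected = monthly_users * pct_hitting_issue   # users who hit the step
    calls = affected * pct_who_call                # of those, who would call
    return calls * cost_per_call                   # dollars avoided per month

# e.g. 100,000 users a month, 20% hit the confusing step, a quarter of those
# would have called support, at $8 per call: roughly $40,000 a month avoided.
saving = avoided_support_cost(100_000, 0.20, 0.25, 8.0)
print(f"Estimated monthly saving: ${saving:,.0f}")
```

The point, as Ruby notes, is directional: even rough inputs produce a dollar figure that stakeholders can react to.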
Ruby Pryor [00:28:08]:
And I think it stems from the divide that I see in so many organizations. Tech organizations have the tech side and the business side, and never the twain shall meet. Often tech stakeholders stay inside of their tech bubble, business stakeholders stay inside their business bubble, and there are not that many people going back and forward between them. Because if you, for example, are working on the checkout flow and you increase conversion, like, you increase the number of people that are successfully purchasing an item, that is the clearest impact to the bottom line that a team really can have inside of an organization like that. And so, of course, you could work with the stakeholders who have access to purchase data to actually go, well, how many additional purchases or how much extra revenue came from that?
Ruby Pryor [00:28:52]:
But it can take time to get access to that data. If you just know the average order value for your organization, though, like, on average most people spend $50, or on average a ticket costs $300, you can just do the maths. You know how many extra people have made it through that conversion, and you can multiply that by average order value. And then you've got a pretty good directional indicator of how much more money you've made the organization. And I think that's something I really want to see more folks in UX doing: going all the way through this process, not just to a metric, but to a business metric, to a dollar value metric, because that's where people really take a step back and go, whoa, that's actual impact.
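The conversion math Ruby describes can be sketched in a few lines. The traffic volume, conversion rates, and $50 average order value below are hypothetical placeholders, not numbers from the episode:

```python
# Directional revenue estimate from a conversion-rate lift, using only
# the average order value rather than full purchase data.

def revenue_from_lift(monthly_visitors, old_cvr, new_cvr, avg_order_value):
    """Extra monthly revenue implied by a conversion-rate improvement."""
    extra_orders = monthly_visitors * (new_cvr - old_cvr)
    return extra_orders * avg_order_value

# Hypothetical: 50k checkout visitors a month, conversion up from 6% to 7%,
# average order value of $50.
monthly_gain = revenue_from_lift(50_000, 0.06, 0.07, 50)
print(f"${monthly_gain:,.0f} per month")  # → $25,000 per month
```

It is directional, as Ruby says: good enough to show the shape of the impact without waiting on access to purchase data.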
Carol Guest [00:29:33]:
I think I'm hearing two pieces to this. One piece is we want to make sure that we're taking the impact to the level of a dollar value, something that makes the overall value to the business very clear. And then another piece you're saying, and maybe you didn't say this, is that sometimes we think too much about only the revenue side and not enough about the cost side. Is that right? And are there impacts that people tend not to look at, or places where you could quantify a dollar value, that are often outside the periphery of how someone might measure impact?
Ruby Pryor [00:30:04]:
Yes, I think this is right. So, yeah, with the customer support stuff, that was avoiding a cost. It's money that we didn't have to spend rather than additional money that we made. And I think we need to be thinking about both halves of this, cost and revenue, at all times so that we have the best handle on these metrics. And the next layer that really excited me when I thought about it is prioritization: the idea that you're helping an organization say, we're going to invest and we're going to build this product because we're getting very strong indicators from the market that it's something people want. We've done our lean, continuous discovery process, and we've got very strong indications that people are willing to sign up and willing to spend on it, so we're actually going to invest in it. If you are helping an organization do that, say, we're going to build product A instead of building product B or C or D, that's also money you're saving the organization, because every time an organization builds a product, it's spending money. Building something is not cheap.
Ruby Pryor [00:31:09]:
It costs a lot of time in terms of employees' salaries. So anytime anything takes, say, a day of your time, you can actually do the math on yourself and work out: how much does a day of my time cost this organization? If you know that a feature your research says you probably shouldn't build would have taken, let's say, four weeks of time to build, with two engineers, one product manager, and one designer, and you can more or less work out what their annual salaries are, you can do the math to figure out, okay, I've essentially just saved the organization $50,000, because we didn't waste time building something that would not have succeeded in the marketplace. And that kind of thinking, before I wrote about it online, I don't think I'd seen anybody really talk about that level of thinking about the deployment of resources inside an organization, and how UX research helps optimize that.
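A minimal sketch of that "cost of building the wrong thing" calculation. The team composition, salaries, and 48 working weeks per year are all invented assumptions for illustration:

```python
# Estimate what a feature build would have cost in salary time, i.e. the
# money saved when research shows it shouldn't be built.

WEEKS_PER_YEAR = 48  # rough assumption for working weeks in a year

def build_cost(team_annual_salaries, weeks):
    """Salary cost of a team working on a feature for `weeks` weeks."""
    weekly_burn = sum(team_annual_salaries) / WEEKS_PER_YEAR
    return weekly_burn * weeks

# Hypothetical team: two engineers ($150k each), one PM ($140k),
# one designer ($120k), four weeks of work.
saved = build_cost([150_000, 150_000, 140_000, 120_000], 4)
print(f"${saved:,.0f}")  # → $46,667
```

Fully loaded salary costs (benefits, overhead) would push the real figure higher, so a sketch like this tends to understate the saving.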
Erin May [00:32:06]:
Thinking about some of the examples you're talking about, you can imagine that for a specific, sort of evaluative study, right, we were going this way, we found a problem, we went this way instead, and we saved money, because of, in the example you listed, the call center costs and things like that. But if research is done the way you're supposed to do it, early and often, right, and you're doing great discovery research, you might not get that far.
Erin May [00:32:34]:
Right. Meaning we just came up with the right idea in the first place. How do you measure the hypothetical thing that you didn't even think about building, because you're so good at doing research? Do you know what I'm saying? I think this is the tricky bit with the counterpositive, or the counterfactual, the thing that never happens.
Ruby Pryor [00:32:51]:
Yeah, I think this is a fantastic question, and if only that were always the case, right? So I think there are a couple of ways we can think about it. The first is that for most organizations, that would be a novel approach. Rarely is that actually the way a lot of organizations function. So you can look at precedent, at the bad bets that have been made in the past and how much they cost. And you can say, well, due to our generative research, we found the right thing from the beginning. So even if we only avoided one of those historic bad bets, this might be how much money we have saved by going down this road.
Ruby Pryor [00:33:26]:
So it's a similar technique to what I talked about before; it's just not as clear what the other options were, because it wasn't like you were doing research to trade off between A, B, and C. It was more like, we just found A. But inherent in finding A and committing to A is that you have rejected other options that might otherwise have been there. And so I think you can look at the precedent of how much things have cost in the past to try and put a dollar value on that sort of effort.
Erin May [00:33:54]:
Yeah, that makes a lot of sense.
Carol Guest [00:33:56]:
Maybe to pull a little bit more on this idea. I think one of the common challenges for UX researchers is that once we have an insight, it's sort of, oh, well, we always knew that. Especially a great insight, it becomes part of the fabric of how the organization thinks, and it's just what people know when they start there, ideally. And I wonder, how do you make sure the insight is seen as an insight produced by research, in a humble and socially appropriate way? Like, I did that, I found that. A few things come to mind: sometimes I find that a specific framework, a way to visualize the problem, or a series of customer clips that very clearly articulate the problem are ways to distill the insight and also make it somewhat easy to credit to the research team. I wonder if you have any ideas about that.
Ruby Pryor [00:34:48]:
I think those are all really great ideas. I think another one is, if this is the case and you can actually prove the provenance of this insight and where it came from, and you get the opportunity to meet folks in your organization at onboarding, show them. Show them, this is the kind of impact that my team has. We went out in field three years ago, we met these amazing folks, and we learned X to be true about the way people book hotels, and now this is how the organization functions because of that insight. And even if there were intervening steps in there, you could say, and then we built this product, and then it changed strategy like this, and here we are now, where everyone just accepts it as true. So I think any way that you can show that journey and get it into people's heads early on, that you don't just think that because you logicked your way to it, you think that because my team helped you think that.
Ruby Pryor [00:35:36]:
I think that can be really powerful.
Carol Guest [00:35:38]:
So it's a bit of storytelling here, where sort of the heart of the story is we learned something different, and now we think about the world in a different way.
Erin May [00:35:46]:
So you can imagine working on different kinds of research and thinking about the quantifiable impact of it in terms of, how do we get to revenue? How do we get to dollars saved or dollars earned by working on this thing? What about stacking that up over the course of, say, a year, or a researcher's body of work, or a team's body of work, and translating that into budget for more research?
Carol Guest [00:36:11]:
Right.
Erin May [00:36:11]:
How do you kind of add all those things up?
Ruby Pryor [00:36:14]:
Yeah, I think this comes back to keeping really good records of what actually happened off the back of your work, because even some of these dollar value metrics might take some time before we can link them back to the insights, because they might require something to actually change in market. So I think it's a case of staying really close to your stakeholders, understanding what decisions have actually shifted, and then keeping really good records. So having a spreadsheet or an Airtable or a Notion or whatever it is where you list out: here are the projects my team has done, here's an impact statement, and part of that might be qualitative, and here are some of the metrics that have shifted off the back of that, so that every quarter you can sum it up and say, this is the increase in revenue the team has contributed to, and this is the reduction in cost the team has contributed to. And I think it's fine to be quite collaborative in how we talk about those things, because for anything to actually happen, it often isn't research alone. It requires design, it requires product, it requires lots of collaboration from other folks.
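The impact log Ruby describes could be as simple as a list of records summed per quarter. The project names, fields, and dollar figures below are invented examples, not a prescribed schema:

```python
# A minimal impact log: one record per project, with revenue contributed
# and cost avoided, summed per quarter for reporting.

impact_log = [
    {"project": "Checkout usability test", "quarter": "Q1",
     "revenue_added": 25_000, "cost_avoided": 0},
    {"project": "Onboarding discovery", "quarter": "Q1",
     "revenue_added": 0, "cost_avoided": 46_667},
]

def quarterly_totals(log, quarter):
    """Sum revenue contributed and cost avoided for one quarter."""
    rows = [r for r in log if r["quarter"] == quarter]
    return (sum(r["revenue_added"] for r in rows),
            sum(r["cost_avoided"] for r in rows))

revenue, cost = quarterly_totals(impact_log, "Q1")
print(f"Q1: ${revenue:,} revenue contributed, ${cost:,} cost avoided")
```

In practice this lives in a spreadsheet, Airtable, or Notion, as Ruby suggests; the value is in the habit of recording an impact statement per project, not the tooling.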
Ruby Pryor [00:37:14]:
So I think you can be quite generous in how you describe the kind of impact you're making. And one step further than that, I would say bring the other teams into these discussions from day one. At the scoping stage, you want to be saying, hey, if we do this right, we could be saving the organization money in terms of X, Y, and Z, we could be generating additional revenue in terms of X, Y, and Z. What metrics do we need to record right now so that we can prove the impact of this project we're all about to go on together? And at what intervals do we need to come back and collect that data? You might get some awesome ideas from folks in that room for other metrics you hadn't even thought of yet. And I think the more we can bring this discussion to all of the other stakeholders on the tech side of the business, and get this kind of business acumen, this business thinking, into those rooms, the richer the conversations will be. And then they were part of it, so there's no proving it back to them, because they were there from the very beginning. The more we can get into this sense of being part of a team, rather than always lobbying someone else, the richer our working relationships are going to be.
Erin May [00:38:17]:
Where have you seen the most convincing arguments for impact in different kinds of organizations? Because I know one thing that comes up when we talk about research impact is that every organization is different. There are all these different kinds of research, and there are different kinds of stakeholders who are motivated by different arguments. But is there something you see consistently that is quite compelling in terms of, this is the impact of research, and how to talk about it and sell it internally?
Ruby Pryor [00:38:45]:
I think that's a really great question, and I feel like it's going to come back to the same thing: it depends.
Erin May [00:38:50]:
It depends.
Carol Guest [00:38:51]:
Yeah.
Ruby Pryor [00:38:51]:
Even when we're talking about these dollar value metrics, some stakeholders and some business contexts are going to be much more excited by growth metrics, revenue metrics, and cash flow metrics, while others are going to be more excited by cost reduction and profitability-focused metrics. Particularly when we're talking at this dollar level, this money level, a lot of it will depend on the macroeconomic conditions you're operating in and what the drivers around your business are. Businesses that are very growth focused, I'm thinking early-stage startups, are going to be a lot less interested in reducing costs, because that's not normally what those businesses care about so much, whereas a business that is more mature might be more interested in a little more balance around profitability. So I think there are some broader factors as well that will influence what type of conversations are going to be more impactful.
Erin May [00:39:43]:
You talked a little bit about the checkout experience example, which is near and dear to my heart; I worked on growth teams, and we did a lot of monetization work. Is it hard for researchers to find opportunities to get involved in work that's so close to the dollar, right? That's an ecommerce example, potentially, but to find those kinds of opportunities, to really get close to moving the needle on some of those kinds of metrics?
Ruby Pryor [00:40:09]:
Yeah. Again, I think it would depend a lot on the organizational dynamics, but I would like to think that organizations that are really focused on increasing their revenue are going to put the best resources they have on the flows that are the most critical. And so if an ecommerce organization isn't investing lots of research effort into optimizing that checkout flow, I think that's a very interesting strategic choice that they're making.
Erin May [00:40:31]:
Well, we could dig in a lot more to a lot of topics on impact. What do you want to leave our listeners with in terms of how to get started, how to really move the needle on thinking about the impact of their research?
Ruby Pryor [00:40:47]:
Yeah, I think my overarching thought for folks would be, number one, take credit for your work. I think a lot of researchers, because we're so cognizant of all the moving parts that have to coalesce for impact to be made, end up underplaying our own involvement. So take credit for your work. When that checkout flow is now amazing and converting 10% more people, stand up and say, research was part of that, we helped that effort. So that's number one: take credit for your work. And number two is connect your work to those business metrics. Don't stop at product metrics.
Ruby Pryor [00:41:22]:
Think about what those business metrics might look like. And if you feel a little out of your depth having that conversation, find some business stakeholders and chat with them about how you can do that work together. I think a lot of folks inside an organization are really keen to understand the impact of tech teams altogether. And if you can be the bridge between the tech side of an organization and the business side in facilitating those kinds of conversations, I think that's a really powerful and interesting position to put yourself in.
Erin May [00:41:50]:
Well, we want to do a rapid fire, but we also know that you work in Singapore. So do you have any quick thoughts about what research is like internationally, in Singapore, that might be interesting to folks, as part of our rapid fire section here at the end?
Ruby Pryor [00:42:07]:
Yeah, sure. I feel so lucky to have the opportunity to work in Singapore and in Southeast Asia, and I've done UX research in most of the major economies in Southeast Asia, which has been really eye-opening. I think there are some really important frameworks to consider when doing research anywhere in the world that I've found really helpful to apply to my practice in Southeast Asia. So I'd highly recommend The Culture Map by Erin Meyer. It's a fantastic book, and in it she talks about a couple of different spectrums that cultures can exist along. One that really resonated for me is high context to low context. Low-context cultures are very direct. They say it how it is, and you don't have to read between the lines to understand meaning. The US is one of the lowest-context cultures in the world.
Ruby Pryor [00:42:52]:
Australia, where I'm from, is also very low context. High-context cultures, on the other hand, are much more likely to be indirect. You're much more likely to have to read between the lines. They might use metaphors more often, things like this. And a lot of cultures in Southeast Asia are much more high context. What this has meant for my practice is that you have to really listen to what people are telling you. When I've worked in Australia, I feel like if you put something in front of someone and they didn't like it, they'd say, I don't like this, or, I don't understand this. Whereas in Southeast Asia, it was much more common for people to say things like, hmm, this is interesting, or, people might not understand this.
Ruby Pryor [00:43:30]:
Or, people might struggle with this. So you have to understand what someone is actually saying to you in that moment without expecting a really crisp, golden quote that says the thing out loud, word for word. There might be other ways that folks approach the same idea and give the same level of feedback that just sound a little bit different to my low-context brain. So I think educating yourself about how different cultures communicate has been really important for my practice, and it's something I'd really encourage everyone to look at.
Erin May [00:44:08]:
Awesome. That's a great taste of what it's like to be in Singapore. We'll have to talk about it more in the future sometime. All right, quick, fun facts about you. Let's do rapid fire. Tell us something interesting. Yeah.
Ruby Pryor [00:44:19]:
I'm a big traveler, so living in Singapore means great access to Southeast Asia. I visited ten countries last year, most of them in the region. In particular, I love visiting Thailand. I went seven times last year, and I've started learning basic Thai. It's a tough language, so very, very basic.
Carol Guest [00:44:37]:
I also saw that you do a lot of sewing. Any favorite recent things you've produced?
Ruby Pryor [00:44:42]:
Yeah, I love making gowns. I like making kind of impractical dresses out of impractical fabrics like velvets and sequins. I think they're the most fun to work with.
Erin May [00:44:52]:
What's your favorite user research interview question?
Ruby Pryor [00:44:55]:
Probably not a question, but I love saying to people, tell me more.
Erin May [00:45:01]:
Where can folks find you on the interwebs? On LinkedIn?
Ruby Pryor [00:45:05]:
I spend a lot of time on LinkedIn, so you can find me there; you can just look up Ruby Pryor. You can also find me on my website, rubypryor.com. Or you can find Rex at rex.ink.
Erin May [00:45:14]:
You've been doing a lot of speaking and events and things like that, right? So folks can probably find you at an upcoming event. Do you have anything planned?
Ruby Pryor [00:45:20]:
I do. I have two events coming up soon. I'm delivering a workshop in Bangkok for Strat, which is a conference all about UX and strategy, and I'll be speaking at UX Copenhagen in March as well.
Erin May [00:45:35]:
Wonderful.
Ruby Pryor [00:45:35]:
Can I just give a little shout-out to some other people on LinkedIn?
Erin May [00:45:38]:
Yes.
Ruby Pryor [00:45:39]:
So I get a lot of my learning, I would say, from consuming bite-sized content on LinkedIn, and there are a couple of great thinkers out there: Emily Anderson, Julian Della Mattia, and Caitlin Sullivan, always writing awesome stuff about strategic design and UX research.
Erin May [00:45:53]:
Awesome. We'll link those in the show notes for everybody, too. And I know everyone loves to crap on LinkedIn, but there are some good thinkers out there saying some thoughtful stuff. I see it all the time.
Ruby Pryor [00:46:03]:
Yeah, I've found it a truly amazing community, and since starting to post more regularly, I've met some really fantastic thinkers.
Erin May [00:46:12]:
Well, thank you so much. I've learned a lot. I'm sure everyone listening has as well. Have a great day all. Have a great night and thanks for being here. This is great.
Ruby Pryor [00:46:21]:
Thank you so much. I had an awesome time. Thank you. Thank you both.
Carol Guest [00:46:25]:
Thanks so much, Ruby. Great conversation, and so, so relevant right now. Lots of great tips for folks to take home.
Erin May [00:46:30]:
So love it.
Ruby Pryor [00:46:31]:
Awesome. Thank you. Speaking again soon I hope.
Erin May [00:46:40]:
Thanks for listening to Awkward Silences, brought to you by User Interviews. Theme music by Fragile Gang. Hi there, Awkward Silences listener. Thanks for listening. If you like what you heard, we always appreciate a rating or review on your podcast app of choice.
Carol Guest [00:47:05]:
We'd also love to hear from you with feedback, guest topics or ideas so that we can improve your podcast listening experience. We're running a quick survey so you can share your thoughts on what you like about the show, which episodes you like best, which subjects you'd like to hear more about, which stuff you're sick of, and more just about you. The fans that have kept us on the air for the past five years.
Erin May [00:47:24]:
We know surveys usually suck. See episode 21 with Erika Hall for more on that. But this one's quick and useful, we promise. Thanks for helping us make this the best podcast it can be. You can find the survey link in the episode description of any episode, or head on over to userinterviews.com/awkwardsurvey.