#12 - Essential Times for Qualitative Research With Shipra Kayan
Erin: Hi everybody. Welcome back to Awkward Silences. Today we are here with Shipra Kayan, and she is the head of design research at Upwork. She's been there for about 10 years, and so she has a wealth of experience with managing all things research, design, and user insights, and with growing, iterating, and evolving a team and company. So we're so lucky to have you here today. Thanks for joining us.
Shipra: Yeah. It's good to be here.
Erin: Great. So let's jump right in. So as we discussed, you have been in this role for a while, and you've seen a lot of iterations of what design research looks like over the years. In that time, how have you thought about design research when you could be running any kind of study to generate the insights that are gonna be the most useful for what you're trying to accomplish from a product roadmap perspective?
Shipra: Yeah. I mean, so my background isn't necessarily purely design research. I did UX design. I was a product manager for a few years, and so I do approach design research with a very, very business strategy background. And making sure we're doing the stuff that moves us forward, and this can sometimes be a little tactical. Especially as you're just starting out growing a research function in a company, you can tend to focus on more usability or validation studies, but over the years you really need to start thinking about what's gonna be happening like two years, three years from now. So I'm really lucky, having been at this company so long, that we have a research team where we spend a good chunk of our time on Horizon 3 projects that are gonna impact the company two, or three, or five years down the line. So we're hoping to spend like 20, 30 percent of our time doing those sorts of projects. The rest of our projects tend to be discovery and strategic projects based on the UX roadmap.
JH: So that's all one team who's doing the more short-term stuff and the longer-term Horizon stuff as well?
Shipra: Yeah. Yeah. It's all one team. It's a team we've built over the last three years. So it's not a very established or old team. We're still figuring out what the company needs and how the team can work with the rest of the company, but it is the same team. And the model we've chosen is to have a couple of people or one person really focusing on those Horizon 3 topics versus everyone checking on their work that way, but yeah.
JH: Cool. What does research that far out actually look like? Is it like your favorite discovery research or is it something totally different?
Shipra: Yeah. It's totally like discovery. Something our company might want to think about is how virtual reality might impact work, right? But we're not doing that. That's not the type of Horizon 3 research I'm talking about. Upwork is a platform that connects freelancers with jobs, and we're still thinking about what freelancers need to feel supported. And even if we're not gonna build it this year, how can we impact our business strategy moving forward, and do we need to go beyond the services we offer freelancers today? So that might be something that we think of as a Horizon 3 project.
Erin: What do you do with that research? I'm imagining you have your Horizon 3 research team that's focused on that kinda stuff and you've got these amazing insights that don't fit in any box of your current products or any roadmap you currently have. Where do you stick that stuff?
Shipra: Oh my gosh. Like I said, we're just starting out. I think we have a lot more experience doing discovery for something that we're gonna try and build this year, in the next couple of quarters, maybe later this year, right? We have a lot more experience taking that research into roadmaps. So the broader thinking, the far-out research, it ends up being sort of conversations that need a lot of time to settle, and trickle down, and get reframed, and repeated. And so I don't imagine that it's gonna really be like, “Oh, we did this two-month study and here is a three-year strategy.” It's not that easy. It's just more sort of conversation starters. Whereas I think the more near-term discovery projects have much more of a process of taking the insights and thinking about what we do on the roadmap.
JH: I just have to ask the product question of who determines what's the short-term research and what's on the Horizon, right? How do you draw the buckets of this idea is something we'll get to maybe later, this is something we need to get to soon? That whole process in and of itself seems really challenging.
Shipra: Totally. So how do we set our research roadmaps?
JH: Yeah. How do you know what's a Horizon idea versus something more near term?
Shipra: Oh. Yeah. So basically, my job, which used to be actually building stuff when we were smaller, has become really sitting in a lot of meetings. It sounds boring, but ... So that is my job. So all of our usability or more tactical validation studies are built off of whatever's on the roadmap already. So I would say the PMs and the designers get to really dictate that, and I don't try to mess with that. I just make sure we're prioritizing those studies at the right level. I think the discovery studies, even the ones that might impact the roadmap now or this year, tend to be either something that a PM or a group of PMs is already thinking about, or it could be a question that no one's asking, but a set of assumptions that I see flying around in multiple teams. And then I'm just like, "Okay. Wait." All of these teams have different assumptions about why a certain segment of customer doesn't sign up.
Shipra: Then that's kind of my job is to sit in all different team readouts and be like, "Okay. Wait a second. What if we just kind of pulled all of these assumptions together into one study? And how could we change then the roadmap of these particular teams?" So that's kind of my job. In terms of Horizon 3, that's more meeting with leadership one-on-one and just sussing out what is it that we could do that they're open to listening to, right? Because they're all like, "Now we're a public company." They're thinking about our earnings. They're thinking about different things, and it's hard to describe. I think it's just me sussing out what is it that they will be open to listening to and what is it that maybe three years down the line we'll be ready to tackle.
Erin: Yeah. I love that. If I told this to someone, who would listen? Which is like the precondition for action. Even if that action's not gonna happen for years, will someone listen to me? I think that's super valuable. A lot of people we're talking to are looking for how do I get more impact out of the research I'm doing, and that feels like a useful starting point in a lot of cases.
Shipra: Yeah. I mean, as long as people are listening, research is important. When people don't listen, that means you're just off researching the wrong thing.
Erin: So okay. So we've been talking a lot about Horizon 3 and really strategic research, but let's get back to the 60, 70 percent of what you're spending your time doing, which is the stuff either right in front of you or within the year ahead of you. How do you take all of this insight in and then turn it into something cohesive that looks like a roadmap? How do you get from point A to point B?
Shipra: Yeah. Just get people in a room and hash it out.
Erin: In 30 seconds. In 30 seconds.
JH: Yeah. I actually need help with this. So if you can stay on afterwards and ...
Shipra: I mean, it's a work in progress, right? But for me, the one thing that I truly believe in is workshops, or co-synthesis, or co-working sessions with multiple people, because there's no way that a researcher can synthesize a study to impact a roadmap the same way that a PM, a designer, an engineer, and a researcher could synthesize the same study. So I'm really, really big on workshops and kind of want to build the team that can facilitate sense-making together. And so that's really the philosophy with which we approach this process, and we're not perfect. Nobody is, but maybe I can tell you about a few things that we do on our journey. Actually, it starts before we do any research. And every time I've skipped this step ... I personally, in a recent project, didn't gather hypotheses from stakeholders, and I wasn't even able to write a report.
Shipra: So I think it really starts before you conduct any research. From Intuit we borrowed FOG, which is the facts, opinions, and guesses framework, but it's also called assumption-gathering or hypothesis-gathering workshops. It's getting everyone in a room and saying, "Okay. Here's the topic we wanna learn about. What do we know factually based on our data or past research? What are some things where we have strong opinions that are rooted, that we can agree on as a group are probably true? And what are some guesses that we hope are true, or that are the assumptions on which we're gonna base our strategy or our roadmap?" And that's where, for me as a researcher, it kind of opens up my eyes to what it is as an organization that we're not sure about. How is it that we might be wrong? And what are the foundational questions that we need to answer?
Shipra: So that's kind of where we start. I know it sounds very meta, but that's where we start everything we create. All our documents are ordered, so prioritized. Anytime we have key questions or assumptions, we prioritize them, because you can't do everything. It really helps, once you do have data, to do a co-synthesis session where everyone can look at the data together. I can talk about how we do that if that's interesting, or I don't know if you guys have any reactions.
Erin: Yeah. Let's talk about it. If we could take a step back, right? Because I always come back to hierarchies and architectures because I don't know that my brain works otherwise. So on the one hand, we've got this workshop, which is a way of making sense out of things. And then we've got a study, an inquiry, FOG. We've got these things we wanna probe into. And then somewhere over here we have a roadmap we're trying to get to. How do all those pieces fit together? Where are we starting? Are we starting with the goal of building a roadmap? We think of it in terms of an annual roadmap. I don't know. It's obviously a never-ending thing, and there are different roadmaps for different products. So what are you trying to get, and how do these pieces fit together?
Shipra: Yeah. That's such a good question. I'd love to hear what JH thinks of this because there's always different perspectives, but there's three places we've started. The one that I love to start at is ... I think in most tech companies, the product teams tend to be the core of roadmaps. They're the heart. So one place we might start is a product manager knowing that we need to solve a certain problem. So we start with a problem. I'm trying to think of what examples I can give you that don't give away our company strategy, but I'm just gonna make stuff up.
Erin: Let me ask you a question. Do you usually start with a business problem or a user problem, or are they one and the same?
Shipra: I think they are very related. I think it's usually a business opportunity. Maybe not a problem, but let's say one opportunity is, hey, if we wanna serve larger businesses, we need to fit into the budget control system, right? You might call it a business problem or a user problem. It is kind of a problem that both the customer has and we have. If we don't solve it, we're not gonna succeed, but we don't know what to do. We know we need to fit in there, but we don't know what to do there. And that's a really nice place to bring in a researcher and really work through what the things are that we need to solve for here, knowing that we need to make this corporate controller happy at every customer company.
Shipra: I think the other place we start at is a feature, right? So one example that I think I can tell you guys about is ... at one point we said, "Oh my gosh. So many interviews get scheduled over messages with people going back-and-forth. We should have an interview scheduling feature within messages. It's super cool. You just say, 'Hey, I'm free on Saturday,' and it becomes a little Doodle poll." So we're like, "Oh my God. This feature is amazing. Let's build it." So we started from a feature, but I think there were some questions from leadership on whether this was really a priority. And so we ended up taking that question, is this really a priority, and being like, "Okay. What's the user's goal within this whole process? How important is scheduling interviews? Is that really something that's delaying their task completion by two days, or doesn't everyone have Calendly?" I don't know. We were like, "Okay. Let's really figure this out."
Shipra: So those are, I think, two extreme places we usually start from. Where else? I think obviously the validation stuff is pretty simple. That's usually when some sort of discovery has been done, there's a feature on the roadmap, and we need to do a study. That stuff is pretty straightforward, and we don't need to challenge every study that comes our way.
Erin: Got it. So in the case of the calendar integration and booking. That might be something that wasn't on a strategic roadmap that came up opportunistically and was validated through a survey, then it gets put on the roadmap. Whereas, I'm making assumptions so tell me if I'm wrong, whereas the issue of we need to have better options for the controllers of larger businesses. Now is this strategic thing that maybe was already on the roadmap, and you kinda wanna get in there and figure out what that looks like? Is that kinda how that-?
Shipra: Yeah. That's kinda how that goes. And with the calendar thing, basically we went back. We didn't do a survey. Actually, we kinda reframed the question to: what are the things that cause unnecessary delays in hiring somebody? We rephrased it because that was the assumption of the problem they were solving, and then we just went out and did a very qualitative study. And that item kinda went off the roadmap, and other things went on as a result of us pursuing another inquiry.
JH: Yeah. That's awesome to hear because I feel like a big piece of this is ... I liked a couple things you said. One is how making use of the insights at the end kinda starts in the beginning, by getting all the people together and actually agreeing on what you're trying to learn, and then dealing with all the artifacts that come out of it is easier because there's firm alignment. But then the other piece is that the reason for doing this is because there is uncertainty, and you're trying to reduce it and learn more about a given user problem, or business strategy, or whatever it may be. If throughout that whole process there's acknowledgment of the uncertainty, then that group is comfortable with the idea of abandoning it. The outcome can be: there's nothing here. Let's put something else on the roadmap.
JH: Whereas, if it's not framed that way, then there's this inertia to be like, "Well, we gotta do something with the scheduling because we said scheduling, and we've been doing this research and we have this pile of insights. So let's figure something out." And that's, in my mind, just not the purpose, right? The purpose is to find that fork in the road, and so I think the way you're describing that process seems to allow for that. Is that kinda what you guys are trying to do, basically?
Shipra: Yeah. I mean, that's a really great way to frame it, I think. Yeah, the acknowledgment of we are open to learning, versus we just need to check this box, is super important. And we're not a big enough team that we can afford for research to be a checkbox. So we wanna go into places where we're uncertain, and we don't always make the right decisions. I'm sure there's stuff where we were more confident than we should have been about an idea because it seemed really straightforward, and I've done this. Having been a researcher myself, I've definitely launched features without researching that have failed, and then you research afterwards to figure out why did that fail? How were we wrong, and what can we learn? So I think it's an acknowledgment that we have something to learn, which, yeah, you're totally right.
JH: Cool. I feel like there's a way of describing this where it's a PM, a designer, a researcher, and a developer all walk into a room together and totally figure it out. They all work together. It's great.
Erin: And then the marketer comes in and says, "Guys, why didn't I know about this?"
Shipra: Yeah. Get the marketer in the room.
Erin: So, in that process, you talked about the workshops. I wanna hear more about that. Throughout the process of problem defining and assumption naming, how do you get folks involved in workshops and other methods of co-figuring stuff out?
Shipra: A lot of the workshop process is really about democratizing a little bit. So it's not the PM, or the VP, or whoever having an opinion. It's really about us writing things down, creating a framework, rationalizing things. So at the end of the day, we don't have one person with the one takeaway who listened to one user. And so that's one way that we did it, where everyone was there, and we just did an affinity map and were like, "Here's the two or three things I'm thinking. Does that seem meaningful?" And we did a two-by-two with impact and effort, like a classic MBA exercise. That's one example. I think for a different type of study, a longer-term study, where let's say 20 people are peripherally involved, we might do something like having people read a couple of transcripts each and come in and do a round robin, where the workshop or prioritization meeting starts with a couple of hours of empathy, and each person introduces the user that they read about: what they took away, what surprised them, what their ah-ha moment was.
Shipra: And so we go around and introduce the users. So we always start with this raw data and empathy, and then build towards a framework for how we're going to prioritize the stories, or the problems, or the opportunities that we found in the insights.
Erin: It sounds like you're talking, at least in the first example, about a sprint of sorts. How do you determine which people are going to get involved in which kind of way, be it affinity mapping or sharing their stories of empathy? Is it based on time available? Who signed on to be part of this sprint? Who gets involved and how, using what methods? How do you decide that?
Shipra: I think for a smaller study, it's the product team: a PM, a designer, an engineer, a researcher, a product marketing manager. So for something more contained, it's the core product team. For something broader, it might be folks from legal, and operations, and people who generally may not even be involved in interviews, but they are involved in developing the policy, or the services, or the products that we want the research to affect. So for a broader study, it's very much figuring out who makes these decisions and who we want to involve.
Erin: Shout out to legal out there. I think that was our first ...
Shipra: Literally, the most innovative legal team I've ever worked with. I work with them so much, and they're amazing. They understand design thinking. They get it.
Erin: That kinda warms my heart.
JH: That's cool. It also improves the lame joke I was trying to set up about a PM and somebody else walking into a bar or whatever. If you throw a lawyer in there, now you're really cooking.
Erin: No one knows where it's going.
JH: Yeah. Exactly. Now you've opened up a whole new world of possibilities. This all makes a ton of sense to me in terms of getting people involved in this process, this workshop, being in the room together, doing that affinity mapping and these other exercises. Do you think about how to take any of the key distillations from that process and make them accessible to the broader organization, or is it, we've gotten a good-sized cross-functional team involved in this topic, and getting it out further is not as important? Because I always wanna democratize and share it all, but that part's really hard, and if you had all the right representatives in the room and they all know it, maybe that's enough. How do you think about that?
Shipra: Oh my gosh. I know how hard it is to document because no one wants to do it. Unless it's a very tactical validation usability study, everything needs to be documented. I think it's super important. Having been here 10 years and now on my way out, obviously I think it's even more important, but maybe it depends on the culture of the company. Obviously, I haven't been anywhere else in a long time, so I don't know, but here we tend to be readers, and I think really thinking about what are the 6, 7, 8, 10 studies that are gonna be referenced a lot in the future. I think writing everything down is super important. So yeah, we do write and we don't do a lot of presentations. We do a lot more of writing out a deck, or a memo, or whatever it is, and people read it. Even during some of our actual review meetings, we're all reading, and commenting, and then discussing the important things that came out of it. So I think we're a reading culture. So I definitely think documentation is important.
Shipra: But I mean, here's the other thing that I see a lot that is killing me right now. I feel like in the room there's so much rich conversation that when you read about it, it seems to be distilled into something really trite. For instance, I'm kinda making this up, but take a story like: Mary knows nothing about plumbing mechanics, or costs, or common issues with plumbing, so she's worried about being ripped off when she hires a plumber, right? That's a really rich story. And then what lands on a deck might be: people don't trust plumbers on our platform, so we need to build trust, right? And that's not quite as inspiring to a designer or someone building a solution. But I think what the people in the room are trying to do is they're like, "Well, we want to solve for Mary, but we also want to solve for, I don't know, Ann, who is hesitant to invite a mechanic into her house when she's alone, and we're also trying to solve for Brad, who's blah, blah, blah."
Shipra: So I think this is the translation from rich data to something really trite that then is as meaningless as where you were before the research. I think that's a problem in documentation that I wanna get over, where we're really documenting some of these more inspiring stories. Out of the hundred stories, what are the two or three stories that we should be solving? Or one. What is the one story that we should be solving, versus us having a lossy version that tries to group all hundred stories into one problem statement that doesn't mean anything? So I don't know. For me, that's the battle I'm fighting in terms of, yeah-
JH: It feels a little like the undercurrent or the thread through all of this is that editing is really hard, right? We think of editing as writing a story and somebody edits it for me or whatever, but it's also the difference between good and great designers. The great designers know how to remove more and more of the non-essential stuff, keep editing it out, and leave you with this really plain, great, usable thing. Taking all these insights and all these stories from users and editing them down to a plan and a roadmap is really hard, and I don't know. I don't think we talk about it that way so much, and there's something about the story element of the way you just mentioned it that kinda triggered it for me.
JH: Because often a lot of the focus is on saying no or tying stuff to strategy, and what you're really trying to do is take this collage of all these data points and all these cool things, pull out the right ones, the essential ones, and thread that narrative. And I don't have a great point here other than it's really hard, and I think that's an under-discussed part of all this.
Shipra: It's an art at the end of the day. As much as we'd like to think we're science-based, I mean, the research process is fairly rigorous, but yeah, roadmapping is an art.
Erin: Art and science, dichotomies are tough. Art and science go together more than they don't, I think. Right? I was thinking that your emphasis on workshopping and bringing people together feels like a pretty good hedge against this potential trap of the solo artist genius sitting alone in a room and making sense of this morass of qualitative data, right? Bring some other people in and see if you all get to the same conclusion together. Obviously a trained professional researcher is hopefully not going to let unconscious bias, and confirmation bias, and all of that draw them to the wrong conclusion. But it does seem like a pretty good way to not only get buy-in, but to hedge against one person. Maybe you're just tired.
Shipra: Researcher bias.
Erin: Yeah. Bring in some other people and see what you end up with. I think to me that feels maybe a little bit hard to do all the time at scale for every interview, but it seems like a pretty great ingredient in your toolkit to get some better conclusions.