#8 - Continuously Delivering and Discovering with James Aylward of Pluralsight


Erin: This is Erin May.


JH: I'm John-Henry Forster and this is Awkward ... Silences.


Erin: ... Silences.


Erin: Hello, everybody. And welcome back to Awkward Silences. We are here today with James Aylward, the SVP and head of data products at Pluralsight. He leads a really interesting team there that is dedicated to continuous discovery, which leads to continuous delivery. So today we're going to talk about how you systematize creating these continuous insights to ship deliberately and with intention across a variety of autonomous teams. So thanks so much, James, for joining us.


James: All right, thanks so much. I'm really pumped to be here.


Erin: Fantastic. So lay out the background for us. You're leading this team, and you've really made continuous discovery a hugely important part of how you ship product at Pluralsight. Why is that the case, and what does that look like?


James: Yeah, so I think there's a bunch of different elements within that. The key driver behind it all is single-item flow efficiency: how do we get units of value to our customers as quickly as we can? From an engineering viewpoint, that's where continuous delivery came from: how do we continuously deliver code as and when we've developed it? And in order to do that, you really need a flow of continuous discovery too. You need to be able to identify customer pain points and work out ways to make those better, and also work out what opportunities there are for us as an experience to help with our overarching mission. For us, that's to democratize technology skills around the world. So we have our north star there, and we have the themes behind how we deliver.


James: That's what we've made into our whole framework of Directed Discovery within Pluralsight, which was built largely by Nate Walkingshaw, our chief experience officer. And we enable that throughout the organization in order to help us scale. So we have a way of doing it, right? It's a broad framework for building product that gets value out as quickly as we can. We've built it with intention and really focused on our practices to enable scale. It also means the teams don't have to have a whole conversation about how to build. They can really focus on the needs of the user and get to what we're really driving at, which is shipping value. So within my world, we have six small autonomous groups. We call them product experience teams, and they consist of a product manager, a UX designer or two, and a few full-stack devs. We've got data scientists and machine-learning engineers where appropriate, and they're supported by DevOps people and product marketing professionals as well.


James: So these small groups can push to production all day long, and they do. They build experiences without a steering committee; they don't have to seek approval through me, and there's no chain of PowerPoints they need in order to launch something. They have full autonomy to ship to production. So that's how it's set up at the high level. Critical to this is the culture we've built around the teams, which Pluralsight has developed quite intentionally. We have a vision of what we want to achieve in the world, and that's to democratize technology skills. And that seems so, I don't know, so succinct, but that's the beauty of it, right? Every single decision we make trickles down from that overarching mission.


James: And once you've identified that and people can buy into it, everything else just strangely falls into line. It can feel messy on a day-to-day basis, but we know where we're going. We have that as an overall vision, but we're also trying to hit a goal for 2022, and then we work backward through an OKR process to where we want to be this quarter. So the teams have an objective, they work out what key results they want to drive in order to meet and measure that objective, and it's up to them to create a possibility toward that objective.


James: So everything's within the team's power and span of control. And that's the way we love it, right? We want the team that's closest to the customer to be able to make those decisions, set direction, and build.


JH: From a practical standpoint, how much variance is there in the ideas that get explored to reach some of those objectives? Say there's objective A and a couple of key results that align to it. Do you have a pretty good sense of the types of ideas the team's considering to impact those key results and achieve that metric? Or is it truly pretty open-ended, where they go off and do that discovery as part of the process, and might come back with ideas you hadn't even considered but that seem really aligned with what they're tasked to do?


James: Yeah. So, I mean, the reason we're able to confidently enable these teams is that the Directed Discovery process really outlines the methodology, the step-by-step process, through which we test whether our going-in position was actually the right one. And a going-in position is often biased, right? We think we know the answer. But more often than not, we realize quite quickly that what we thought was a good idea wasn't. And that's because we talked to the customer, right? So after we've set up and we understand our objective, the first step is really to understand which one of our personas we want to target this product for. And then we go off and talk to people in that persona group.


James: So whether that's learners, or technology leaders, or a certain type of learner. There are differences in behavior between structured and unstructured learning, and novice and expert learners have totally different methodologies for how they learn. We derived that insight from talking with customers and also from observing our click-flow traffic, looking at it from a quantitative and a qualitative viewpoint.


James: So once we've identified what type of learner and which group, whether they're novice or expert or whatever group we're looking for, we go in and talk to them. We have voice-of-the-customer segments, and it depends on how big the feature is or how certain we are, but we're always talking to customers, all day long. So that's the first step: really understanding the problem.


James: And then we prototype. We're trying to build an experience that helps us learn, and our rate of learning is never greater than when we start prototyping, right? We can actually show that target group what we've got in mind, derive insight from that, and build out a synthesis of learnings that turns into something we're eventually 80% confident customers will like.


James: And that's our internal gating mechanism for working out what we want to build. Now, crucially, the prototyping stage doesn't mean we're not building code; we often are, particularly with data products. With data science, it's very difficult to prototype on paper or in an InVision sketch. We need to actually hook up some of the data. And we're fortunate to work here with some of the greatest data scientists I've ever worked with in my career, who can build prototypes really quickly: a system that can automatically recommend what the next best course for you is, or one that looks at the various tags in a course transcript and identifies the key part of the course you need. Not the whole course; which bit do you actually want to look at? So they can build the actual prototypes.
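
For a flavor of what a quick data-science prototype like that might look like, here's a minimal sketch of a tag-based "next course" recommender in Python. This is illustrative only, not Pluralsight's actual system: the course catalog, tags, and similarity measure are all invented for the example.

```python
from collections import Counter
from math import sqrt

# Tiny invented catalog: course -> tags (illustrative only).
COURSES = {
    "python-fundamentals": ["python", "basics", "syntax"],
    "intermediate-python": ["python", "syntax", "oop", "testing"],
    "ml-with-python": ["python", "data", "pandas"],
    "java-fundamentals": ["java", "basics", "oop"],
}

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two tag-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def recommend_next(watched: list[str], k: int = 2) -> list[str]:
    """Rank unwatched courses by tag similarity to the user's history."""
    profile = Counter(tag for course in watched for tag in COURSES[course])
    candidates = [c for c in COURSES if c not in watched]
    candidates.sort(key=lambda c: cosine(profile, Counter(COURSES[c])), reverse=True)
    return candidates[:k]

print(recommend_next(["python-fundamentals"]))
# -> ['intermediate-python', 'ml-with-python']
```

A real prototype would swap the hand-written catalog for transcript tags and click-flow data, but even a toy like this is enough to put in front of a user and start learning.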


James: And then, of course, we put some level of low-fidelity UX on top of that and identify what key research topics or themes develop for that prototype. Once we feel like we've got something that we're 80% sure has value to the customer, then we move into the build stage. And that's more of a, you know, a build-to-earn, I guess, rather than a build-to-learn.


James: But that framing is a little more merchandising than we're really going for. Really, we're trying to help everybody learn, so we're building to help everybody else learn. So we can then release to production. And we do it in stages: a pre-alpha, which is internal people having a go at a feature; then an alpha, which might be anywhere from 1% to 5%, maybe 15%, of traffic. If that goes well, it goes up to what we call beta, which is half the traffic. And then it goes to general release, out to everybody. But even in that process we can learn, and if the metrics we're looking at start going the wrong way, we can pull it back, understand what went wrong, and identify whether there was an issue with whatever it is.
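
As a rough illustration of that pre-alpha, alpha, beta, general-release ramp, here's a sketch of a percentage-based rollout with a metric guardrail. It's a toy under assumptions: the stage fractions follow the ramp James describes, but the hash-based bucketing, metric names, and thresholds are hypothetical, not Pluralsight's actual tooling.

```python
import hashlib

# Stage -> fraction of traffic, mirroring the ramp described above.
# The exact numbers and flag plumbing are hypothetical.
STAGES = {"pre-alpha": 0.0, "alpha": 0.05, "beta": 0.50, "ga": 1.0}
INTERNAL_USERS = {"employee-123"}  # pre-alpha audience: internal folks

def in_rollout(user_id: str, feature: str, stage: str) -> bool:
    """Deterministically hash a user into [0, 1) and compare to the stage fraction."""
    if stage == "pre-alpha":
        return user_id in INTERNAL_USERS
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0x100000000  # first 32 bits -> [0, 1)
    return bucket < STAGES[stage]

def next_stage(current: str, metric_delta: float, guardrail: float = -0.02) -> str:
    """Advance the ramp unless the key metric regressed past the guardrail;
    on regression, pull the feature back to pre-alpha to learn what went wrong."""
    order = ["pre-alpha", "alpha", "beta", "ga"]
    if metric_delta < guardrail:
        return "pre-alpha"
    return order[min(order.index(current) + 1, len(order) - 1)]

print(in_rollout("user-42", "new-recs", "beta"))  # stable per user per feature
print(next_stage("alpha", metric_delta=0.01))     # -> 'beta'
print(next_stage("beta", metric_delta=-0.05))     # -> 'pre-alpha' (rolled back)
```

Hashing on the feature and user ID together keeps each user's assignment stable across sessions while giving every feature its own independent slice of traffic.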


James: But we'll add that to our learnings as we build out to production.


James: But after we're in production, we do quantitative analysis. This is how we measure our customers' happiness and their engagement. Are they actually doing what we set out for them to do? Whatever our hypothesis was, whatever key result we wanted to move, are they doing that, and is it working in production as we hoped?


James: And the last thing is we're constantly iterating. These teams are sweating the results, looking through 'em, asking, hey, is this one exactly right or wrong, and then reaching out and talking to customers again, a cycle which begins anew. It's adding that continuous discovery to engineering's continuous delivery that makes Directed Discovery such a powerful process.


Erin: So there's a lot there. The first thing that's striking is it sounds like you have a very set framework, with steps that follow a particular sequence. I'm going to make the assumption that those steps can take a long time or a little time depending on what you're building. Do you always follow each and every one of those steps each time? Is that consistent, whatever you're building?


James: Yeah, so, it depends ... Everything in product development depends, but we're very rigid about talking to customers. We'll always do that. And more and more, it's becoming easier and easier to do each step. So there's an assumption there that it takes a long time. It really doesn't have to, and it depends on the level of the feature, or the size of the impact we're trying to make with any given feature. So yeah, we do scale up and down depending on certainty and risk. But what we're really trying to do is get that value out there quickly so we can learn from it. How do we de-risk it, or make it the smallest unit of value, so that we can free up that team to push as needed?


Erin: It sounds like you're talking to customers several times throughout that process. When you say "we" are talking to customers, I'm curious who on the team is doing the talking. Is it a particular role, or is it everybody on the team, or does that depend on the stage? What does that look like?


James: So that's a great question. I'm glad you brought that up, 'cause that's also something really special about Pluralsight. Often it's the UX lead or the product person leading it, but we always have an engineer in the room, and it might be a different engineer or a data scientist or what have you, but there is always someone like that in the room when we're doing voice-of-the-customer stuff, because we want the whole team to have context. The way I try to coach my teams is: "You're a Pluralsight experience person, right? You have a special skill, in that you're a software developer or a data scientist or wherever you come from, but first and foremost you're an experience person. How do you identify those needs, prioritize, and ship value in a way that makes sense for that team?" And this goes to diversity in the team. I mean, (A) we pride ourselves on the sort of diversity people usually think of when they think of diversity, but we're also thinking very heavily about diversity of thought too.


James: And I see this within my leadership team as well: having people with different backgrounds from disparate disciplines brings a very healthy level of conflict to the team. When people say conflict, it's not about extreme conflict where everyone's yelling and it's horrible. It's about, "Hey, I have this idea." "I also have this other idea, or this other way of executing on that idea. Have you considered this?" "No." Or, "Have you considered that?" "Yes." And what comes out at the end of it is a much stronger solution. By ensuring that everybody has some depth of knowledge and feel for the customer problem in the first place, that conversation is rooted in the customer lens.


Erin: Correct me if I'm wrong, but do you have researchers on your teams? Is UX research, design research, or product research a function within the company, or is it distributed across design?


James: I call them full-stack designers; they really do the whole thing, from idea and research through implementation, and I love it. It's what I've admired about engineering over the last 10, 15 years. It used to be, "Hey, this guy's a Java engineer." But engineers don't like that, man. They're engineers. They'll work it out. And I'm seeing the same thing within Pluralsight: "I'm not a UX researcher. I'm not a UX designer. I'm not just one type of conceptual designer. I am a full-stack designer. I can understand and build and create a possibility all the way through that chain." Which I love. I mean, I think that's what we all do, right?


James: And you know, I try to encourage that sort of thinking in everybody. Yeah, you might have more depth in one area. Please use the expertise of other people who have more depth when you come up against a skills challenge, but don't be afraid of it. And there's one other element to that: we have a really well-defined design system. A central design system that everybody can pull assets and icons and all the rest of it from, so we don't have to have that continual conversation about what sort of design we should use. We all know, and then we can pull in the parts and put them into experiences as they make sense.


Erin: So with so many folks being basically full-stack, whether engineers or designers, where some have more depth in certain areas and rely on each other, how have you empowered different folks to conduct research? I know you use a variety of methods: qualitative usability testing, interviews, prototypes, the quant stuff, A/B testing. How do you create and expand those skills across these functions and these teams?


James: Yeah, it's kind of, we just do it, right? We model what we do, which is a culture of learning. We're trying to inspire that in other countries ... Other countries! Yeah, we are inspiring it in other countries, but we're also inspiring it in other companies. It's: how do you bring that together?


James: So, you know, we have product guilds and things like that, where people can present what they've learned to other product people or anybody who's interested. Same thing with our architects, same thing elsewhere. We have these communities of practice that share best practices across the company. And as bits of the company become good in certain areas, other teams just naturally seem to want to use the same techniques in what they're working on.


James: You know, A/B testing is a good example. We've been bringing that competency to Pluralsight. There's been some level of it, but it's increasing over time, and all the teams are really trying to work out how to use A/B testing effectively. With the same guardrail: A/B testing can lead to a lot of ... what do you call it? ... optimizing toward a local maximum, rather than looking at what's possible across the broader structure. But if we maintain our voice-of-the-customer muscle, understanding directly from the customer, and then also look at how that plays out in an A/B test when we get to a customer-confirmation testing process, then I think we're okay.


James: We've married that qualitative skill with the quantitative ability of A/B testing, and we're using it in two or three bounded contexts to start off with; then everybody will start picking it up. Again, even though we set out a really defined architecture in terms of people and systems, there's a whole bunch of organic stuff that happens as well. That's perfectly fine. We totally encourage the teams to help other teams out.
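
To make the quantitative half of that concrete, here's a minimal sketch of reading out an A/B test on a conversion-style key result with a two-proportion z-test. The counts are invented for illustration, and this isn't Pluralsight's actual analysis stack; in practice a readout like this would be paired with the qualitative confirmation James describes.

```python
from math import sqrt, erf

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> tuple[float, float]:
    """z-score and two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)       # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    phi = 0.5 * (1 + erf(abs(z) / sqrt(2)))        # standard normal CDF at |z|
    return z, 2 * (1 - phi)

# Invented counts: 4.8% vs 5.6% conversion on 10k users per arm.
z, p = two_proportion_z(conv_a=480, n_a=10_000, conv_b=560, n_b=10_000)
print(f"z={z:.2f}, p={p:.3f}")  # roughly z=2.55, p=0.011: lift unlikely to be noise
```

A significant p-value only says the lift is probably real; it can't say whether you've climbed a local maximum, which is exactly why the qualitative side of the loop still matters.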


JH: Yeah. My latest pet peeve is when people, at a very abstract level, talk about what a product manager is versus a UX designer, and they do the Venn diagram of where they overlap and where they differ. Well, it depends on the team, right? If you have an autonomous team with a UX designer who hates facilitating interviews and a product manager who loves it, then for that team, that's probably the arrangement. And if you have the opposite on another team, they probably do it the other way.


JH: The metaphor that comes to mind is basketball teams, where there are going to be some general trends: the tall guy is going to stand near the hoop and probably rebound a lot, and the shorter, faster guy is going to dribble and shoot more. But if the tall guy's open and wants to shoot a jump shot, and he's good at it, then that team should do that, right? And I think that's a big benefit of autonomous teams: you don't have to spend all this time haggling over crisp, rigid role definitions. You can just let the teams work together, and based on different strengths and interests, one team can operate a little differently than another. And that's probably better for everyone. They probably get better results. They're probably happier.


James: Yeah, absolutely. I mean, we really hold psychological safety to be a big part of that. And that means being able to have your own ideas, put them out there, and challenge each other in a really respectful way. And we do a lot of work on team health.


James: Actually, to your point, JH, we've got a great functioning team at the moment where two of them really love to battle things out and argue, at a whiteboard, in a respectful but animated way. And there are two other people on that team who don't like being involved in that. So what they do is: the two who love to argue it out go into a room, argue it out, come out with a solution, and then present that to the rest of the team. And the rest of the team then provide their conflict, or whatever their opinions are, to it. I can't believe that level of self-awareness and team health. So that's part of our management structure: helping teams self-diagnose who's good at what and what style works for whom, and being able to mix and match so they can navigate through that.


Erin: Autonomy requires a lot of trust, as you've talked about. And working together in small teams to make things happen requires a lot of interpersonal skills, whether that's people locking themselves in a room to whiteboard it out and argue, or whatever that might look like on different teams. Is there a particular profile or a set of traits that you hire for, to make some of those things possible?


James: Yeah, definitely. And we're very, very prescriptive on this: we do not hire for cultural fit. We find that if you hire for cultural fit, you get people who are all the same, and that just destroys our idea of diversity of thought. We hire on a values basis, looking for people who can live up to the core values we strive for. You can live up to those core values in many, many different ways, and we actually really respect and want people to have as many different ways of living those core values as they possibly can. So that's how we hire. It's not about cultural fit; it's not whether they'll fit in here. It's whether people can hold those values to be true. And that makes room for a lot of different approaches.


James: We're also looking at whether candidates coming in here are inspired by the mission. 'Cause if they're not really inspired by democratizing technology skills, then, you know, there are plenty of other places out there hiring right now; they might not be a great fit for us. Then it's the things you'd admire in most candidates. Are they self-starters? Can they communicate well? And that doesn't mean they have to have a huge grasp of English. It means: can they talk with and understand their fellow teammates, on an emotional level and a technical level? And then we go into skills, like team skills.


James: So we're actually looking at ... Yeah, it's often easy to see a superstar within a team and say, "Oh, that person's amazing." What we really value as well are the people who may not have the flashiest skills and may not be the loudest in the room, but when you put them on a team, that team seems to do better. So how do we value that? How do we identify those team-builders who just seem to make everything go so much more smoothly? Because those people are incredibly valuable as well.


James: If we can identify and celebrate them, that raises all of us. Those people multiply the impact of all the acknowledged superstars. It's those people who are really good at making the team work.


JH: It seems like this is easy to believe in and go along with when it's going well. Are there any stories in the company's history, any lore, of a time where this blew up but you stuck with it anyway? Or, coming into it as a new person, maybe feeling it out at first and seeing a team make a misstep? Do you hold those up as learning examples? I assume somebody has shipped a bug, or something has happened that didn't go perfectly to plan. How do you use that to make the system more resilient instead of changing course?


James: Yeah. So I think the biggest problems occur when teams don't ship, when for one reason or another they're not producing stuff. A bug is one thing, and we can pull that back pretty quickly, so it doesn't really worry us; because of the architecture, the release schedule, the metrics, and all the rest of it, we're not so afraid of that. It's when teams don't ship that they seem to have breakdowns, for some reason. The emotional intelligence becomes a lot harder, and we get out of that healthy conflict zone and into non-healthy conflict zones. That's when I've seen problems, and when a team isn't shipping, that's when I'm looking at my leadership team and myself to say, "Hey, what's the hold-up?"


James: It's not about pushing anything out. It's that there has to be some unit of learning we can get from building in production at some point. Usually it comes down to a confidence level, or to people not appreciating that, from a leadership viewpoint, we're totally behind them. If there is a mistake or they screw up, we trust our process to catch that pretty early in the system. And, to the credit of the services, we don't have many single points of failure, so there really is good resilience in the process. But yeah, things happen, just like anywhere else, and we react. The other thing is that when something happens in an area, that team reacts pretty quickly, 'cause they feel a real level of responsibility to make it work.


James: And it's not like there's some other QA team or problem-management team. Yeah, DevOps has a whole notification process, but once a thing's identified and needs fixing, that team gets on it. I don't think it's ever been me asking them to; they jump on it, and half the time they're doing it before I even know about it. Then again, it's pretty rare because of the way we've set things up. At the end of the day, we try to get it right, build on that process, work out how to make it better and more reliable, and work out how to remove single points of failure as they arise.


Erin: Very good. Anything we didn't ask that you want to talk about?


James: Yeah, sure. Well, we're hiring here in Boston and in Utah, and we're looking for the right people who are inspired by that mission. We're always interested to talk, and we have plenty of meetups and ways to get to know Pluralsight better. So, that's it. And I don't feel like I've oversold it or anything; it really does feel like a special place.


JH: Cool. I'm just going to add that User Interviews is also hiring; I'll get my own plug in while we're at it.


James: Awesome.


Erin: Thanks for listening to Awkward Silences, brought to you by User Interviews.


JH: Theme music by Fragile Gang.


Erin: Editing and sound production by Carrie Boyd.

Creators and Guests

Erin May
Host
Senior VP of Marketing & Growth at User Interviews

John-Henry Forster
Host
Former SVP of Product at User Interviews and long-time co-host (now at Skedda)

James Aylward
Guest
James Aylward is GM and Chief Product Officer at PerkSpot (at the time of our interview, James was SVP and Head of Data Product at Pluralsight). He’s passionate about identifying customer needs and rapidly designing and developing products that offer solutions. He has 10 years of experience managing products and nearly 20 years of experience in the field.