The Product Manager

How to Balance Speed and Value: The Essential Role of Empathy in the AI Product Era (with Megan O'Rourke, Executive Director of Product at Metalab)

Hannah Clark - The Product Manager

The Tin Man’s silk heart in The Wizard of Oz is a perfect metaphor for AI right now—it can convincingly mimic empathy, but it’s not the real thing. In this episode, Hannah Clark sits down with Megan O’Rourke, Executive Director of Product at Metalab, to explore the delicate balance between AI-powered productivity and human-driven resonance. They dive into why empathy and storytelling are irreplaceable in both product development and leadership, and the risks we run if we trade them away for efficiency.

Megan shares stories from the field, frameworks for deciding when AI belongs in the workflow, and practical ways leaders can embed human connection into their teams’ processes. From reading the silence in a meeting to designing for emotional moments, this conversation is all about building products—and cultures—that people can truly connect with.


Hannah Clark:

Whether or not you personally participate in the Wicked movie hype train, I'm reasonably sure you're familiar with The Wizard of Oz. In the story, the character of the Tin Man is portrayed as a humanoid robot who wants nothing more than to have a heart. But when he finally receives one, what he gets is actually an imitation of a heart, a silk bag of sawdust that convinces the Tin Man that he has the empathy and emotional depth of a human. If you haven't figured out where I'm going with this, the Tin Man makes for a pretty intriguing allegory for where we're at with AI. LLMs have reached a level of advancement in which they can pretty convincingly mimic things like empathy and deep human understanding, and that presents dual effects. On one hand, astonishing opportunity. But on the other, hidden traps that can slowly melt away your impact. My guest today is Megan O'Rourke, Executive Director of Product at Metalab, an end-to-end product design agency. Megan's role sits at the intersection of design, development, research and people management, and so she's keenly aware of the connection between storytelling and user value. And perhaps more importantly, what's really at stake if we override empathy for efficiency, both in product development and in leadership. We discussed the balancing act between AI powered productivity and human-driven resonance, why storytelling is as critical as data, and the question that every product leader should be asking constantly—if they have the courage. Let's jump in. Oh, by the way, we hold conversations like this every week. So, if this sounds interesting to you, why not subscribe? Okay, now let's jump in. Welcome back to the Product Manager podcast. I'm here today with Megan O'Rourke. She's the Executive Director of Product at Metalab. Megan, how are you doing today?

Megan O'Rourke:

I'm doing so well. Excited to be here.

Hannah Clark:

Me too. So can you tell us a little bit about your background and how you got to where you are today at Metalab?

Megan O'Rourke:

Absolutely. I got my start in advertising at Crispin Porter + Bogusky, this big agency in the States known for creating very bold, very disruptive, very culture-shaping sort of advertising. So I was part of a team that managed the creative process, reviews and production, and through that it really instilled in me an appreciation for detail, for high craft, and also just the exposure to working with world-class creatives. Even though I didn't consider myself a creative, I just loved being part of the creative process. So I loved advertising, but I found that over time I was yearning for something just a little bit different. Not these sort of one-and-done campaigns, but rather something that felt a little bit more living and breathing, so you could build and evolve and ultimately respond to a need and a desire. That shift ultimately led me to Metalab, which is where I've been since 2016. So coming up on 10 years now. It's a long time in agency land.

Hannah Clark:

In any job now. Yeah.

Megan O'Rourke:

In any job. In any job. But I mean, yeah, it really speaks volumes to the type of work that we do and the people that we work with that's keeping work really interesting. And I'm happy to be there. Yeah. Metalab is an end-to-end, very design-led product agency. So we partner with companies ranging from early-stage startups to Fortune 500s. Some clients come to us to go full zero to one, which is always fun and challenging, whereas others come to us for maybe a partial or a full redesign or overhaul of an existing product. So we're very design-led. At our core, we're very focused on craft and outcomes. We're not about vaporware. We don't have big strategy decks. We are focused on just building really great products and making products that matter. So my role today: I'm the Executive Director of Product Management at Metalab, so that means I lead our PM function. I am involved in some of our higher-profile projects as well, and these days I am thinking a lot about how we build happy teams, and teams that build great products.

Hannah Clark:

Yeah. Well, I'm so excited to talk to you because all of these experiences really funnel well into our core topic today, which is all about how product teams can leverage storytelling and empathy in this age of AI, such a chaotic time, so full of gray area. So in the spirit of storytelling, I'd love if you could share an anecdote that sort of puts in perspective, you know, what do we have to gain, and maybe also what's at stake if we don't prioritize these very human elements of storytelling in our organizations and our products.

Megan O'Rourke:

I'll share an example that is more on the personal side. Like, it's not a flashy product, it's not a big strategic decision, it's just this very, like, human moment. And it's about someone on my team who is this really smart, really engaged individual. They're always super vocal in meetings, they bring ideas to the table, and over the course of a couple weeks, I just started to notice, like, they were going a little bit quiet. They weren't speaking up as much. They seemed a little withdrawn, and there was nothing dramatic, there were no, like, performance concerns, but there was just this shift that I could kind of feel. And I thought in this moment, I was like, if I was running completely on automation, if I was using AI to attend meetings for me and just tracking along on things that needed to be done or dashboards of work completed, I probably would've missed it. But I was there, I was paying attention, I was reading the nuances, and I was able to see a bit of a difference and just reach out to them and say, Hey, how you doing? Like, nothing formal, but just a human connection or reach-out. And through conversation I realized that, you know, there was something deeper going on. They were feeling really overwhelmed. They were not really sure of kind of their place in this complex and fast-moving product, and they just kind of felt invisible. And it was this, like, aha moment for me, because I thought, AI is so great at so many things, and it's not great at picking up on the absence of information or the quiet moments. Like, it didn't catch that hesitation of that person just muting or unmuting and then choosing to mute again. It doesn't say to us, something's off. And I feel like that's what we notice as humans. So, small example there. But I think what it represents for me is, in leading with the human approach and with empathy, what we have to gain here is trust. And I look at that as, that's really the foundation for how we collaborate, how we support each other.
In this example, how we give feedback. Those are so core to how we work. And to answer your question of what's at stake here: if we over-rely on AI in these moments and we let it, you know, stand in for us being present and attentive and having intuition, then we're losing out on those signals. And when individuals on the team feel like people aren't paying attention or people aren't caring, that's a really quick way to get to lack of engagement or feeling withdrawn. And it would be awful to lose great people, not because their work doesn't matter, but because they don't really feel seen.

Hannah Clark:

Yeah. Okay. Multiple reactions I have to that, because I feel like this really hits. One thing that kind of comes to mind, first of all, is I feel like what you're touching on is kind of a reckoning that a lot of us are sort of facing about, you know, what is the role of work and what is the role of a company in our culture, in our society, if we are trying to automate away the things that make these things meaningful to us. I think it's a really important reminder that there are things that, you know, we could choose to automate, but we still have to be selective about. What do we stand to lose by removing ourselves sometimes from those really essential human interactions that create culture, and that nurture and foster culture in an organization, and make people wanna stay for nine years or 10 years or more? The other thing that I think was really poignant that you said was about this idea of what is not being said, the idea of storytelling being a lot about what is not revealed. I had a very acute moment of that recently. There's a fringe festival that just completed in my area, and in my non-career time I am a performer, and a lot of the storytelling that's happening there, you know, I've been kind of observing it more with the idea of maybe doing my own show, you know, a non-professional show. But something that has really kind of come to the surface when I see really good storytellers, and I think this applies, you know, even in a product or professional context: the really good storytellers are able to shape moments and a narrative by being selective about what they include and what they omit. And so, like, in some cases you'll see a story that kind of takes you on a whole journey over several years, or it can be, you know, through the course of a sales funnel or whatever, but it's the moments that you choose to include, and also being selective about what moments you don't include, and what moments you have to pay attention to that are absent from, like, a certain narrative.
So I think it's kind of interesting that you're bringing this up now, because this is something that's really been on my mind: storytelling is as much about what we choose not to say. You're right. Like, this is something that, as people, we have such a unique ability to derive meaning from certain things that aren't explicit, and also derive meanings from silences that are very subtle. Anyway, it's a bit of a tangent, but I'm liking where this is going.

Megan O'Rourke:

I love that, and I think, we'll probably get into this later, but I think that's the important element of bringing the human purpose. There's so much information, but it's the discretion and it's the intuition and it's the experience of understanding what makes the cut and what's going to create that compelling story. All of the data can be presented to you, but what are you using? What are you excluding, and how are you packaging it up? That's the nuance that I really see as the benefit coming from the human side of things. I don't want AI to make that call for me.

Hannah Clark:

I agree. I don't want AI to necessarily make that call. And I think that there's a really important stage of this. I mean, when we think about a sales funnel, for example, you know, we think of the awareness stage, and a lot of that maybe is suitable for being generated. You know, there's a lot of awareness-stage kind of stuff that we can do that makes sense to kind of automate. But that nurture part of the sales funnel, I think that is an area where we have such a huge opportunity to be able to really infuse very deeply personal experiences, foster an emotional connection, leverage community. Like, I feel like there is such an opportunity there. And I think that the hunger for really compelling human elements is going to only increase now that there is such a volume of generated content.

Megan O'Rourke:

Oh, I agree. Hannah. I think we're very much aligned there. That will become increasingly the differentiator.

Hannah Clark:

Yeah. And, okay. When we talk about differentiators in product design, like what do you look for in practice? Like when you're, you know, developing a narrative or like working with a client at Metalab? What are some of the more specific strategies or exercises that you might use with your team to create conditions for like empathetic design?

Megan O'Rourke:

Okay, so I'll start by saying, like, humanity and empathy, I think that shows up not just in the what we build, but the how we build it. My teams have to hear me say this all the time: it's not just what we build, it's how we do it. It's the experience of working together. It's the experience of how we listen, how we challenge, how we're creating space for the honest and the real conversation. I'll share one specific exercise that we love to use just to help build empathy for our end customers, and then also just one more philosophical or kind of overarching strategic approach for how we create those conditions. So the first is the exercise. You're probably familiar with this, as are many PMs, but it's going back to the basics of experience mapping. So this is something that we do with most of our new clients on a new project, and it's a simple exercise, but it's so powerful because it puts you in the shoes of the customer or the target audience, and you have to step through the journey with them in mind. And you pause moment by moment, and we ask at each stage: what is the user thinking? What are they doing? What are they feeling? And we're identifying what are those friction points and what are the emotional highs and lows. It's important that we do this with our clients, even those who know, you know, the entire product lifecycle inside and out. It's an empathy-building exercise because it grounds us in, ultimately, a human problem before we just jump into solutions, which is easy and fun to do. So we find that's a really helpful grounding exercise, not only for our team, who might be ramping into a new product or a new industry, but also just to help align stakeholders and ground them in the things that matter, and that's people and the problems that we're trying to solve. So that's the first one. And then the second is more of just, like, an overarching approach for how we crit work.
So embracing critique and how we make decisions as a team. And like many organizations, I'm sure, you know, we're running regular design and engineering crits, but approaching it with a very empathetic and human lens. So rather than opening up a rubric and checking boxes and saying, does this look good? Does this meet the requirements? Is this to spec? The first questions that we ask have to be grounded in the user and their needs. So we ask ourselves, like: is this meaningfully solving the user's problem? Is this intuitive? Is this clear? Can it be understood without explanation, without the designer voicing over the work? How does this make the user feel? And yes, does this move the business forward? Ultimately, is it creating value for the user? And so even just making sure that the first lines of questioning when we're critiquing work start with the user is a really important part of how we approach critique and collaboration. And ultimately that comes down to a very empathetic mindset, and it's very intentional with how we work and also with how we hire at Metalab. Empathy is just a baseline. Like, we require that cross-functionally. So our designers, of course, engineers, yes, PMs, but, like, everyone needs to have an empathetic mindset. And the last thing I'll add to that is, you know, when you're creating the conditions for folks to feel like they're advocating for the user, they're not bringing, like, personal perspectives to the table; they're a bit more objective. And sometimes that means bringing ideas that are out of left field or out of scope. It's really important in those moments to not shut things down, but instead just to stay curious and to stay open. That's just really a core part of our culture in terms of, like, how we're challenging each other in service of solving the right problem for the right person in the right way.

Hannah Clark:

Okay. I like what you brought up here about the way of working, and obviously, you know, keeping human elements central, I think, is a through line that we'll have through this whole chat. I wanna talk a little bit about expanding on what we sort of talked about with regards to where human intervention and connection is critical, and where AI really is, you know, the solution to step in. Like, when do we kind of make some of those decisions on what is for AI and what is for people? When you think about your team and how things are done and led at Metalab, how do you kind of make the call between what is a task to outsource to a robot and what do you keep in the human realm?

Megan O'Rourke:

Not to totally reframe the question, but I think it's rarely one or the other. I think it's often about knowing when we leverage AI and when we lean on that insight to kind of go deeper. I can share an example within the context of a project that we've worked on, and when we chose to index one direction versus the other. So we were working with a big global athletic manufacturer; we were helping to improve their internal tool ecosystem. So think about things like production workflow throughout the production lifecycle, like the physical production lifecycle. And the company had done a ton of research, but what they were really missing was a bit of clarity and a through line, and really an emotional hook that could convince people to go all in on executing on this initiative. We started by leveraging AI 'cause we needed to make sense of mountains of research and needed to cluster feedback. But what it didn't do was really hone in on the feeling for the end users of these products. And ultimately, that's what mattered to our stakeholders. Like, they already had the business case, but they needed to understand why all these broken tools were causing as much frustration as they were. We parked a lot of that AI-driven work and we got in the field. So we spent weeks on site with dozens of user groups to really understand what was working and what was not. And if you've ever had the pleasure of sitting next to someone who is struggling to do their work, where the work is the workaround, as we say in these scenarios, it just builds such a deep empathy and understanding, and it allowed us to get way, way more meaningful content that we were ultimately able to use for storytelling purposes. For example, we got really great sound bites where you can just hear the frustration in people's voices, just lamenting the challenges that they were facing. Capturing videos where you could see people working through the workarounds for their day-to-day work.
And yeah, just those moments of insight and empathy that come from sitting down next to someone who's working through a problem that they've normalized. And when you're just sitting with that, you just can't untangle yourself from it. And ultimately those became core pillars for how we were telling the story back to our stakeholder group. So considering the high emotional weight of that problem, these are people's day-to-day tasks and work; this is how they feel about their job. And then considering those kind of storytelling needs of the stakeholders, and bringing them along in a way that could make them feel like they really understood the impact of the challenges that their teams were facing, through those lived experiences. So it's a long way of saying it's rarely either-or for us. It's often about finding those right moments where AI can be leveraged, and then also staying open to its limitations, being open and critical about pivoting back to those more human-led approaches.

Hannah Clark:

That's a beautiful answer. There's so much in there. I appreciate the flexible approach, because I can resonate with this idea of the beauty of being able to be in a field research position and be able to really connect personally with people who your product impacts. There's really no substitute for being able to get really up close to, you know, well, how is this person experiencing this? Because to just parse through data and just get insights, you know, that are broad strokes, they're very disconnected from the actual impact that you would see if you're on the user side of the screen. So yeah, I think that's a really lovely way to look at it. And kind of what I'm getting from that, to sort of summarize that back to you, is being flexible enough to kind of lean on AI to sort of parse and organize and make things digestible, but to lean more on humans to take on more of an interpretive role, and kind of run interference a little bit for: what is the real impact? What am I missing by not understanding the user's pain point intimately, that can maybe force me or cause me to overlook some of the data, the way that it's presented in kind of a parsed format?

Megan O'Rourke:

Totally. Well, and I think, yeah, if you were to just look at these large swaths of data, you're missing out on the nuances that come from those soundbites. We even found, in leveraging AI for some of those in-field research summaries, that quotes were misattributed and the emotions were really flattened, and it just didn't have that same resonance. So I feel like it's embracing that moment to say, okay, we were able to use AI to get us here, but what's that differentiation? What's that compelling story point? Like we were saying earlier, that nuance of understanding, okay, how am I gonna get this across the finish line in a compelling and human way and build upon the things that AI is able to accelerate for us, is going to continue to be the differentiator there.

Hannah Clark:

My experience with that, that calls to mind, I used to work, I was also in a marketing background, and we used to have an agricultural client. I promise this connects to digital product as well. We used to really rely, with our marketing strategy, on just straight-up results. Like, we would just talk about, you know, what are the results you can expect? What is the average result that the average customer gets in your area? And just really kind of lean on a very data-first, very flattened, as you'd say, approach to how we're going to kind of sell the, you know, the results or what you could expect from this product. I had an amazing opportunity to actually go to farms where the product was being used and interview families and communities that were actually hands-on, like, using the products. And, okay, this might be, like, a little bit way more human than the average user, you know, depending on the solution that you're behind, but I had an opportunity to spend some time with a Hutterite colony. I don't know if you're familiar with Hutterites. There's, like, parallels to be drawn with Amish communities in the United States, but basically they're very live-off-the-land, they're very handmade, everything is made from scratch, this very low-technology-usage community. And it was amazing to be able to see the real impact that a product would have. Those results, those little decimal points that we would think of in a data-driven, like, a purely data-driven campaign, seeing what those actual decimal points mean in a real-life context completely changes how you think about what that means as a marketer, as a product person, as whoever is behind that solution. And it really reframes how you empathize and therefore develop and, you know, strategize around the needs of those people, because they're no longer just, you know, like, customers that are buying and getting these results. They're people who are really impacted day to day by what you're offering them.
So I, yeah, I mean, I don't think everybody is so fortunate to be able to get that kind of a hands-on experience. I do think that there's a lot of value in finding ways to get closer and spend time with users, to be able to kind of see that journey from their point of view, and, like you just described, to really be able to see that journey and where those friction points really impact so many people around them. Anyway, there's a little story time from me.

Megan O'Rourke:

No, I love that. And here's what's so great about that: when you are reflecting upon that experience, I can hear the care in your voice and I can hear the impact that has continued to have on you. That is something AI can't do. AI is not going to care. It's not going to care deeply. It does not have a heart. And sure, there's, you know, the memory that you can look back on in time and say, what is that? But it's not going to have that same care and conviction that you've just shared. I would argue that it's actually more important than ever, especially in a remote culture. Many of us work in distributed teams. We aren't on the ground day to day with our end customers. We have to prioritize those in-person moments and connection because we're having less and less of that. So sure, it might cost more money and take more time. You have to make a business case to go spend a week on site with farmers who are using these products every day, but that is such an incredibly valuable use of time, because you're carrying that experience then in your heart and you are caring deeply. I think it's money well spent, money and time well spent.

Hannah Clark:

I agree. I think that this is an era that's really highlighting the difference between raw productivity and being really customer-centric and results-focused. And I think being results-focused in this context really means asking, you know, is what I'm doing connecting? I can put out things really quickly, I can accelerate cycles, I can, you know, do all these things, but if it doesn't truly connect with who I'm trying to reach, and if it doesn't really solve people's problems at the root cause, that productivity is cheap.

Megan O'Rourke:

But is it valuable?

Hannah Clark:

Exactly. Exactly. Yeah. Okay, so let's get tactical then. So if you are working with clients who are really eager to integrate AI into everything, what would be your framework for determining, you know, like when do you recommend leveraging AI tools in, say, like a client's product versus a human led process?

Megan O'Rourke:

We tend to ask ourselves three questions, but before I reveal what those are, I'll just harken back to the fact that Metalab's been around for about 20 years and we've seen a lot of technology waves. So there's always hype. There's things that take over, you know, it's Web3, oh, it's mobile-first design, now everything's AI. And there's conversations that we have regularly where we are reminding ourselves the technology itself isn't what matters most; it's whether we are driving toward positive outcomes. And I'll attribute this quote to our chief design officer, Sarah Vienna, total badass. She has this great question, this provocation, that's super direct and very constructive, and she asks: does this product even need to exist, or this feature, or this element? And really the question there is, are you building something because it solves a real problem for real people with a real need, or because it's trending and it's something that can just be leveraged? The second is: is AI really the right tool for the job? So is it making the experience faster? Great. Smarter? Let's hope. Impactful? It needs to be. Or are we just asking it to do something that it's not good at, which might be being empathetic or building trust? And then the third is: what is the risk if it gets it wrong? So low-stakes output, like content tagging, you're like, yeah, sure, low risk. But if it's something that ultimately affects accuracy, or someone's sense of being understood, or even just being dismissed if they're not happy with the AI output, then I think that deserves a line of questioning and validation of: does this really need to be leveraged? So I think that's ultimately why the grounding question of does this need to exist, and why, reminds us that AI should never just be the reason something gets built. It should be used to empower and enable the work to be better.

Hannah Clark:

I really like that, and I think that is a very crucial approach and line of questioning that can easily kind of get lost in the fold. I had a conversation ages ago, this is, you know, one of those addendums of, you know, additional resources, but a long time ago we had a conversation with Samantha Gonzalez, who, I'm not sure if she's still there, but she was working at a firm that was very specialized in ethical product design. And that was a big layer as well in their process. The way that they kind of framed it was, like, she tends to use a lot of improv exercises in her process with clients, and her thing was: what is your product's evil twin? And it was kind of a way to look at some of the ethical weak spots or areas of oversight. Or, yeah, to kind of look at: does this need to exist? Is this a good thing? You know, just kind of a different way of framing. So this is kind of reminiscent of the process that she used, and that conversation happened well before AI was even a thing. Now we're talking about this in, you know, there's so much excitement about this technology that I feel like sometimes we get a little bit into the hype and don't really think about some of the ramifications that some of these things have, that are easy to overlook in the excitement and drive toward the solution. So I think this is a process I hope everybody starts to adopt as a matter of course.

Megan O'Rourke:

You've probably experienced this in using other products, and you can take or leave this example, and I won't call them out by name, but I think about a product I have used for the better part of a decade. I love this product. I'm a diehard user. And it rolled out an AI feature that I think really missed the mark. It's a fitness tracking app. So I, like, uploaded a hike from this weekend, and I got this nice little AI summary that was clearly meant to be a pat on the back. Oh, great new hike you've never done before, great elevation, like, all this stuff. Would've been great if it was true. None of that was accurate, and I felt, this sounds dramatic, so, like, disappointed in this product and the rollout of that, because I'm, like, a diehard user of this app, and I just saw content that wasn't accurate. So it felt really generic, it felt off base. I'm like, do I trust this app anymore? Or is it just gonna make these broad strokes? And so I think that's just a small, like, personal example, but one risk of what's at stake. When you have diehard users who aren't resonating with, like, your AI feature, credibility erodes. Now, I'm still gonna use the app, but I just wanna dismiss that little AI summary now. Like, I don't need it. It doesn't actually provide any value, and it does not pass muster of: does this actually need to exist? Does this solve a need, or is this providing user benefit? No, actually quite the contrary.

Hannah Clark:

Yeah, I have had a similar reaction, I'm sure, to products that I've used, but also others that I've kind of seen emerge where it's kind of, like, a different area of skepticism, in which I feel like the potential for abuse of the technology that is being developed is, like, so clear to those who aren't working for the product. Okay, so I'm not gonna name names either. And, you know, the solution basically that I saw was being able to use a sample of someone's voice, a very brief sample of someone's voice, to create and generate audio content similar to what we're doing right now. And I can see the desire to help creators scale and to be able to produce more content and put out more value faster, and with less time spent having conversations, as if that's a waste of time. I think this is the best part of my job. But anyway, I can see the value proposition, and yet I kind of feel like, I mean, is it just me, or does the potential for abuse of something like this maybe overshadow the value in terms of productivity? And is that a good thing, that we're not having real conversations and trying to outsource them? These are kind of the questions that I'm asking, and this is kind of what I mean about the hype kind of overshadowing or causing people to kind of lose that layer of, I wanna say, common sense. Because, you know, I don't know if I think it's a good thing that we are able to take a small sample of someone's voice and replicate that however we see fit.

Megan O'Rourke:

Does it need to exist?

Hannah Clark:

Does it need to exist? Yeah.

Megan O'Rourke:

And importantly, it's: does it deserve to exist? Is there a fundamental why, and what is the trade-off there? Is that robbing us of the full conversation that you and I both love?

Hannah Clark:

Yeah. Well, the full conversation.

Megan O'Rourke:

It does not deserve.

Hannah Clark:

Exactly. Well, I feel like right now we're in a time and space where, not to be dramatic, but the products we build now are truly, tangibly shaping the world we live in. The product decisions we make now aren't just about what kind of value we are and aren't creating for stakeholders. They're also about what kind of value and what kind of culture we're laying the groundwork for, for generations to come, and even for ourselves in the near future. We have much bigger questions to ask ourselves now that we've been bestowed with this pretty significant power. Anyway, this is getting into philosophical territory, but there's this layer of questioning: does this deserve to exist? Is AI the answer here, or are we taking a route that's better left in the bin? Yeah. Let's talk about storytelling in product design. The introduction of AI has obviously changed the way we approach narrative and storytelling. I'm curious how you use human storytelling in the process of developing something new. You mentioned you do some zero-to-one development. What does that look like in process? How do you distill human storytelling into that early development process of a product?

Megan O'Rourke:

And I think this hearkens back to the earlier part of this conversation, so look at us connecting that thread. At our core, storytelling has always been a part of our design process. We've always been of the mindset of: no hundred-page decks; show the prototype, bring the experience to life. AI has certainly accelerated certain parts of our work. It allows us to synthesize things faster and spot patterns quicker. But that element of nuance and discretion, the what and how and when, is rooted in human empathy. At Metalab we have a broad range of clients, so how we work with a two-person founder team looks very different from our partnership with an enterprise organization. For example, with the two-person founder team, do you think they want to sit through a big deck? No. They want us to open up Figma, walk through the prototype with a lightweight narrative, and discuss and unpack as we go. Sometimes we're riffing live, because that's what works well for them. And that's an understanding built through stakeholder interviews and the way their organization runs. It doesn't require all that perfectly curated storytelling. Whereas with other partners, product teams who might sit within large enterprise organizations and be bringing along wider stakeholder teams, there's a lot of care and crafting that goes into those presentations and that storytelling, creating super defensible positions and rationale for every decision made. And it's that understanding, through experience, yes, but also intuition and empathy. Because at an agency we have the pleasure of working with so many different clients at so many different stages, it's really that tailoring of the process and the approach that is human-led.

So we'll have a baseline process, but we rely on the project team caring deeply and asking the critical-thinking questions: what is going to be best suited for this audience, for this stakeholder need, and for what we're hoping to achieve? So human empathy is what's driving the tailoring of the how, the what, and the when.

Hannah Clark:

Cool. So to dig deeper on that, do you have a specific example you could share of this empathetic approach, and how it's uncovered an insight or a solution through that kind of workshopping?

Megan O'Rourke:

For sure. We've got this one very cool client called Tally Health. They're a longevity-focused company, and they focus on at-home testing, biological age testing and health recommendations. So they've got a lot of people who are super passionate about health, which is very cool. Anyway, they came to us with a lot of existing research and a pretty strong conviction about what problems they were solving and who they were solving for. So they were like, okay, let's get straight to design. But our team kept feeling like there was this missing layer. Objectively, we had all the data, but we were missing the behavioral why, the emotional context. So we got to work with some early research and prototyping and started simulating what the delivery of results would look like. Imagine you send in your test and you get your results back. This can be a pretty emotional experience, and sometimes a fairly confronting one, learning your biological age. And what we learned was that people whose biological age came back younger than their actual age felt really affirmed. They were like, I've made the right life choices and health choices. And the folks whose biological age came back older experienced a range of emotions: fear, shame, sometimes denial. And the difference here was the mindset, not just the number. Even people who had received quote-unquote negative or difficult results were motivated to change. So all of this to say, the data was there, but the behavior needed to be understood. And it turned out to be a really impactful insight that drove a major feature within the product: personalized health plans. Interestingly, that wasn't on the original roadmap. Once we saw what was in the research, it became a real cornerstone of the experience, and it's actually a main retention driver within the product.

That's just an example of how, had we only followed the data, we would've missed that nuance. It was empathy, that human approach, that really revealed the insight, and then our design team could shape it into something really meaningful and impactful within the product.

Hannah Clark:

That's amazing, and a very succinct example. And I feel like that's such a huge thing to discover, too, because it opens whole avenues for a product portfolio: next steps and how the product can mature. So yeah, what a huge thing to discover through that very empathy-led process. I think that's an amazing example. I know we've taken up a lot of your time here, soapboxing and going all over the map.

Megan O'Rourke:

No, we're passionate. We're passionate about this subject.

Hannah Clark:

Yes, exactly. That's great. I'm glad that's clear. I do want to come back to the product leadership aspect, because we've got a lot of product leaders who listen, and I know all of us are grappling with how to maintain this human touch, how to strike this balance between AI-powered productivity and making sure we're still very much focused on the human components of our organizations. So, for the product leaders listening, what are three actionable steps, in your view, that you would recommend teams start implementing next week?

Megan O'Rourke:

Yeah. These are simple ones, things you can try that don't require a lot of buy-in; you can just get on with it. The first is sharing a story, not just statistics. We've talked a little about the importance of bringing that human voice in: bring a soundbite, a quote, a raw clip into your next sprint or team review, and let it speak for itself. It's really important to let your broader team sit with and feel the problem rather than just see swaths of data. So that's one: share a story. Two, and this is a personal one for me, is creating one moment each week to check in or give feedback. I practice a little something called Feedback Fridays. Do it at whatever cadence works for you; it doesn't have to be formal. Just a quick check-in with a team member, or a small note of appreciation, goes a long way. AI can help track standups and whatnot, but that recognition and that outreach come from you. So that's definitely an easy one, and it feels good, especially on a Friday, to wrap up a week like that. And third is what I'm going to call an empathy rubric. For those critiques, create your go-to list of questions to bring to your product reviews and crits, maybe even to inform roadmap decisions, rooted in user-focused questions: does this meaningfully solve the user's problem? Does this deserve to exist? How might someone feel in this moment? It's really driving the line of questioning that ultimately shifts the conversation from "did we build it" or "did we build it right" to "are we building the right thing for the right reason." So it's really not about not using AI; it's about using it with intention, and using AI to help create capacity for the things that I think matter most, which is really creating moments of human connection.

Hannah Clark:

So elegantly said. Oh, Megan, thank you so much for joining me today. This has been such a great chat. It's been all over the map, it's been cathartic, it's been empathetic, it's been beautiful. Where can folks keep the conversation going with you online?

Megan O'Rourke:

Yeah, so I definitely encourage folks to check out metalab.com, which has our full portfolio, but you can also follow us on LinkedIn or Instagram to keep up to date on new work and portfolio updates. And as for me, you can find me on LinkedIn. Happy to connect with folks.

Hannah Clark:

Wonderful. Well, it's been wonderful connecting with you, and thank you so much for making the time. Thanks for listening in. For more great insights, how-to guides and tool reviews, subscribe to our newsletter at theproductmanager.com/subscribe. You can hear more conversations like this by subscribing to the Product Manager wherever you get your podcasts.