Episode 177: The Evolution of User Research: A Conversation with Steve Portigal, Author of Interviewing Users

Steve Portigal is an experienced user researcher who works with organizations to uncover insights about users, improve their research practice, and aid them in their product development. Ten years ago he first published “Interviewing Users”, and he has recently released a second, updated edition of the book.

It was eye-opening to have him on the Product Thinking podcast to find out why he felt this update was a much-needed response to the changing ways in which research is and can be conducted.

Read on to discover some of the key points made by Steve in our chat.

You’ll hear Melissa and Steve talk about:

  • 05:13 - When Steve wrote the first edition of his book, Interviewing Users, his advice and experience were centered exclusively around the principles of in-person research. The world of work and meetings has changed a lot in the ten years since that edition, and now remote interviews are far more common, especially since the pandemic. Looking to the positives, Steve comments that geographical diversity is now far more possible when conducting research. In the past, you would target a certain place where you’d be able to call on a lot of different participants, namely urban metropolises rather than more rural areas. Doing research remotely allows you to look further afield more easily and create a more diverse cast of participants. Though Steve admits that, on the other side of the coin, the need for technology introduces a new barrier of its own.

  • 19:52 - Steve makes an interesting distinction between the person and the ‘thing’ being investigated in an interview, and he notes the importance of understanding whether you’re looking at the person in the research or their device itself. For more open-ended interviews looking at the person, it can be useful to follow Steve’s lead and send the subject ‘pre-work’. This could consist of one question or several, but either way it ought to be a small bit of work that the interviewee can do, without spending too much time, in advance of the interview itself. More important than the answers themselves, the act of asking gets the cogs moving in the subject’s brain before the interview and can help open them up.

  • 30:01 - Whether an interview has been conducted successfully is hard to gauge because it’s impossible to be sure what would have happened if you’d done it slightly differently. Steve’s perspective is that being a bright, curious, and extroverted person is the first step to being a good interviewer. He recognizes that this is the most natural way to approach this kind of situation: filling air time, talking a lot to show interest, nailing your questions, and putting some of yourself into that conversation. But Steve thinks that the next level up from this requires being comfortable with the unfamiliar task of leaving yourself at the door. Some of the best answers he gets come from saying very little, simply encouraging the subject to continue rather than hitting them with the next question, and even saying nothing at all and letting them fill the space.


Intro - 00:00:01: Creating great products isn't just about product managers and their day-to-day interactions with developers. It's about how an organization supports products as a whole. The systems, the processes, and cultures in place that help companies deliver value to their customers. With the help of some boundary-pushing guests and inspiration from your most pressing product questions, we'll dive into this system from every angle and help you think like a great product leader. This is the Product Thinking Podcast. Here's your host, Melissa Perri.

Melissa - 00:00:36: Hello, and welcome to another episode of the Product Thinking Podcast. Lately, the importance of user researchers has come into question, and I've been seeing a lot of angst towards product managers on LinkedIn, which makes me extremely sad. Many user researchers have been laid off because companies are not recognizing how valuable they are. I personally think this is wrong and that user researchers are an incredibly valuable part of any company. So today we're going to get into it. Our guest is Steve Portigal, user research consultant and author of Interviewing Users, which just came out with its second edition after first being published in 2013. We're going to talk about best practices when it comes to user research, as well as how companies should think about hiring user researchers and how they work with product managers. But before we talk to Steve, it's time for Dear Melissa. This is a segment of the podcast where you can ask me any of your product management or surrounding questions, and I answer them every single week. If you have a question for me, go to dearmelissa.com and let me know what it is. Today, we're going to the phones.

Caller - 00:01:39: Hi, Melissa. I work in a startup where the products consist of a hardware device that is connected to an app. Where could I learn about good practices of product management in this kind of product setup? Up until now, we had a separate person managing the hardware and the software part. But what are the industry's best practices? Do you recommend books or podcasts that could help me learn more about this topic? Thank you.

Melissa - 00:02:04: All right. So in non-software related products, the cadences at which we do product development are different, but the basics are kind of the same. We have to strategize, we have to experiment, we have to validate, we have to develop, and we have to iterate. But when we're working with hardware versus software, we do have different timeframes. Of course, it takes a lot longer for us to produce and develop something that's in hardware. And it's harder for us to actually iterate and experiment on it because we have to go back and release all those things. So we have to take that into account. But we're still doing kind of the same standard building blocks. Now, when you are trying to partner with both software and hardware and have them come together, they are typically led by two different people because the strategies that you would use in hardware and the way that you build it are different than what you would do in software. But those two leaders have to work together very, very closely to make sure that they have the same strategy, that they have a combined strategy, and that they are helping coordinate between the teams. So we have to understand how software is going to fit into hardware. We have to understand the timelines that it's going to take to actually develop and release. And we have to figure out what our boundaries are. So in some products, it sounds like you have an app. You can actually update apps. You don't have to update the hardware every single time you update the app unless you need some information from the hardware. That works well. So you can still iterate, you can still work on your software app, where you don't have to be confined only to the hardware release cycle.

But in other instances, you're going to need to update the hardware to be able to get the information to the app, it sounds like. So this is where you need to work super closely with the hardware side of the business and make sure that you're in sync here and that you're getting ahead of it. So software has to make sure that we're coming ahead of their release cycles for hardware and make sure that we're going back and forth about what requirements we have and how we work together. I actually had Yassi Bayani on the podcast, who worked for Fitbit and ran the software side of it. And she talks a lot about how she worked with the hardware side as well and what they did to come together to make sure that all was streamlined. So you can go listen to that episode. It's episode 116. And she'll talk a lot about how they did that at Fitbit, which, as we all know, is a pretty well-known hardware company out there. Now, in hardware companies, you can also experiment. Apple did a lot of experimentation when it came to the iPad and the iPhone. They just don't talk about it super regularly. But we still want to make sure that we are building the right thing for our customers. So the more you partner on that, the more you work closely together, the better your products are going to be at the end of the day. So I hope that episode helps you out. And I wish you the best of luck with that. Now it's time to talk to Steve.

Are you eager to dive into the world of angel investing? I was too, but I wasn't sure how to get started. I knew I could evaluate the early stage companies from a product standpoint, but I didn't know much about the financial side. This is why I joined Hustle Fund's Angel Squad. They don't just bring you opportunities to invest in early stage companies, they provide an entire education on how professional investors think about which companies to fund. Product leaders make fantastic angel investors.
And if you're interested in joining me at Angel Squad, you can learn more at hustlefund.vc.mp. Find the link in our show notes. Welcome, Steve. It's great to have you on the podcast.

Steve - 00:05:11: Thank you so much for having me.

Melissa - 00:05:12: So you just released the second edition of the book, Interviewing Users. You wrote the first one back in 2013. What made you want to write a second edition? What's changed since then?

Steve - 00:05:23: Part of the impetus was just personal. It's been 10 years. This is an anniversary of something that, for me professionally, has been really rewarding and significant. It's the first book I wrote. And you know what it's like to put a first book out. It's just kind of this amazing accomplishment, personally. And I was really thinking about what does 10 years mean for me? And I started to kind of reflect on what that means for the field that I'm in, for user researchers. And I will say that Rosenfeld Media, the publisher, asked me maybe a couple of years before that, would you ever want to do a second edition? And I kind of blew them off. Like, nah, why would I ever do that? And it was only once I was thinking that there was something I wanted to do in response to this anniversary that someone else asked me the question and said, would you ever want to do a second edition? And sometimes you're open to something and sometimes you're not. And the universe just kind of directs you there. So the second time someone prompted me with that, I thought, yeah, that's the thing to do. It's been 10 years. Of course, I have new things to say. This is all kind of on the personal level. I guess in my mind, I think I resisted it because I thought, oh, the book is about the techniques that we use of listening and asking questions and following up. And those are, they're very human. They're evergreen. They're just part of, you know, how we're wired and how we should practice. And I think that was a naive reflection on what the book is about and what it needs to be about. And, you know, in 10 years, the context in which we do research, and who does research, and I know we'll probably spend some time on that, has changed. And if you go back some number of years, research was primarily done by agencies and consultants.

And so who's the book written for has kind of changed. There are now in-house researchers in vast numbers. There are now research leaders that build teams and build practices. And it just, it wasn't always this way. You know, there was not a robust practice of research operations that was there to make an organization, you know, equip them to do research. So the context in which we ask questions and listen to follow-ups and, you know, try to build rapport and obtain insight and so on has really, really changed over 10 years. And I think that realization helped me acknowledge to myself, oh, there was a lot more that needed to be said. There was, the context had changed. And I guess I'll just add to that, you know, over those 10 years, I have been doing research and I have been teaching research. So the story I want to put across has changed. I have techniques that I have refined, or I have stories of my own failures and successes, things that went wrong, things that were in the first edition that I would do and then realize, oh, that doesn't work, or I've ignored my own advice. And so I think I'm better equipped to, even if it's, even if some of the information is the same, I'm better equipped with more examples, more nuance, more reflection on it, just from living with it, doing it and talking about it for 10 years. So I think once I got over that hump of like, oh, I do want to do this, then it was just a really exciting, I don't know, kind of a torrent of like stuff that I wanted to say that I had to say that was new or better and kind of reflects the 10 years that we've all gone through.

Melissa - 00:08:36: What's one of the things that you changed in the book because you reflected on it and said, oh, that's different now or something has changed or the market has changed or the way we do research has changed?

Steve - 00:08:47: I mean, there's a really simple one. 10 years ago, my research practice was heavily in-person. Obviously, there's a lot more remote work. And so the infrastructure and the interactions have changed a lot. So that's a big one, but just a very tactical one. Within that, I always believed in giving people cash as the incentive, as the honorarium, as a thank you. And I think I started to see a lot of programs, third-party sites, that would make it easier for you as a company to administer giving someone like an Amazon gift card or something. And I felt like those practices came about to make it easier for the research side and not the participant side. And so I've long been an advocate of like, let's make this the best for the participants. And so cash, like in an envelope at the end of the interview, thank you very much. This has been very helpful. Here's your envelope. Not only cash, but I mean, I'm almost embarrassed to say this. Like I used to like to give people large bills. If it was $150 for two hours of their time, I'd go to the bank and get like $150 and hand them crisp, new, high-denomination notes. It's been a long time. Like, you know, clearly the pandemic, I think, changed how we use money, but that's changed a lot. And so I'm almost embarrassed to describe this because it seems very dated now. Like if someone gave me a hundred, like what would I do with that? Like, even if someone gives me a 20, like maybe I'll use that in some transaction. But I'm just speaking about North America, I know, and like elsewhere in the world, right? They've moved into digital cash so heavily for so long. And so that's an example of not even research changing, but just the way the world is, what we do as individuals, has changed. And so I took the cash piece out. It had become bad advice as 10 years went by and the world moved on. That's a very simple one, but that was one where I knew I had to update it.

Melissa - 00:10:47: I definitely remember going to the bank and getting cash for some early research studies that I did back in the day. And I didn't even think about that, like how that's just not a thing anymore because you don't see those people in person. I guess that's another one you were bringing up, the remote research. I imagine most research is done remotely these days. How has that impacted the way that we interview users or the types of user research that we do?

Steve - 00:11:12: The week that you and I are talking, I've just written my first client proposal to do in-home research. Like I haven't done in-person research since before the pandemic. That's many, many years now. And that's, you know, my whole identity as a practitioner is kind of around that. And so it's been interesting to see kind of what's changed and what hasn't changed or what's afforded by that. I think just with all sorts of remote work, there are things that it enables. Geographic diversity, you know, you can bring people together without having to travel to them. So I think, you know, in research, we would often sample, you know, you might do it locally if you're going to be in-person or you might pick some target cities. There are cities that are, like, good market research cities. Or, I'm in Silicon Valley, so you want to get out of the valley and get to where real people are, or something that kind of creates some difference. Or you go to a metropolitan area where people don't own cars and where they use transit. And so that sort of density, metropolitan life, creates ways that people use mobile devices or ways that people use delivery services. So you can kind of go to these environments. Rarely would you go to a rural environment where there was less infrastructure, because you're not going to get access to as many participants for your travel dollar. So now if you do remote or you do a mix of remote and in-person, then you can get those rural folks or people with less access to certain kinds of infrastructure. You can kind of incorporate people that you wouldn't otherwise get to have in your samples. So I think there's some potential for more inclusion. I think, yes, if you're going to rely on access to technology, then you are now introducing a divide.

People that have consistent schedules, where they know they're going to be somewhere, where they're going to have a reliable device and a reliable internet connection. That does create some inclusion challenges. And is that better or worse than not? I don't know. I don't want to sort of value assess it, but I think it lets you revisit sampling and what your considerations are. I think as researchers, we can be very good at it. You and I are in kind of a remote environment, not unlike, you know, a Zoom research setting or something, and we can do our best at maintaining eye contact, asking follow-up questions, managing our body language, and so on. It's different than if we were in the same room, but we can be okay at it. I think, you know, what I miss is what happens to me, or if I'm out with clients or colleagues, what happens to us? It's not just that 60 minutes or 90 minutes that's in a living room or in an office versus in this box. It's that we go somewhere. If we're going on site to see how somebody's IT setup works, we're driving, parking, going through security, going into the elevator, going into the break room.

We see all sorts of secondary aspects of culture that explain or give maybe some context as to, oh, why is this IT administration done this way? Well, of course it is. This is a place sitting there next door to a pig ranch or in an IT cluster. There's all this other stuff going on that gives us context. As researchers, we get the benefit of literally being pulled out of our comfort zone and being a little wide-eyed about going into a different environment and meeting multiple people and just seeing. And this is true for consumers or business to business. Just personally, I really, really miss that. That's just some of the most stimulating, unexpected, hard-to-plan-for aspects of user research. So I think that is a loss. And, you know, I think everything is trade-offs. I'd like to see some more kind of hybrid approaches to research where we give ourselves and our teams the benefit of that sort of a shock to the system, like in a good way. And we get access to the most broad cross-section of people that we possibly can.

Melissa - 00:15:06: I miss going into the offices and like seeing people too, or seeing them in their natural environment. On one hand, I feel like it's easier sometimes to line up interviews that you could do on Zoom and see how people are using things. But on the other hand, there's this type of research sometimes that's context dependent, right? Like where are you sitting? What are you looking at? What's going on then? How do you determine, as somebody who's about to go out and interview users, when it's okay to say, hey, I'll get enough information, I can just do this one on Zoom, versus, hey, I really need to see their environment, right? Like it's better if I actually see what they're working with every day. Maybe if they're working from home, you don't want to do that, or you ask them a few more questions. But like if you have to go to a site or see where they're actually experiencing the problem, when does that come into play? When does that become critically important to do?

Steve - 00:15:52: Sometimes, you know, the tail wags the dog: the scope or the timing or the access. You know, sometimes there are just logistical reasons, the budget, where it's not feasible, or, and I think this is your question, we would have to make the case. This piece that I was talking about that I've been proposing to a client, someone else on the team actually suggested it. So I didn't have to kind of prove it. But in that case, we know what we are looking at. And in fact, it's work that's going to be evaluative. So we're proposing a solution. We want their reaction to it. But we know that it's a home environment with some medical conditions going on. And so there are multiple individuals involved that are providing support. And we expect that there are lots of workarounds in the environment that we want to see as opposed to ask about. So I don't know what we'll see, but we were just kind of riffing on it. And I was imagining, oh, the kind of thing where you're going to see sticky notes on the mirror in the bathroom. Or you're going to see food organized in the fridge a certain way. Or you're going to see little sort of work productivity tools that might have been brought into the home as a way to deal with or help either the caregiver or the patient kind of manage things. So we had a hypothesis about the context. Even though our question is to evaluate something, we also want to understand why we're going to get that feedback, or even understand the opportunity. Like that solution is a hypothesis that says, here's what we think you need, and here's how we think we can help. The project is in a state where if we understand the problem better, even though we've got solutions that have been kind of sketched, then that's good.

They're open to that outcome. So if we know what the problem is and we're confident we know what the problem is and we want to evaluate a solution, then we're pretty far down that path of like, I want to say double diamond, but I don't literally mean double diamond. We're down some kind of, you know, development and ideation path. And maybe we don't need to understand the problem as much. And so we do. Like researchers, you can sort of look two ways. Are we looking at the person or are we looking at the thing? And obviously it's a mix of both. And I think this particular situation, like we are casting an eye towards the person. We know that we don't know everything about them and the situation, the opportunity, their workarounds, their tools, their pain points. I apologize for using that phrase. I think there are more closed ended questions like research objectives where it is a little more closed ended, where we can kind of. Take what we see and feel like that's sufficient. I mean, research is about what you don't know that you don't know. And how big is that aperture? Like how much appetite does anyone have for that question? I'm always going to be interested in that and sort of try to get to what I don't know that I don't know. And that for better or worse, I think there are so many closed ended things. We need to understand this thing. We're at this point in the development. We don't need to go to people. We've gone to people in the past. We understand this larger context. It's all taking place. It's all taking place on the screen. So the more you can draw a solid, not dotted line around the kind of what it is you think you want to know. These are sort of squishy metrics that I'm kind of giving. But I guess that's kind of how I tend to think about it.

Melissa - 00:19:07: That makes sense. And to me, it sounds like the problem that you want to learn more about, even if you have a solution, is it's not happening necessarily on a computer. It expands past that, right? It expands into the home and expands into the care that somebody is giving on a day to day basis. And in this case, it's not like somebody using a productivity app on a screen and just managing what they're doing from a computer perspective. It sounds like it affects their entire life, which would be a great reason to go see what they're doing outside of screens or out of software.

Steve - 00:19:39: I love what you said because there's almost like a quick and dirty heuristic, right? Is the thing that you want to research taking place in the digital environment you're going to be doing the research in? So could you or I screen share to look at some aspect of the tool, or could I look over your shoulder literally while you do something? If not, now we're asking somebody to hold up their phone and show us, or send us pictures, or walk us around, or just describe to us. And then that puts us at a disadvantage. So I think it's almost like a binary: are we looking at the computer, or are we looking at the context of the computer? That might be a very quick way to say we might want to see a little more.

Melissa - 00:20:18: Yeah, for the people who are maybe listening to this and they've got it where it's not just a computer, right? There's another environment too. But they can't necessarily get buy-in or budget to go to the place because I'm sure we'll get into getting buy-in in a minute. But like they can't get that buy-in or budget to go there. They would love to do it, but they're just getting denied all the time. Is there a way that people could share like photos or their environment? Like what's the next best way to get a user to open up when you're interviewing them about their environment or their context or what they're experiencing and seeing on a day-to-day basis?

Steve - 00:20:51: And just to blow that out a little bit, I mean, I think in an interview, let's just say the research method we're going to use is to interview people. There are lots of things that you can tack on to that. To me, sometimes it gets called pre-work. That's maybe an ugly term, homework. If you get someone to agree to participate and to talk to you, there's no reason as part of that request you can't ask them to do something else, something before, something after, whatever that is. So you can send them something and have them look at it. You can give them a Google form and ask them some open-ended questions. Tell me about a time when you... Show me a picture of something in your house or something in your work environment that symbolizes this kind of meaning. A thing that you've done with your devices that you're very proud of. A thing you've done with your devices that you wish could be better. They can be very quick. When I say homework or pre-work, it could be a five-minute activity, or think about it for a week and then, you know, take a picture and write a sentence. Those things serve to prime the person. Like it gets them thinking about this higher order aspect. I think I've had people take pictures of, right, it was some mobile device and we just had them take a picture. We sent them an activity every single day for a week before we met with them.

We just had them take a picture: where do you put it at night when you're not using it? What's the accessory that you care about the most? There were just seven sorts of prompts. So it wasn't "show me your contacts." It was "show me the kinds of things" where, of course, they could be a little bit creative or reflective about it. Invite them to just send something. And it doesn't mean that now I fully understand their context as if I was there. Then that comes into the interview. It gives you something to talk about. Hey, can we go over these pictures that you sent? So you sent this picture, and I was intrigued because it says you love this, and this tells me more about X. And so you get all this storytelling that kind of fleshes out these images. So I think the literal version of what we're talking about is like, send me pictures so I can see it. But I think the more advanced approach is to take some pictures so now we have some things to talk about, things that invite the participant to share more richly the stuff that I won't get to see. We're not replicating being there. We're facilitating a more reflective conversation and sharing by using some media, by having them do or show something beforehand that is relevant to or adjacent to the thing that you want to get them to show you when you're talking with them.

Melissa - 00:23:23: That's a really interesting way of doing it. I like the idea of giving them some pre-work and then you've got some context to jump in and start asking questions about it too. I just ran into this issue, which I have not run into before, with participants in a research study in Germany. And we were interviewing them and they did not want to share their screen or have their faces up or anything. And they were very, very concerned about privacy. And this was not a healthcare product. This was not a financial product. It was not super sensitive. And I get it in those cases. But this happened a lot with the participants in Germany. What do you do in that situation? Do you just try to find other participants for user research, or can you keep interviewing them? What happens if your whole population of people is unwilling to share? I guess this happens in finance as well, because I've heard that from banks that I've worked with, where it's like, I can't go directly to those people, or ask those questions, or ask them to show me how they use their bank account because it's got their personal data and I won't be able to see it. What do you do in those situations where people are not allowed to show you?

Steve - 00:24:25: Finding that in a whole country is super interesting. Like what cultural thing that you didn't know about did you kind of stumble upon? So I think there's a couple of aspects here. One is sort of setting expectations clearly. And, you know, I think the situation you're describing is maybe one where you didn't know that you would have to set expectations on that, because that's the thing about cultural differences. You don't know that there's a difference until you kind of come into it. So that's how everything is; you learn from making a mistake. You know, I'm just thinking about times I would go to someone's home to talk to them about some device. We'd ask them, do you have this? Do you have a video camera? Do you use it? Do you make these clips? Do you share them with family members? And then I'd get to the environment and like, oh yeah, that's in my other house, or that's in the house where my kid's dad lives. And so I found myself trying to conduct an interview where I thought I was going to see the technology and have them demonstrate, and it's not even there. So then my screening got a little more specific. And, you know, we want to meet you in the environment where the technology that we've just asked about is. And so, where would that be? It seems silly to have to ask that, but, you know, it's happened to me a couple of times. So do you start to say in your screening outreach, you know, we'll want to talk with you, we would like to have a face-to-face video call to conduct this? Maybe there's something about, here's how we're going to protect your privacy. I did work with a bank, and their screener had a lot of boilerplate text that said that we don't want to see account numbers. We don't want to see balances.

And in fact, I think they probably had data privacy policies or regulations that meant they really should not be seeing that information. They were basically trying to make the person comfortable and being clear about what they would want. Now, let's just say there's a German belief or preference. I don't know. Let's just take your sample and I'm going to make it like a broad truth. If you know that no one in Germany is going to want to share that, do you sort of put that in as a filter and try to find those German users that would be comfortable with that? And is that skewing something? Oh, now we're getting all the privacy-cavalier Germans, as opposed to sort of the more baseline Germans with respect to that. Is that going to skew your sample in some way? Or are you going to be like, oh, these are lead users, right? You sometimes go to people that are not representative of the population but that will engage with you in a certain way. I mean, even people that agree to be in studies represent a slightly different thing. So I think that's an option, to kind of filter those folks out, or to kind of go into it knowing, hey, here's what this is. I mean, in the olden days, we did telephone interviews. They weren't the greatest, but we would call people and have them on speakerphone.

And I had a little microcassette recorder that I would put down on the table that would record what was in the room: me, and them on speakerphone. And we would interview them. So it's possible; there's a history of that, but we would have to be prepared for it and sort of think about, okay, how am I going to manage turn-taking and pausing? And boy, if it's in another country, is it another language? A lot of Germans probably speak perfect English, but it depends on who and where and what age. You might have a language issue, and without eye contact, how are you going to manage the signals being sent and received? Maybe there's different phases of the interview. We would like to have a camera on to say hello and get to know each other and do a little warm-up. And then maybe that's before you turn the recording on. And so you get that person's energy, and then you say, okay, let's turn our cameras off and put the recording on. So I guess just sort of parsing out what are the pieces that you can kind of play with. Can you record them, not record them? Anyway, I'm not going to repeat everything I just said, but I think it's about trying to decouple a lot of that stuff, seeing what's possible, and then kind of optimizing for that.

Melissa - 00:28:26: I think that's a really nice way to look at it: you don't have to treat the whole interview or the whole study exactly the same. But I think filtering up front, once you learn about something like that, is good. We did end up filtering out the rest, and we found some people who were willing to share. But security is a very big thing in Germany, it turns out. I know people care about it here, but they really, really care about privacy there. So it was interesting to work with users in a different light, which is always neat, I think, and a good part of user research. So one of the questions I wanted to ask you, kind of building off of this, was: what have you seen people do wrong when it comes to interviewing users? What are some of the most common mistakes people make?

Steve - 00:29:05: Interviews are about letting go of ourselves a lot. And I think the extent to which you do that is maybe unfamiliar to people. If you're new to research, you might feel okay about it because you're good with people. Maybe you have enough extroversion energy you can draw on to talk and listen and put forth a good energy, be kind of a bright, interested, curious, enthusiastic person. And I think that can get you a certain distance. But I think the great stuff comes from letting go of the importance of ourselves. And that sounds a little hand-wavy. If you're new to this, you come in with questions and you ask your questions, you get the answer. I'll put it in air quotes: the "answer" is the thing that somebody says after you've asked your question, and then they stop and then you go to your next question. So that kind of prioritizes your next question a little bit, as opposed to that person. So the idea is to kind of quiet yourself. The highest order thing is like, oh, I'm going to ask my next question. The next step under that is to just give a verbal acknowledgement: uh-huh, yes, okay. You know, some way of saying that you've heard them. If you just do that and just acknowledge what they've said, often they'll keep going. I ask somebody a question, they give me an answer, I just say, okay, and it turns out they have more to say. So, you know, that's kind of turning yourself down a little bit. And then there's not even saying anything. You and I are doing this, or you're certainly doing a great job at it: the "I've heard you" nod that lets me keep going but doesn't make me feel like I'm talking into the void. Much harder over remote, right? You know, if we were in the same room, we would have it a little bit better. We'd hear breath noises. We'd have more body language cues. Like, I don't get as anxious in a long speech in person as I do in this environment.
So those body language cues are really important.

Melissa - 00:31:00: This was the hardest thing for me, because when I'm interviewing users, I'll always be like, uh-huh, uh-huh, oh yeah, right? Letting them keep going and encouraging them. On podcasts, you can't say that; otherwise I'm talking over you the entire time. So I had to learn to sit here and keep my mouth shut so that everybody can hear you instead of me, which I think is a wild difference. But it is interesting, because if you don't say anything and you leave the pauses, people usually will continue to talk.

Steve - 00:31:23: And so you can keep experimenting with, what's the minimum that you can do? So there's kind of a half nod. Maybe this is more of a dude thing. It's kind of the chin thing. Yeah, I like that. The "yeah" that you just said, that's all implied kind of in that. There's also just an eyebrow raise. So I think you can keep playing with what's the least amount of intervention that you can do, because I think we tend to want to overcompensate. And that's all about us trying to take care of the other person. By kind of putting ourselves in there, like, they need all this from me, they need me to be super there with them. And so sometimes you see the almost breathless, like, yeah, yeah, yeah, yeah, uh-huh. And then that goes wrong when you start saying, okay, cool, yes, got it. Those are all affirming what the person says. And so now your positivity about the person starts to be kind of judgment, frankly, about what they're saying. And then you start saying, oh my gosh, oh no, that must've been awful. And all that is not needed. The rapport that you build with people, which is super, super important, comes from how well you listen and not how much feedback you give. And so the instinct is to try to establish that rapport by giving all of this. So it's sort of playing with what starts to feel uncomfortable to you. Oh, I'm being cold to people, but I don't want to be cold. But in fact, if someone tells you something and then you ask a really great follow-up question, as opposed to going on to your next question, that person feels so heard, which I think is the feeling people are trying to create.

So use the tools of follow-up questions. When you say, "you mentioned before that," or "I'd love to hear a little bit more about that," it tells that person: I heard what you said, it's important to me, I want to know more, I'm really into what you are saying. Those are much deeper ways to signal that. And that just goes against all of our social training, all of the ways that we interact with people. We want to talk, we want to kind of give feedback and acknowledge, and maybe even talk about ourselves. The corollary to this, I think, is important. These are all sort of multiple things; you asked for one, I'm giving you a lot, but they all sit together. Researchers, interviewers, often talk about themselves as a way to build rapport. So someone says, you know, I downloaded this update over the air and it bricked my device and I had to go into the store and get it hard reset. And the interviewer says, oh yeah, that happened to my cousin, or that happened to me, it was really awful. And the thing is, you don't need to tell that person how you're like them. That seems like a shortcut to rapport building: tell them, I'm like you, I share that concern.

Oh yeah, I love that show. Oh, I couldn't believe how great that trailer was. Oh yes, we love that ranch topping, we put it on everything as well. There's just no need to do that. It makes it about you, the interviewer, and the goal in the research is to make it about them, the participant. So: how did you know that it was bricked? How long before you were back online? Have you changed your practices for when you get updates in the future? How did you learn about recipes for a ranch topping? Whatever it is, you just keep making it about them. And that tells them that they're important. That's the rapport that you're trying to establish. Talking about yourself is for a very rare situation: it's kind of for when the person is awkward or blocked and you want to bridge something. It's interesting that we're not using cameras today; in the US we're very much camera-focused, and it seems like you and I have different cultural backgrounds around privacy. Like, you say something about yourself in order to bridge something that's awkward, whereas otherwise you wouldn't talk about yourself in that setting. So, just to feel comfortable, again, you're sort of quiet. You're not saying, yeah, yeah, yeah, cool. You're talking about yourself far less than what your expectations are. I think that's the difference between sort of new and more seasoned people thinking about research.

Melissa - 00:35:32: Did you know I have a course for product managers that you could take? It's called Product Institute. Over the past seven years, I've been working with individuals, teams, and companies to upskill their product chops through my fully online school. We have an ever-growing list of courses to help you work through your current product dilemma. Visit productinstitute.com and learn to think like a great product manager. Use code THINKING to save $200 at checkout on our premier course, Product Management Foundations. What's really interesting to me about what you're saying, too, is that there's such a fine line between empathizing with somebody in a way that is going to get you the right results versus over-empathizing and turning it back around on you, right? So I feel like when you were talking about how you could say, oh wow, that must have been awful for you, right? Or, oh yeah, my cousin went into the store and had the same experience. It kind of reminds me of, I think everybody's had this friend, where you go and you want to tell the friend about your day, and then the friend turns it back around to themselves and they're like, oh, my day was awful too, right? And you can kind of come off as that person without realizing it, like you're trying to one-up the user, or be like, oh yeah, I've had those problems too. And that's not really the objective or what you're trying to do there. You just want to be a good listener and learn. But I never really put that back into context until you said it just now: yeah, I don't like that in social interactions, why would a user like that? And you could be sitting there thinking, oh, I'm empathizing, I'm building a bridge here. But instead, you're talking about yourself, and you should be talking about them.

Steve - 00:37:03: That's a great example.

Melissa - 00:37:04: So when you go in there, from what I'm hearing from you, it's not about saying, oh yeah, I've had those experiences too, or, you know, that was bad. It's more about just encouraging them to talk, following up, deep listening, nodding, encouraging people, helping them keep going, and not just moving from one question to the next. Is that the way that we dive in? Yes. So from that perspective too, this is how you go into an interview. What about one of the big issues that I see, and I'd love to get your perspective on this, which is the way that people prepare for user research? And this is what I've run into in product management a lot. I've seen a lot of people wanting to go do user research, get buy-in for user research, but they don't actually sit there and think through what they need to learn, and then make sure that the questions, and the people that they're actually asking questions of, are going to help them learn those things. Instead, they're like, let me just go talk to users. And what I've found has been really successful when I've gone in and done user research, or when we have to do a ton of user research when I come in and help companies with product strategy, that's usually a big thing that we do. When we go out and interview users, we have to write down: this is exactly what we're trying to learn, this is the objective of this research, and not that. And I find that if you don't do that first, you can come back with a hodgepodge of things that don't really tie back to what you want to learn. How do you encourage people who are going to go out there and start to interview users to do that? What should they do to make sure that they're set up to get the best information before they even go talk to a user?

Steve - 00:38:33: I kind of have this, call it a framework or a hierarchy, of a bunch of things that you want to understand ahead of time. So I talk about the business problem or the business challenge, which is, right, we need a new product strategy. I'm sure there's a more nuanced way to say that, but let's just say that's the thing. And then there's the research question, which is, what do we need to know from people in order to answer that? So what's the research that's meant to answer the strategy question? I'm throwing it to you to give an example here.

Melissa - 00:39:01: So for me, when I go in and we'll do product strategy things, typically what we do is we start out by looking at company data. So we will start slicing and dicing and we'll go, for example, like, what are your types of customer personas? Sometimes they don't have personas. We have to kind of figure out, OK, do you have people from a certain industry that's in here? Is it like mid-markets, enterprise? So we're segmenting every possible way that we can. Then we're trying to figure out who's going to be our ideal customer persona. So like ICP work is something that I've been doing a lot of lately with companies I've been working with. And we'll dive into that and we'll try to find trends. And we want to know more about trends. So one of the research questions that might come out of this ICP work is, is there a particular industry where our product is better suited than others out there? Are there specific people on our platform who are utilizing all the products, not just one? And if so, what are they doing with it? What's their actual use case? How are they interacting with it? When did they go into it? And are they happy? So those are maybe two examples of research questions that we would want to learn so that we can then go into the product strategy and say, hey, there might be something here from that use case that we can bridge out further. Or these people in these types of industries or these types of customers might be better suited for the ICP. Maybe we should start to tailor it more towards them. So for us, it could change depending on what kind of research we're doing. But those are two common ones that I just did recently.

Steve - 00:40:22: Those are great. And so you see: what's the business challenge, and what is the research question? What does the research need to identify? And then you also need your research method. What are we going to do to answer that? And so you've got some different sources of data, some of which you've already been looking at. You're going to, let's say, go talk to people. And so you put together a research plan and approach that's based on the research question, which is based on the business challenge. And then, let's say there's going to be interviews: you have your participant questions, the things you're going to ask people. You kind of said the research question is a little bit about how people are going about using some of these tools and what's working for them and what's not. I'm paraphrasing you poorly. For the interview, then, you have a discussion guide, right? Here's how we're going to run this session. So you're kind of working out the mechanics of that: we're going to ask these things, we're going to have them show us these things. The things that the research is meant to uncover are not necessarily the things you literally are going to ask people. You're going to ask a bunch of stuff. I liked your research question. Really, it's not, what are people going to tell us; it's, what are we going to conclude? What's the point of view we're going to come back and report on? So the research question isn't, do people like something? It's this larger umbrella. So those four pieces: what does the business need, what do we need to learn from people, how are we going to go about it, what are we going to ask them? Those are all integrated. They're all interdependent. And I think, you know, the way projects start is all over the place.

So sometimes you get told, oh, we have to go ask people this question. And then you have to ladder up and say, well, what is it we're trying to learn? And what is it we're going to do about that? And so that helps position research as a more valuable effort. We're not going to just ask somebody a question; we're not in the single-question-asking business. We're trying to tie something to, you know, an effort, an initiative. So prepare all that stuff and don't do it in a vacuum. If there's a business challenge, someone has raised this as a challenge. Someone has brought you and your folks in to, you know, build this product strategy. So figure out who, I mean, stakeholders is a broad term. Sometimes you figure out who it is where they use a RACI model, right? You've got who is responsible, and, I forget what the A is, but the consulted, informed, accountable. You have some structure of, who are the people that we need to talk to about this? So once you get down to writing a discussion guide, maybe the project sponsor doesn't need to review that, or maybe they do. It really depends on the dynamics, but I think the person that owns the research project should be moving forward and progressing and building: here's the research question, here's the research method, we're going to talk to these kinds of people, these companies, these roles in these companies. And what you're trying to do with sharing some of that planning is to get people to object early and say, hey, why aren't we talking to, you know, the oil can company in Amarillo, Texas? They're one of our most loyal customers.

Okay, well, we are going to talk to them, or we're not going to talk to them because they represent data we already have. You want to sort that out now, not at the end, when they say, hey, how come we didn't talk to the oil can company? That's when your results get discarded, because you didn't follow the process that somebody assumed you would follow. So it's very inexpensive to get buy-in on the logistics, assuming you've got buy-in on the effort. Here's a discussion guide. It's in a Google Doc; look at it, leave comments. It's not a script, we're not going to use it literally like this, but here's what we think the topics are. Why aren't we asking about the cost of acquisition or the cost of configuration? Or why are we asking about this? Good. We are going to ask about that, I'm glad you told us that. Or we're not going to ask about that because we have that data, or we're not going to be able to make use of it, or it's outside what this effort is focused on; that's good for a follow-on study, that's not what we're going to include here. So, as much alignment as you can establish up front. Not only is that preparation good for you, the researcher, the person doing it, so that you are doing the research that's needed and wanted, but it's really good for driving the impact of the work, because people are brought into some of the logistics, or they at least are invited to look at it. You know, whether they are going to critique it or not, you at least are being transparent and collaborative in that approach.

Melissa - 00:44:48: Yeah, and I think that's so important for getting buy-in to go out and do the research. I've heard a lot of people complain, and write in to me on the Dear Melissa segment of the show too, about, how do I get buy-in to do user research? Or, I work for some founders and they think they know everything about the customer and they don't want to go do user research. And for me, what I've found makes a difference is how you present that plan and how you show, here's what we know, here's what we need to learn, and this is the way that we go and learn it, instead of just generically saying, hey, we should do user research. What other ways have you found people get buy-in to go do user research when they're in a company and people don't really want to spend time on it?

Steve - 00:45:27: Right. I think one thing is to get at what the objections are, right? Is it about time? Is it about money? Is it "we already know everything"? Sometimes it's fear: what if we're wrong? Your pitch has to be tied to an understanding of what objection you're trying to overcome. So yeah, "we already know everything," I think, is a conversation about risk. And I like what you're describing: here's what we know, here's what we suspect, here's where the greatest risk is. So there are, you know, these various workshops, like a questions workshop or an assumptions workshop. You can do these kinds of things, so you're maybe inviting somebody in to participate in a planning activity. I think research can be seen as a solution looking for a problem. We want to go talk to people. Well, why? Oh, because we're going to solve this. I like your way of flipping it around: here's the initiatives we're embarking on, here's the questions that we have, here's where the greatest risk is. So I really like how you pitch it. I talked to someone whose lens was, what challenge are we facing, and what have you tried in the past, and did that work or not? And if it didn't work, then we still have a question. So we're going to have to get an answer to that question, right? Sometimes, with "we already know everything," I think companies are led by, and this isn't a criticism, they're led by aspiration, right? If you're going to be a founder, you have to believe in a future that doesn't exist yet, and you have to believe that you can make it happen. I think that energy and that confidence is rare and is important. So, you know, positioning what you want to do as a "yes, and" to that. And I think, you know, that comes up in how we report research.

Like, I think, you know, you're talking about buy-in to get to do it. I think one of the ways to get buy-in is not only reframing the request but also reframing what you have to say. So again, here's why we do all this alignment at the beginning about what the opportunity is, what the problem is, what the approach is, what the questions are. I love being able to come back and say, we thought X, or we assumed X, or we hoped X, or we're predicating this approach on X; what we heard is the delta between X and X prime. To be able to say what we learned in the research as a "yes, and" to where we all started. So it's not, guess what? People love chocolate in their hair dye. What is that? We started off with some assumptions about flavors and celebration and ordinary tasks, and we had this scenario and this scenario. And while those are true, the greatest passion is around these kinds of things, or the greatest expectation, or the biggest concern, or the biggest fear is this.

So it means research is building on knowledge and beliefs and hopes and surmises and so on. It's not just decontextualized. So my hope is that when I come back and say, we started off here and now we're here, I'm doing just what happens in the interviews, right? In the interviews themselves, we're kind of building with people, we're going with what they told us before, we're kind of having a conversation together. My hope is that if we do that, when we deliver research, we are creating more buy-in. I worked with a woman who used to say, oh, we just fed them a burrito. I think that was kind of her model: you'd come in with all this stuff and the person has to just eat a whole burrito. It was, I think, a volume-of-food metaphor for the data; it was just presented very quickly and it's sort of too much to absorb. So one way to make that more palatable is to build on where we were before we started this. So you're trying to change the dynamic before you do the research and after you do the research. Your buy-in is a longer game. Changing how you conduct yourself in all those stages starts to shift the dialogue a bit.

Melissa - 00:49:08: I think the best researchers I've worked with are really good at the contextualizing piece at the end, right? They take all of it and they distill it into information that's appropriate for whatever audience they're presenting to. And they're really good at tying it back into, this is what we're trying to learn and this is why we're trying to learn it, and here are the key points that are going to make a difference for what we need to do. Or, we came back with inconclusive evidence about certain things, and that means that we need to double down or go down this path. It's life-changing when I've worked with user researchers who've been able to do that. Just amazing to watch it all come together. And it was a big part of the work that I used to do as a consultant as well. I do it more as a board member now, and we have user researchers in these companies who will go out and do it. I'll help them figure out what we need to learn and what we need to ask, but they'll go do the research. But when we were consulting, I had a user researcher who did that for us, and it was amazing when she'd come back and could boil it down. And then we could talk about how it, you know, fed into the roadmap, and how users thought about the current company, and how that's going to help shape the product strategy that we were thinking about, and things like that. It was so delightful to watch these CEOs go, oh my God, this is amazing, I just got up to speed on what's happening in my company in an hour. And now I know what I can do.

Now I know what to act on. And I think that enlightenment from good user research is just so amazing. Which kind of segues into the last thing I want to talk to you about, which is this: there's been some angst out there about, do we need user researchers? And I've always been in the camp, even though I'm a product manager, of, like, yes. Like I said, I had a user researcher on my team when I was consulting, and she was amazing and did great work. Couldn't have done it without her, right? The value to me was there. And I continue to do it as a board member. Like, we pick up a user researcher in the company, and I work with them pretty closely. I have never felt like we don't need user researchers in companies, but I'm hearing online and in LinkedIn debates that there are companies out there saying that we don't need user researchers, that product managers could do it, UX designers could do it, or this is an unnecessary role. What have you seen shift out there talking to people? Have you felt that sentiment at all? That's what I want to know. Have you seen that at all? And what do you think about product managers versus UX designers versus user researchers going out and talking to customers? Where's the role that we pull that back into?

Steve - 00:51:33: I mean, I've long wondered about user research, the need for user research, and the need for user researchers, right? Like, one is an activity and one is a job title or a role, you know? And that's definitely something that has changed over the years: who does research? Well, I don't know if it's changed. I think maybe people have always been talking to customers, whether or not they had a label for it; now we have a term, and we have books and conferences and courses you can take. So now it's sort of professionalized as a thing. And I really like that Kate Towsie, who's an important person in the research operations world, research ops, came up with this term, people who do research, and even calls it PWDR. And if you're really nerdy, you know which letters are capitalized and which aren't. And just putting that word out there, you know, when you give something a name, it really explains a lot. So this idea that there are researchers and there are people who do research, and that they have different needs in terms of, you know, what they need for support: infrastructure, guidance, mentorship, all that kind of stuff, is very interesting. The other word that gets thrown around is democratization. I don't know that I'm going to be able to define it so that everyone will agree, but I think democratization refers to the move to people who are not researchers, or not on the user research team, doing research, or maybe officially creating support infrastructure for that. Because again, it's been going on: as long as there have been companies with customers, there have been people talking to those customers. So yeah, I may be misrepresenting democratization. But I think there are companies where the belief in and the buy-in for user research, and the buy-in for user researchers, shifts over time.

And I think it shifts based on who's the CEO, depending on size, depending on maturity: what's their product maturity, what's their UX maturity. They hired a new design leader that came from here and believes in this, so they're going to bring in a user research leader. I think the sands shift constantly. And now we have, within tech, just a big economic challenge, right? I saw someone follow up on a user research job that they posted and say that they got 3,000 applications for this user research job. So that means there's 3,000 user researchers who are looking for work. I mean, maybe there's 20,000; I don't have any idea of the size of that. One of my colleagues reflected on some of the heated debate they saw online. They said people are scared. People are under stress because they've been out of work. People have gone through multiple layoffs at this point in the last few years. I think certain groups feel singled out. And so when people are under stress and they feel sort of vulnerable as a group, it's like, is my profession viable? And I'm seeing a lot of think pieces that say, yeah, don't join this profession. And in fact, I'm getting out of this profession because I don't think there's going to be work for me or for all of us.

So I think that creates, like, a heated environment, right? And so when there's another group to kind of point to, I mean, that's sort of the Lord of the Flies, maybe too overblown a metaphor. But you see that we're going after each other a little bit because it's easy to demonize another and say, like, they're the reason that I don't have a job, because this thing changed and now no one will hire me. So I think we're in the thick of it right now. And, you know, we have been for a couple of years. So is this an inflection point? Are we shifting to something different? And what's that going to be? Or, you know, pendulum swing, right? If we're old enough, we've been through different kinds of economic downturns and seen how tech, if we work in tech, responds to that and what they'll spend money on and not. So I don't know. I think I've sort of not answered your question, but I've sort of said, I think there's just all these kinds of forces and factors that are really hot right now and that are pressing on people and leading to, you know, the need to expound and conclude and say, here's how research should be run. Here's how I should do it. I see a lot of job ads for researchers. People tell me those aren't for everyone: they'll hire leaders and junior people, but mid-level people are not able to find work easily. I don't know if that's true. Again, it's all kind of anecdata.

Melissa - 00:55:46: I just learned about the angst, I think, like two weeks ago. And it took me by surprise. You know, I saw a LinkedIn debate screaming at Teresa Torres for ruining research with her continuous discovery book. And to me, I was like, I think Teresa is one of the people who are out there championing research. Like, people didn't want to do research, and she was like, you should do more of it. I don't see where that's coming from. But I don't think that we should be looking at it as product managers versus user researchers. I think there's a place for UX designers lumped in there as well. To me, there have been very specific things where I want to use a researcher. And the way that I've structured teams, and I'd like to hear your opinion on this too, is, of course, for, like, midsize to larger teams. It's like you want somebody who's great at user research and who's deeply knowledgeable about it, right? To help make sure that we're doing good quality research, to go out there and do the type of research I was talking about for the product strategy. Sometimes, like, I don't have time when I'm doing something big, overarching. I'm not going to go pull a product manager off their work to go do this. And this is the type of work that I want somebody to keep on top of throughout the year. Like, we might not be running a research study every day, but we could be running them multiple times per quarter to make sure that it's actually coming out the right way when we start to deploy that product strategy and put it out there. So I want the feedback. I want to know what's going on. I want somebody on top of that. So that's where I see user research being really important. I want somebody there to help guide new product managers.

I want somebody there to teach them how to do good user research, but also to help them get set up, show them, let them shadow them, let them go into that. So to me, there's a lot of those pieces, but I don't see it as, like, the user researchers are the only people who can talk to customers. I also do think we need to let capable product managers out there to talk to customers to learn what they need to learn. Same thing with UX designers. UX designers tend to have a little bit closer of a, you know, mesh with user research in there, but not all of them. Some come from, like, a very specific design background, and they haven't been doing super deep user research. But the same thing. I think you get user researchers in there to make sure that we're doing good quality research and getting good quality feedback no matter where we are. And that's where I've seen them there and help structure that. Make sure that there's, you know, places for research op stuff, places where that lives. Make sure that we democratize it. Make sure that we can learn about it. Make sure that we know our customers deeply. To me, I think that's critically important, but I don't think it's like one replaces the other. And I was so curious as to why. People were hearing that from organizations. And in my opinion, if that's what's happening out there, I don't think the leaders understand the importance of great user research. And they probably haven't done the type of stuff that I'm talking about. Like, I frequently, when I help executives, it's the first time they've done a research study like that, where they've actually had somebody go out there, break down those users and start interviewing them in that way. Like, they just haven't done it before. And I think if we can demonstrate that more with user research, I imagine that the leaders will see how valuable it is. Like, the CEO I'm working with right now. This ICP thing. He's like, this is the most valuable thing we've done in a long time. 
Like, this is great. Like, I love this guy. Love this researcher. He's fantastic. Great. They see it. They experience the benefit themselves. That's where it all starts rolling. But I was shocked when I started to hear about the LinkedIn stuff.

Steve - 00:58:59: Yeah, and I think some of it is, maybe it's boiled over now, but I suspect this has been brewing in different kinds of corners for a while. The notion that researchers can mentor, advise, coach, facilitate, and teach others is one that I think makes sense. I don't disagree with it. I mean, we're here talking about a book that I wrote that helps other people do research. So, of course, I'm on board with that. But the idea, and someone asked me about this the other day. They didn't put it bluntly, but it was like, why is it my job to help other people get better at the skill that I was hired for? I feel like as researchers, people that kind of hold that identity, we believe in asking good questions, learning, gaining insight about customers, bringing it to the organization, helping the company make better decisions. And I don't think we want to, and I'm speaking like "we" as if I own the we, but I don't think we want to gatekeep that. We want to facilitate it being done well. So I think there's a lot of coherence in it. I'm going to help this product manager get better at doing research. I'm going to give them some feedback on their guide or their technique.

Like, that makes natural sense, I think, given what draws people like me to this profession. But it is an interesting imbalance if someone's going to exploit that, where it's my job to not only do research, but also train you to take my job. If that's sort of the deceptive pattern that emerges. And I don't think that's the case, but you can see again, when people are under stress or feeling that there are limited resources that we're all competing for, that it kind of comes out that way. But I really like how you're talking, and I don't know if this is true for you, but I feel like I haven't met an organization where there aren't more questions to be answered than there are resources to answer them. And I think you're kind of getting at, you know, trying to match appropriately the scope and the cadence and the level of the different questions to the resources that you have available. In that case, then, the company becomes, I'm probably misusing this term, like a learning organization. The company is truly human-centered, customer-centered, when it's for all of us to learn about people. And maybe that's a little hand-wavy, a utopian kind of view. But I guess philosophically, that excites me. We're all going to learn about people and answer different questions, so all the decisions we make are good ones. And we're going to sort of figure out who's going to do what. Bandwidth is an important factor for sure, but also kind of right-sizing skill and nuance and complexity to that. And we're all going to help each other get better at that as part of it.

Melissa - 01:01:36: One thing that you did say that I wanted to touch on there was how there aren't a lot of mid-level job openings for researchers, right? But isn't it a mid-level leader's job to train others and level them up into the ability to go out there and perform their task, like to coach them? So if you're thinking about your job as a researcher as, why should I go out and train all these other people to do my job? Is that really a good mid-level manager mentality? That's how I would think about it. If you wanted to be a director of user research, let's say, as a mid-level person out there, first of all, I think you have to understand that as soon as you start moving up the ladder of a company, there are not a lot of positions available, right? It goes down drastically. It's the same in every industry. For product managers, there's a huge problem moving from a senior individual contributor to director. And a lot of people are like, I can't get the job. I can't get the job. There are just fewer jobs available that way. There's also a somewhat different skill set. But it's very hard for product managers, so I hear that a lot, too. From a user research perspective, if I was going to hire a director of user research and have user researchers underneath them, I'd expect that director of user research to set the context of how we do research in the organization. Which also helps with, hey, how do I make sure that if a product manager goes out there and does talk to people, they come back with the right things? Because I know that my team might not be able to do it all. And I do think that's a mentality that every mid-level person should carry with them: how do I enable this organization to do great work? And how do I ensure that the scope of my work and the quality of the things that I oversee are done well, no matter whether it's done by me or done by somebody else?

Steve - 01:03:13: I love it. I'm just going to give it a thumbs up.

Melissa - 01:03:15: Okay. I was like, I don't know if I'm thinking about that wrong or if that's drastically different and people are going to yell at me for it. But to me, I desperately see the need for better user research in companies. So I hope people are going to keep hiring user researchers. I see a great need for them alongside product managers and UX designers. But I also think your new edition of Interviewing Users is going to help a lot of people out there learn how to do better user research, which is great. And hopefully get out there and start talking to customers more. So thank you so much, Steve, for being on the podcast. If people want to go buy your book, where can they buy it?

Steve: I'm going to direct people to the publisher, RosenfeldMedia.com, because you can support small businesses that way. Plus, for listeners and viewers of the podcast, there's a 20% discount code, which is part of the name of the podcast. That code is only good at RosenfeldMedia.com. And Rosenfeld is really good in that, if you buy a print copy, you get a digital copy as well. I don't think other publishers or other places of selling do that. So you get kind of both. So I'm happy to support the publisher, and they've been supportive of me, and I'm inviting people to do that as well.

Melissa: And if people want to learn more about you and your work, where can they go?

Steve - 01:04:28: My website describes my services and has stuff about the book. It's my last name, portigal.com. LinkedIn is where some of the debates we were talking about happen, but I tend to post about what I'm doing and what I'm thinking and where I'm speaking. And when this goes live, I'll post about it on LinkedIn. So follow me there, or people are definitely welcome to connect with me there. I look forward to hearing what people's thoughts are, disagreeing or adding more of their own experiences on some of the topics we've talked about. I'd love to hear from people.

Melissa - 01:04:57: Great. We're going to put all of those links in our show notes at productthinkingpodcast.com. So make sure that you head to the website and go check out Steve's book. You will also get the discount code there, and we'll link to Steve's website as well. So that's productthinkingpodcast.com. Thank you again for listening to the Product Thinking Podcast. We will be back next Wednesday with an amazing guest. And as always, I'm answering your questions on our Dear Melissa segment. So go to dearmelissa.com and drop me a line about what you want to learn about. We'll see you next time.
