Episode 66: Answering Questions About User Research

In this Dear Melissa segment, Melissa answers subscribers’ questions about when research is stale and needs to be revalidated, when and why to schedule frequent customer interviews, and understanding product success when conducting product trials.


Have a product question for Melissa? Submit one here and Melissa may answer it in a future episode.

Subscribe via your favorite platform:


Q: Do you have any tips for identifying when previous validation of knowledge could be stale and might need to be revalidated before continuing work? [2:22]

A: Let's talk about what you can actually do to see if you need to do further research. Before starting any type of user research… [You should be] identifying assumptions about your customers, technology, products, markets - all the things that will make this product successful. You need to store those assumptions with your user research. If you haven't looked back at what you believed to be true two years ago versus what's true now, going through all those assumptions is where you should start… When you're completing validation work, if you didn't validate it, you should be tossing it… You shouldn't be doing research just for the sake of research. [3:48]


Q: What’s your advice for booking customer interviews? [9:17]

A: You should be talking to your customers frequently, but I don't know if you have to do it every week… they might not want to sign up for these calls, especially if you're going after the same type of customer every time… Whenever you go into a customer interview, it should be with the intent to learn about their problems and how they're using the products. Typically, it's good to go in with 'I'm exploring problems in this area' and being open to the aha moments and the unknowns that surface during these calls. I would say that the reason companies get weird about doing customer research is because a lot of people do research for research's sake and not with the intent to learn something specific that can help them… Continuous discovery, for me, is constantly keeping in touch with your customers to understand emerging problems, but that doesn't mean you have to do generative research all the time. [9:43]

Q: Do you have any tips beyond the usual customer surveys and interviews for understanding product success? [16:32]

A: You need to deeply understand your customers before you build your products, so make sure you're doing the right research upfront… Don't wait for product trials to get your first moment of feedback - build your understanding of the user first. [Now for] testing your product… cultivate a cohort of users to get qualitative feedback and then try to quantify it. The best way to set up your trials is to make sure that you have people opted in and willing to try it, who have been vetted as the right persona for this thing. [16:54]

Resources

Melissa Perri on LinkedIn | Twitter

MelissaPerri.com

Transcript:

Melissa:
Hello, and welcome to another episode of Dear Melissa. Today we're talking all about user research, and I've got three great questions for you. But before we dive in there, I wanna remind you that I'm answering your questions, right? I'm waiting to hear what you guys wanna learn about. So if you have any questions for me, they can be career related, they can be product and process related, they can be strategy related, they can be situation related - like, hey, I'm in a weird situation here at work, what would you do? I'll take 'em all, right? And we'll anonymize it. We'll take all the facts and I'll tell you what I would do. This is the way that you can get my feedback and hear what I would do in your particular situation. This is the only way I'm answering questions like this.

So please go to dearmelissa.com and submit all of your questions. I'm super excited to hear what you want to learn about and what you're dealing with. It helps keep me in the loop and lets me know how things are changing too. So with that, I'm gonna dive into our first question here. Dear Melissa, in a recent episode with Adrian Howard, you talked about product people validating with customers too late - like when development has already started - and then being reluctant to throw away work, even if it seems like the right choice. But I have the opposite problem. I have a new feature that my dev team is only just starting to work on, but the research and UI/UX validation was conducted two years ago. There's a high risk that the validation work is no longer valid, especially given that the way people work has changed so much over the past two years with COVID. It's not uncommon to complete some research and validation work but then temporarily shelve a project before it's started. It could be that other priorities come up, or it's just not the right time to develop the feature for the business and the customers, or some other reason. Pandemics aside, hopefully we don't have a culture-shifting event like that again any time soon. Do you have any tips for identifying when previous validation or knowledge could be stale and might need to be revalidated before continuing work? Yes, I do. And we are going to answer this question right after this message.

All right, we're back, and we're talking about how to know when your research is stale. I've got a couple questions for you to think about as well, but let's start with: how do you know research is stale? I'd say two years is probably stale, but maybe you just waited six months, maybe it was two months. Either way, time has passed. Things have happened to the customers, things have happened to your business, things have happened to your product. So let's talk about what we can actually do to see if we need to do further research. Before you start any type of user research, you should be really looking at: what are your assumptions for building this new feature? So we're identifying assumptions about our customers, our technology, our products, our markets - all these things that we think are gonna make this product successful.

You need to store all those assumptions with your user research. And if you didn't, try to go back and think about what you believed to be true two years ago versus what it is now, going through all those assumptions. That's where you start to look at: okay, what has changed? Are we assuming no new products have come out to actually solve these problems we were looking at two years ago? How are customers solving them now? Did it change? Do they still have these problems? Did they go away? Did something else happen? These are things that we have to really dig into, and that's where the new user research comes up. Now, technically, if you did really good user research and you identified your assumptions early on, it shouldn't be as much research to refresh this, but you're still gonna need to do it anyway.

A good way to think about when to revisit your research, or whether it's still fresh: what were the market conditions? What were the customer conditions? How have they changed? So let's pretend there was no pandemic over the past two years - did competitors come into the market? Is the market still the same as it was before? Did something shift outside that may have caused something different with our customers? Those are the types of things you need to be looking at, and they will help identify if your research is stale or not. Two years is a very long time, though. But I have a second question for you as well. Like you said, it's not uncommon to complete some research and validation work and then temporarily shelve it. Ideally, when you're completing validation work, if you didn't validate it, you should be tossing it.

So I wanna make sure we are actually doing that, right? Like we're not just completing the research and the work and then being like, yeah, it didn't really pan out, let's shelve it and look at it later. We need to say no to some of these things. So I wanna make sure that the research and validation work is being done correctly. Now, this shouldn't be happening all the time either, right? You shouldn't be doing research just for the sake of doing research. So I'm very curious about how priorities and strategy are being set at your company. Ideally, we go out there and do the validation, we do the research, we go out and we ask for budget, and then we start building. And from the time we do that to the time we start testing or validating our work, that could be several months, for sure.

And maybe we start building soon after that, but you shouldn't be just out there exploring all these things for the sake of exploration. It should be with intent, aligned to strategy - which makes me think that maybe your leaders are spinning up these projects for you to go out there and do all this validation work, but they won't align them back to the strategy, which is why you shelve them. And I've actually seen that happen a lot, where it's like, hey, we're gonna spin up these little teams over here, they're gonna do some lean validation techniques, they're gonna do these ways of working. Sometimes they're called innovation teams; sometimes they're not, they're just part of your team. But the work that they were working on was never aligned back to the strategy, and because of that, those things don't come to fruition.

So this is actually a sign to me that there may be bigger issues at play inside your organization. It's not just stale user research; it's more of a: hey, are we actually all swimming in the same direction? Are we aligned in our strategy? Is it deployed well? These are the types of things that would make me wanna go evaluate and look into, and I'd suggest you do that too, if you're finding that you're frequently picking up stuff that was conducted two years ago, or doing a lot of user research that just doesn't go anywhere. It's one thing to do user research and say there is no problem for us to solve, or this is not the right problem to solve, or we're too early. Okay, fine. That can happen a lot, and it should happen, but it should be like: we're never gonna build this, or we're a little too early for that.

And if we're a little too early, there are usually not that many of those things. So that's why I'm asking: how much volume do you actually have of research and validation work where you're just shelving it and coming back years later? That, to me, is a sign that your strategy is not well deployed or not well aligned, and we might just have teams keeping busy instead of working on strategic top priorities. So I'd look at that too. You wanna make sure all of your research is up to date, and we wanna be following through when we do validate things to build them quickly so that we have those opportunities. So I'd look at how you choose what work teams go after, right? Like, how do we figure out when to do research? How do we figure out when to do validation?

And is it aligned to a big strategic objective? If it is aligned to a big strategic objective at the time, and they change their strategic objective at the top of the company - again, I've seen this a million times - that's also a bad sign. That means that the direction is not well set. It means that there's probably a lot of waffling at the leadership level as well. So you might not have strategy set at the right levels either; it might be too low level at the top. Those are all things I've seen actually happen a lot inside companies. So I would say this is an inkling to me to really go back and check your strategy as well. Don't just look at it as, how do we refresh user research, but also: why are we shelving all this stuff? Why are we shelving things and coming back to them two years later? That's a little strange. So I really encourage you to dig into that maybe a little bit deeper too.

All right, second question. Dear Melissa, my question is regarding continuous discovery and customer interviews. I'm struggling to book video calls with customers to do discovery work. Ideally, I wanna have at least two to three customer interviews per week. Please note that this is not usability testing; this is mainly opportunity discovery discussions. I think good product managers should talk to their customers every week, right? What's your advice for booking customer interviews? All customers are busy and have no time for such calls. All right, so this one's interesting. You do wanna be talking to your customers frequently, but I don't know if you have to actually talk to them every week. Maybe somebody has said that in the past. I think you should be in touch with your customers pretty frequently, but if you are hounding them every week, they might not wanna sign up for these calls, especially if you're going after the same type of customer every single time, or the same group of customers.

This is where they'll get a little waffly and they don't wanna sign up for stuff. Now, why are you talking to two to three customers per week? Right now you're talking about opportunity discovery discussions. You need intent whenever you go into a customer interview, right? It should be an intent to learn about their problem, an intent to learn about how they're using the products. Typically, it's good to go in with something like, I'm exploring problems in this area, or I'm exploring these types of things, and being open to the aha moments that do come with the unknown unknowns that will surface during these calls as well. But I would say the reason that companies get weird about doing customer research is because a lot of people will go do research for research's sake instead of research with the intent to learn about something specific that can help you.

So I don't know if you should be doing two to three customer interviews per week just for the sake of it. Maybe you can learn the things that you wanna learn in different ways. Can you tag along on sales calls? Can you go through all of the notes from sales calls to see what problems are emerging? Can you talk to some people about that? Does this replace customer interviews? Absolutely not. But I'm trying to figure out how you get out of having to book two to three interviews every single week without an intent to learn something. So think about other ways: can you sit in on support calls? Can you answer some calls? Can you do these types of things? That puts you in the position of hearing from people who actually have a problem and are coming to your company to talk about it, or of understanding your product better - all of those different things.

Now, when you are ready to start doing discovery around a problem - discovery to see if that problem exists - that's where you do wanna line up all of these customer interviews. But you wanna be very intentional about what you're trying to get out of them and what you're trying to learn. And that's probably why they're being wishy-washy and saying they don't have time for such calls. If you come to them instead and say, hey, we've been getting a lot of feedback from our customers that this is a problem - is this a problem for you? And if so, can we have a conversation about it? They're probably gonna be more likely to book that time with you, because they want you to fix it. Like, yeah, I'm actually experiencing some of these things; I understand what you're trying to learn. But if you're just saying, hey, I wanna get on a call with you,

they're gonna be like, why? What's the point of this? What's the objective? So remember what continuous discovery means to me, too. And I know Teresa has a lot of stuff on this, so you should go read whatever she thinks about it. But continuous discovery, for me, is constantly keeping in touch with your customers to understand what their emerging problems are. That doesn't mean that you just have to do generative research all the time, right? Generative research is great when you're looking for opportunities to actually solve. But I have a question of why you're doing that every single week. Why are you doing generative research every single week to identify these opportunities? It's okay to do these things periodically, I believe. But there are also tons of ways to learn about what problems you could potentially be solving for your customer.

Again - and I don't wanna super harp on strategy - this also makes me think about what you're aligned to. What's your big product initiative that you guys are going after? If you have to go do generative research every week to figure out what to solve, is there a good sense of what the problems are that the company should be going after? Is there more direction coming down from the leadership team about what they wanna concentrate on and go after? Those are the types of things I would look at as well if I was in your shoes: how do we make sure that we are all moving in the same direction? But you might have that more defined, and that might come through better if I were to talk to you than it does in this question.

But those are questions that I would have about your company and what you're doing. So when you are booking customer interviews - and you do wanna do them; I'm not saying don't do them, I'm just giving you some ways to diversify your inputs of data, let's put it that way - how do you get them to actually make time for the calls? You gotta speak to their needs. What are they gonna get out of it? We used to do a lot of recruiting for customer interviews, and you have to basically send out emails to a ton of people. You have to either establish relationships with customers or spray and pray to get them to book. That's just the nature of it. A lot of them will say they're busy, so you're just gonna be doing this quite a bit.

Do you have any user research teams that you can get help from? Craft an email that's really compelling. Make the email convey that when they come to this customer interview, they'll be giving tons of feedback to your company - these are the things that you use to actually build the product. That type of language usually goes over pretty well in your research requests. So I would try to make sure that it's worded not just as, hey, can you help me out for a minute, but more around what's in it for them, long term and short term. Make sure that you're really speaking their language; that usually gets people to convert a little bit better. Another thing: if you're in B2B - I don't know if you are - and you have any salespeople or account people, can you go talk to them and see if any of their customers are having problems right now that you could talk to them about?

If you have a good relationship with the sales and account team, that works well. I know a lot of people think the sales and account teams are out to get them, but in a very healthy product organization, we all work really well together. You may also wanna go ask customer support: hey, did anybody reach out recently about these types of problems? They can filter through their support tickets, see if they're there, and then you can contact those people too, to follow up more. That's a great way to get somebody to commit to a customer interview. If you're looking through their problems and you follow up on their problem, they're gonna wanna be on the phone with you talking about what happened, and then you can dive into other conversations from there as well. So I hope that helps. The message here is: really make sure there's some intent for research, right?

The word intent is important - intentional research. We are going out to discover problems; we're not just talking for the sake of talking. Once a week for talking to customers could be great for product managers, but there are a lot of ways to learn about what your customers need. And I just wanna make sure, whatever you're doing here, when you say opportunity discovery discussions: did you narrow it down to something that you wanna explore? Are you even targeting the right persona? Or are you just going in with, I wanna talk to people for the sake of talking to people? Come up with more creative ways than just doing customer interviews, too. Also, doing webinars around topics, stuff like that - that's a great way to get people to show up. You can get your customers to show up, teach 'em something, talk to them, see what questions they have around topics. That's not a bad way to think about it either.

All right, last question. I'm a product manager stepping into a new role on our innovation team. In my new role, I'll be looking at supporting product trials for new products. Gaining quality customer insights through trials will be key. Do you have any tips beyond the usual customer surveys and interviews for understanding product success? So, I've got a couple pieces of advice here, especially if you're on an innovation team. One: you need to deeply understand your customers before you build your product. So make sure that you are doing the right research up front. You're getting good research. You're really making sure you deeply understand the customer through ethnographic research before you build your product.

So don't wait for product trials to get your first moment of feedback and your first understanding of the user - do that first. All right, now we're gonna talk about actually testing your product. When you go into product trials, you're gonna wanna cultivate a cohort of users to get qualitative feedback, and then you're gonna try to quantify it. But qualitative feedback usually rules when you're not at scale, which is the zero-to-one new product stuff. So you wanna make sure you're optimizing for qualitative feedback as much as possible. The best way to set up your trials is to make sure that you have people opted in and willing to try it - people who have that problem, who have been vetted, who are the right persona for this thing - before you even start building it. Make sure that people are there, signed up, and committed to using this and giving you feedback before you start building as well.

A lot of this stuff actually starts before you start trialing it. How are you gonna measure success, right? What is the behavior change of users that you would measure? Were they doing X, and you wanna see Y happen afterwards? That's something definitely to write down and start to think about how to quantify. How will their lives change? What feedback do you need that tells you that this is a success for them, and that it's helping them achieve things that they weren't able to achieve before? You're gonna wanna write down things you can observe objectively about them using your product, but also things that they say. And you wanna go into those trials listening for those things and making sure that they surface, but also having an open mind that, hey, the first thing you put out there is probably not gonna be amazing - how do we tweak it?

How do we go back until people are like, yeah, this is great? So definitely be open-minded, be willing to change and tweak things if you go in there and it's not really landing. As for customer surveys - I honestly wouldn't even do a customer survey in a product trial. You're gonna have so few people using it; you're gonna do a lot more qualitative interviews. But I'm gonna wanna watch somebody using this new product too, right? It's not just about interviewing them, like, hey, did you like it? It's, show me how you use it. Can you shadow some of these people? Can they walk you through it on Zoom? Can you ask them about what's different in their lives? These are the types of things that are really gonna show you whether the product is successful or not. And then maybe you wanna have a sales pitch once it gets a little closer to being a success.

When you're feeling really good about it, you can say, would you wanna sign up for this now, and we'll give you first access when we build it? Those types of questions will really show if people are willing to commit, and that's what you wanna look at for product success: how willing are they to put something on the line that says they're gonna commit to using it or buying it when it gets there? How sad would they be if you took it away? Another great question to ask. And if people would be extremely sad if this thing never launched, that's a great way to actually measure product success too. So you're really gonna wanna think about building that great cohort of people, getting the trial users in first, watching them as much as possible, and then really measuring behavior changes.

That's the big thing: make sure that you're looking at what they're doing differently with your product, how that's contributing to the value they want from it, and what you need to tweak in order to make it more valuable. That's what I would really look at for running your trials. Also, make sure that you have the right people in the trial - really important. Right persona, right people, people who are willing to commit to it and give you feedback. Make all that known up front before you even start; that's gonna save you a lot of headaches in the long run as well. All right, that's it for Dear Melissa this week. If you have any questions for me, again, a reminder: please go to dearmelissa.com and let me know what you're thinking. Next week we'll be back with another Product Thinking episode and a guest on Wednesday. We'll see you then.
