Episode 207: Mastering Predictable Product Success with the ODI Strategy with Tony Ulwick
In this episode of the Product Thinking Podcast, I’m excited to welcome Tony Ulwick, the founder and CEO of Strategyn, and the pioneer behind the jobs-to-be-done theory and Outcome-Driven Innovation (ODI). Tony shares insights from his groundbreaking work in redefining product development and innovation. He discusses the importance of separating innovation from development processes and why focusing on customer outcomes rather than features leads to more successful products.
Whether you’re a product manager looking to refine your approach or a leader aiming to align teams around customer needs, this episode is a treasure trove of practical guidance and deep insights into mastering the art of innovation.
Tune in to discover how you can leverage ODI to drive your product strategy towards success.
You’ll hear us talk about:
14:22 - Front-End Innovation vs. Development Process
Tony explains the importance of distinguishing between the innovation process—where the product concept is formed—and the development process, where Agile methods come into play. This separation ensures that only market-winning concepts proceed to development.
19:41 - Understanding Customer Needs
Tony outlines the steps in Outcome-Driven Innovation, focusing on identifying and addressing customer needs to enhance product success. He emphasizes defining markets and needs before devising solutions.
28:06 - Aligning Teams on Customer Needs
Tony shares strategies to encourage organizational alignment on the definition and prioritization of customer needs, ensuring all teams work towards a common goal.
Episode Transcript:
[00:00:00] Melissa Perri: Hello, and welcome to another episode of the Product Thinking Podcast. Joining us today is Tony Ulwick, the founder and CEO of Strategyn and a visionary in the fields of innovation and product development. Known as a pioneer of the Jobs-to-be-Done theory and inventor of Outcome-Driven Innovation (ODI), Tony has transformed how companies understand and meet customer needs.
[00:00:20] His methodologies, which have been widely adopted by leading companies like Microsoft and Johnson & Johnson, aim to take the guesswork out of product development and enable more predictable success. With a career that began at IBM, where he witnessed firsthand the challenges of product failure, Tony has dedicated himself to reframing innovation as a science.
[00:00:39] Today, he'll share insights on turning failures into learning moments, the journey to creating ODI, and his perspective on the future of innovation and customer-centric product strategy. But before we dive into it with Tony, it's time for Dear Melissa. So this is a segment of the show where you can ask me any of your burning product management questions.
[00:00:56] Go to dearmelissa.com and let me know what they are. Here's this week's question.
[00:01:01] Dear Melissa, my question relates to the transition many companies, teams, and individuals face these days: going from output to outcome. However, my organization will still do classic projects, and only a couple of departments will be going full product.
[00:01:16] This seems a bit counterintuitive to me. How can we make the transition and tap into the potential of the product operating model if we know from the start that not everyone will be following along?
[00:01:26] So this is pretty common, where a lot of companies might start with one part of the organization and say, we're going to move this to a product operating model, but other parts of the organization might still be working in project mode.
[00:01:37] This shift is always a learning journey and it usually takes many years, but it's going to be difficult along the way. What you'll probably observe is that within your team, you'll be able to work in a product operating model, meaning you'll be able to talk to customers, talk about metrics, try to figure out what's the right thing to build.
[00:01:56] What's going to be hard is getting leadership aligned. Leadership is usually going to be thinking in a project management style: we do this project and we move on, so why should we fund this for longer? Instead of that, in the product operating model what you need to do is get really good at communicating where your efforts are, what you're trying to prove, why it's the most important thing you could be doing, and how the product will be evolving over time. It's not just one and done; you need to explain why you need to invest in it and actually grow it. What works here is explaining the concept to leadership that when we think about products, we don't want to treat them just like projects, because we'll be spinning up many different products that might solve the same problem.
[00:02:37] Instead, we can lower our software footprint and our cost if we think about our products as expandable pieces of software that solve the needs of many, so that we get leverage from them. So you may want to start there and align on the concept of what a product is, why you're managing it as a product, and why you should see it through over time, right?
[00:02:57] It's not going to go away. It's going to keep solving problems. And you want to make sure they see how that's strategic, how it connects back to the company strategy, and how it will help the company grow. When you're in this kind of mode, a lot of it comes down to communication. Unfortunately, it just takes more work to communicate up, to explain why you work the way you do, to explain, hey, we do discovery processes, we're not going to just dive into this, it's not going to be managed by specific hard dates, and to get away from some of the misconceptions that come with the project mindset. Maybe you talk about how your teams are product teams and not project teams. Maybe you get in front of some of those issues ahead of time to make sure that there's clarity there.
[00:03:38] That's what I would advise, but there will be growing pains along the way. Let's just say that: as an organization, it's going to be messy for a while, but I've seen many organizations go through this transition. Just a couple of teams will start with product. They'll spin it up. They'll try to see what's going on.
[00:03:52] You can still make your team, if it's doing product, a premier team and an example of what success looks like, and that's what I would focus on. How do I show that, by this way of working, we're going to be more successful and we're going to create better products that people love? And then I'm going to tell my success stories to the company.
[00:04:09] So they get really excited about it and more people want to move into a product operating model. The other thing to acknowledge here too, is that not everything needs to be a product, right? So there are some things that are just not productized. If we're talking about software products that you sell to customers, like sure, we want to move those teams into a product operating model, but not everything across an organization needs to be a product.
[00:04:30] So let's be very clear about what is a good candidate and what is not. Like what teams should be operating in product and what teams should not be operating in product. So you want to make sure that you're setting a good example for the rest of the organization so that it will be easier for them to jump on board.
[00:04:47] And then you can tackle the really hard problems like budgeting strategy, deployment, alignment across organizations, and then cross functional working relationships. I hope that helps and good luck on that journey. Now it's time to go talk to Tony.
[00:05:01] Welcome to the podcast, Tony. It's great to have you here.
[00:05:04] Tony Ulwick: Melissa, thanks for the invite. I appreciate it.
[00:05:06] Melissa Perri: I've been a longtime follower of yours, Tony, and of your strategy and journey before you started Strategyn. Tell me a little bit about your career. What led you to want to go into this type of work?
[00:05:18] Tony Ulwick: Yeah, it's a good story because I experienced one of the most humiliating product failures of the decade back then. So I was working for IBM at the time. I was working in their PC division and working on a product that was called the PC Junior. It was supposed to change the way people did home computing.
[00:05:40] It was supposed to beat Apple. It was a huge endeavor. IBM invested about a billion dollars in the project. And instead of it being successful, the very day after it was introduced, the headline in the Wall Street Journal read, the PCjr is a flop. And of course, we didn't believe it. But they were right.
[00:06:03] It was a flop. It took about a year for us to come to grips with that reality, and it cost IBM a tremendous amount of money. And this was my first experience with a product. I thought a product, especially coming from a company like IBM, of course was going to be successful. But as I started studying this, I wondered, how did IBM fail so miserably?
[00:06:29] It's not just IBM that fails miserably. I realized very quickly that many companies have big failures like that. And the funny thing is, it's not just back then in the 1980s. It's still today. The same thing is still happening. So this has been an issue for years, and it got me very interested in innovation as a process.
[00:06:50] I really wondered, how did they know that this was a flop the very day after we introduced it? They were using some criteria to judge the value of the product that we didn't know about, right? Otherwise, we would have built the product to meet those criteria and would have been successful.
[00:07:08] And it got me thinking about, is it possible to know in advance how people are going to measure the value of your product?
[00:07:16] And I didn't have the answer for that, but that was what I set out to do. And I spent my last five or six years at IBM trying various methods of innovation practice. Back then, QFD and the House of Quality and voice of the customer and conjoint analysis and a few things like that were coming together.
[00:07:35] But I realized quickly there was no real process for innovation, and I set out to create one.
[00:07:42] Melissa Perri: So when you were looking back at all of these things at IBM and how people did innovation, what did you do to come up with what you call Outcome-Driven Innovation, that process?
[00:07:54] Tony Ulwick: As I was looking at different ways to innovate, I realized there was no process there. But it occurred to me that Levitt gave us a nice hint. He said, people don't want the quarter-inch drill, they want the quarter-inch hole. And then the engineer in me took over and said, hey, I have an idea. If we study the process of creating a quarter-inch hole, we could treat it like we do a manufacturing line: you study the manufacturing process, you try to figure out what you have to measure and control to produce a predictable output. And we automate things, and we do statistical process control to remove variation from the process, and we do Six Sigma to eliminate defects from the process. And that was my career, my early career, at IBM.
[00:08:42] And I thought, if we study the job of creating the quarter-inch hole as a process, we can apply all that same thinking and figure out how to get the job done better, faster, and more predictably, with higher output and throughput.
[00:08:54] And so that was the beginning of it. And I thought, will people be able to tell us how they measure
[00:08:59] success? And turns out, they do. And we've worked on that language over the years. We call them the customer's desired outcomes.
[00:09:08] We all use them. For example, as you're
[00:09:10] preparing a meal you think about things like, maybe you overcooked the meal, and you'd say, I don't want to do that, right?
[00:09:17] I'd really like to minimize the likelihood of overcooking the meal. Or, God, this is cooking unevenly. No, I'd like to minimize the likelihood of cooking the food unevenly. Minimize the time it takes to prep the food. Minimize the likelihood of creating the wrong portions. There'd be a whole bunch of metrics that you'd consider to try to prepare the meal perfectly.
[00:09:38] So the thought was we can go capture those inputs,
[00:09:42] We can figure out what the needs are in a given market, if you will, and go from there to figure out which needs are unmet and to what degree, whether there are segments of people with different unmet needs, and build the strategy around creating solutions that get the job done better.
[00:09:58] So I thought of it in a flash, and I didn't know if it was going to work. So the next step was to go try it. And it turns out it works quite nicely, and we've been perfecting it ever since.
[00:10:11] Melissa Perri: What did you do in early days to test it out and see how it was working in organizations?
[00:10:17] Tony Ulwick: I practiced at IBM. They were great to work with. I worked like an internal consultant and did projects around the globe. Actually, I spent a month and a half in Australia on assignment applying the approach to a business unit there, and also in Japan, France, and the US itself. So I had about two years of experience just getting it ready.
[00:10:40] And then I decided in 1991 to start my own business and started advertising, getting some clients set up. I left IBM and started the practice, and the very first project that I executed on was with Cordis Corporation. They were on their last legs in terms of their angioplasty balloon product line.
[00:11:05] They had 1 percent market share. And they needed a win. We tried it out. And one thing that was very encouraging when I was in there interviewing with them to take on this work, there were some nurses in the room next door.
[00:11:18] And they said, would you come just apply this process next door? Just show us how this works.
[00:11:23] Go talk to those nurses and get these inputs that you're talking about. And I said, okay what products are we talking about? And they said, The sheath introducer. Now, I didn't know what a sheath introducer was but it didn't stop me. I went in and started asking them about what are they trying to achieve with this sheath introducer?
[00:11:42] And they told me, and then we started getting into discussion about what are they trying to achieve in the very first step of using it?
[00:11:48] What are they trying to avoid? And I went down this path and I started writing down these outcome statements. And at the end of the session, we spent maybe 10 minutes doing this.
[00:11:56] I collected maybe 10 or 15 outcome statements
[00:12:00] and they said, you should hire that guy, he knows your market better than you do. So I got the assignment, and the project turned out amazingly well. They introduced 19 new angioplasty balloon products a year and a half later.
[00:12:16] They were all number one or two in the market.
[00:12:18] Their market share went to over 20%.
[00:12:21] They also discovered a huge unmet need in the market using the process,
[00:12:26] which was minimize the likelihood of restenosis, which is the recurrence of the blockage. It was off the chart. It was a very important, highly underserved outcome.
[00:12:36] And they said we have a product that could probably address that. We're working on it in the lab with 30 other things and we said you should triple down your resources and be first to market with that product,
[00:12:49] which they did, and they succeeded. That product was the heart stent, and it became a billion-dollar business in less than two years. And their stock went from 8 to 108. It was just a great all-around story, very encouraging. So I could see the process working and how to make it work. Then the question was, can we make it work in all industries, or more industries other than just the medical space? And I spent the first 10 years, from 1991 until about 2001, pretty much testing this in different companies as a lone practitioner.
[00:13:29] And then in 2002, I was published in the Harvard Business Review with that case study, which featured the Cordis story, and from then on I got way too busy and had to start a company and learn how to run and manage a consulting firm.
[00:13:46] Melissa Perri: It's always funny how that ends up. It's, oh, I'm too busy, now I've got to be busier trying to run a company too. But it's a testament to the fact that this works and that it's helping people. So when you talk about Outcome-Driven Innovation, what I really love about it too is how you contrast it with other types of processes, like Agile development processes and design thinking. What have you observed, coming into companies and helping them, about how misconceptions around when these different processes are needed hold them back, and where do you see ODI fitting in?
[00:14:22] Tony Ulwick: I think the key factor that contributes to confusion is the overlap of the innovation process and the development process. I call the innovation process the front end of innovation, which is really everything before development, and the output of that process should be a product concept that, with a high degree of certainty, is going to win in the market.
[00:14:42] Before you start developing it, that's the ideal goal, right? So you shouldn't start developing the PCjr and watch it fail at the end. You should know right up front whether it's going to win or lose, and if it's going to lose, don't do it. Separating them out is important. Most companies think of them as one and the same. They think the innovation process starts with the idea and stops at product launch. I'm just separating the front end of innovation, which is coming up with a product concept, from developing the product. If you want to develop the product, you want to make sure it's the product that's going to win in the market. That's what ODI does on the front end. Then Agile kicks in, in development. So use Agile techniques to develop. And the way I like thinking about it is, ODI makes Agile more agile, because you're not iterating on what the product does while you're creating it. You're only iterating on the product design, so that it will perform as nicely as possible.
[00:15:38] So separating those two out makes great sense. I view ODI plus Agile as the great combination for the front end of innovation and development. Then there are other techniques, like Lean Startup, for example, that ask you to hypothesize the market, the customer needs, and the solution all at once. That's intertwining everything, right?
[00:16:03] You're trying to solve a very complex equation. And what we do instead is we first define what market we want to go after. Then we define the unmet needs in the market. Then we come up with the solutions that best address those unmet needs. So we solve the equation by really setting up constants in the equation. The market becomes a constant, then the needs become a constant, and then you can come up with the solutions that address them. It's like solving a simultaneous algebraic equation, right? You solve for the constants in the equation. You can't solve for all the variables at once. We discount the Lean Startup methodology because it causes you to iterate incessantly and you may never get out of that loop.
[00:16:50] Then there are tools like design thinking, but design thinking really isn't a process. It is a set of tools that can be used throughout the development process. That's how it was initiated. Could you also use some of those tools for the innovation process? Yes, you can. And some people have confused the two again and said design thinking is all about innovation and development. But really, it's a great set of tools for the development process; it's not a great set of tools for the innovation process. It doesn't produce a product concept that you know will win in the market before development begins.
[00:17:29] But ODI does. That's the difference.
[00:17:31] Melissa Perri: What I really like about ODI too is what you just said, that we start with that market piece. Where I see a lot of people mess up when it comes to innovation, especially people in startups, is they go into markets, or they're trying to solve a problem in an area, that they don't know a lot about, right?
[00:17:47] And I've seen a lot of, even when I was teaching at HBS, I saw a lot of students be like, I'm going to go build something to do X in this market. And I was like: Do you know anything about that market? And they're like: No, but it's something I want to go after. There's a problem I want to solve there. And they're not really looking at the landscape, or what they should be tackling, or whether they really want to tackle that market, without identifying it. And I think that's really interesting about ODI, that you start there. I've seen that common mistake. Why do you suggest starting at the market for most companies or for most startups?
[00:18:20] Tony Ulwick: Yeah, we always start at the market because we need to know who our customer is that we're trying to create value for. That's the first question: who are we trying to create value for? And then the second question is, what job are they trying to get done? So it could be an interventional cardiologist trying to restore blood flow to an artery. We need to know who we're targeting and what job they're trying to get done. And notice I'm saying it like that, too. I'm not saying what job the product gets done. I'm targeting a group of people, and they're trying to get a job done. What we see is most product concepts, especially in startups, get a tiny piece of the actual customer's job done.
[00:19:02] You're not going to hit a home run. You're a feature on a platform. And you're never going to take over the world by pursuing just a feature on a platform. You want to figure out what is the entire job they're trying to get done and work over the years to create an end to end solution that gets the entire job done. And unless you know what that entire job is, it's going to be very hard to figure it out. You'll eventually get there and markets evolve in that way naturally over time anyway through guesswork. But if you know the end game right up front, you know the entire job, you can lay out a plan to get the entire job done in the most efficient manner, which is what we propose.
[00:19:41] And that's the most efficient path to growth.
[00:19:44] Melissa Perri: So in the ODI model, in the process you were talking about, you mentioned a little bit about how you don't do the actual building, let's say. Yeah. And that's more for the development process, the iteration process, when we look at Lean Startup as well. What are the steps to ODI, and what are the types of activities that people should be doing during those steps?
[00:20:04] Tony Ulwick: Yeah, that's a great question. So the first step is to define the market: the group of people you're trying to create value for and the job they're trying to get done. Now, the way we do that is we talk to the customer. That's why it's important to know the group of people you're trying to create value for, because you've got to get out and talk to them and get from them their view of what they're trying to accomplish.
[00:20:23] So that becomes your market definition. The second step is to understand the customer's needs. Now, we're going to define needs as the metrics people use to measure success when getting the job done: minimize the likelihood of overcooking the food, or minimize the likelihood of cooking it unevenly.
[00:20:39] There are typically anywhere from 75 to 125 of these metrics for any given market. And we capture those by, again, talking to customers, job executors as we call them. They have experience executing the job, right? In the case of interventional cardiologists trying to restore blood flow in an artery, they've tried to do that hundreds of times. So as you're asking them about these need statements, they can tell you: in this stage, I need to do this, then I need to do that, I've got to watch out and avoid that, I've got to eliminate that defect. And they can go through and tell you all the metrics they're using to measure success. We train people to think like this and to capture inputs in that manner.
[00:21:30] The outcome statements have a very specific structure and syntax and format. And we've learned over the years that they need to have that format in order to be useful statements from the beginning of the process to the end. The inputs that come from the customer run through the organization. They have to inform sales, marketing, development, R&D: one set of statements that everyone can agree to.
[00:21:58] And in most companies, and we poll all the time, we ask this question: is there agreement in your organization as to what a customer need is? And 90 percent of product teams say no. We don't even agree on what a need is, never mind what the needs are or which needs are unmet. We can't even agree on what a need is.
[00:22:17] We also polled 12 different experts in voice of the customer years back, and we got 12 different definitions of what a need is. So this has been an issue for years. Somebody has to decide what a need is, right? So we look at it like, what should a need be? It should be an instruction, an instruction that comes from a customer, that tells you how to get the job done faster, more predictably, or without defects, which is the goal of getting the job done better. So once we lay all this out, we can come up with this particular format and structure and be sure that we're getting inputs that are going to lead to success. So that's the second step. Very important step, obviously. The third step is we want to figure out which needs are unmet and to what degree. There may be 10 of those 100 or so needs that are really important and poorly satisfied today, and we want to discover which of those needs they are, so we do quantitative research. We'll put a survey together that will go out to hundreds of people, and we ask them to tell us the importance of these outcomes
[00:23:28] the last time they were getting the job done, and how satisfied they were with their ability to achieve that outcome using whatever solution they were using. And we ask them what solution that is, too, so we know the answer to that question. So with that information, we can start doing our data analysis to figure out: are there any needs that are unmet across the entire market?
[00:23:50] Are there needs that are unique to segments? The fourth step in the process, as part of that analysis, is what we call outcome-based segmentation. We want to know, are there segments of people with different unmet needs? And in all the studies that we've done over the years, there are always segments of people with different unmet needs.
[00:24:10] In other words, people don't agree on which needs are unmet. The best way to discover those segments is not by segmenting around demographics or psychographics, but segmenting around the unmet needs. Now, most companies can't do that because they don't even agree on what a need is, or what the needs are, or which needs are unmet. But we've fixed all that, and now we can discover segments of people with different unmet needs.
[00:24:35] So for example, when we helped Bosch enter the North American circular saw market, they were trying to compete with DeWalt and Makita. They wanted to come up with a solution, a premium brand, that would get the job done better at the same price point.
[00:24:50] To make that happen, we had to find opportunities to get the job done better. When we looked at the broad market, everybody in total, it looked like there were no opportunities. But when we segmented the market, we found a third of the population that had 14 unmet needs. They were more the finish carpenters; they had to make more angle cuts and blade height adjustments, and they had 14 unmet needs that nobody else had.
[00:25:17] So that became their target, and they came up with a solution that satisfied those 14 unmet needs. And that was their best-selling circular saw in North America for about 10 years or so. That's why the segmentation aspect is so critical, because if you build for the average, you're targeting nobody, usually ineffectively.
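To make the quantitative side of steps three and four concrete, here is a minimal sketch of how importance and satisfaction ratings like the ones Tony describes could be turned into a ranked list of unmet outcomes and outcome-based segments. The opportunity formula shown comes from Ulwick's published ODI material rather than from this episode, and the simulated data, the use of k-means from scikit-learn, and all names are illustrative assumptions, not Strategyn's actual tooling.

```python
# Illustrative sketch only: score and segment desired outcomes from survey data.
# The opportunity formula (importance + max(importance - satisfaction, 0)) is from
# Ulwick's published ODI material, not this episode; data and tooling are made up.
import numpy as np
from sklearn.cluster import KMeans

outcomes = [
    "Minimize the time it takes to prep the food",
    "Minimize the likelihood of overcooking the food",
    "Minimize the likelihood of cooking the food unevenly",
    "Minimize the likelihood of creating the wrong portions",
]

# Simulated 1-5 survey ratings: rows are respondents, columns are outcomes.
rng = np.random.default_rng(0)
importance_ratings = rng.integers(1, 6, size=(200, len(outcomes)))
satisfaction_ratings = rng.integers(1, 6, size=(200, len(outcomes)))

def top_box(ratings: np.ndarray) -> np.ndarray:
    """Share of respondents rating 4 or 5, scaled 0-10 (a common ODI convention)."""
    return (ratings >= 4).mean(axis=0) * 10

importance = top_box(importance_ratings)
satisfaction = top_box(satisfaction_ratings)

# Rank outcomes by opportunity: important and poorly satisfied floats to the top.
opportunity = importance + np.maximum(importance - satisfaction, 0)
for score, name in sorted(zip(opportunity, outcomes), reverse=True):
    print(f"{score:5.1f}  {name}")

# Outcome-based segmentation: cluster respondents on their satisfaction ratings
# (k-means is just one convenient choice) to surface groups with different unmet
# needs, like the finish carpenters in the Bosch example.
segments = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(satisfaction_ratings)
for seg in range(3):
    seg_satisfaction = top_box(satisfaction_ratings[segments == seg])
    seg_opportunity = importance + np.maximum(importance - seg_satisfaction, 0)
    print(f"Segment {seg}: biggest opportunity -> {outcomes[int(np.argmax(seg_opportunity))]}")
```

In a real ODI study the inputs would come from the hundreds of respondents Tony mentions rather than simulated ratings, and the segmentation would be validated against the study's full set of outcomes.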
[00:25:41] Then, the last step is to use all the information to come up with a solution, the product concept, which we do through ideation. So in the case of Bosch again, when we presented those 14 unmet needs to the engineers, it took them three hours to come up with solutions to address all of them. And as they said, it's not as if we hadn't had these ideas before.
[00:26:03] The problem is we've had all these ideas before, and more. We've had thousands of ideas. We just didn't know that these were the 14 that were really going to create the most value for the customer. So it's all part of getting teams on the right path and the same path. That's a way of thinking about it.
[00:26:21] They have to head in the right direction, so everyone's creating value for the customer in the most efficient manner. And everyone has to believe it. These are the two key elements of our approach that have to come together for a company to be successful: they've got to be heading in the right direction, and everyone has to be paddling in the same direction. Achieving both of those is the magic of the innovation process right there. If you can do that effectively, you're going to be successful.
[00:26:53] Melissa Perri: When you think about these companies that are not agreeing on what a need is or what it means to them, what do you do to try to encourage them to align on what those needs are? Like, how do you start that conversation or go through that work?
[00:27:06] Tony Ulwick: Yeah, you know, I love that question, because we've just learned recently that the best way to do that is to ask them how they define needs. And we do individual exercises. We may do workshops. It's fun. Everyone writes down how they think about a need. Is it a specification, a requirement, a pain, a gain, an exciter, a delighter, a value driver? I could go on. There are 30 terms that we hear people use to state what a need is. And then we go on to say, write a need statement, and then compare. Everyone compares what they put up there, and they see that there is disagreement.
[00:27:49] That's great for proving to them that they don't agree on this basic thing. You're all here to come up with solutions that address unmet needs, and we can't even agree on what an unmet need is, or what a need is. So that's the first step, and then offering the solution, of course, is the key.
[00:28:06] Why is this a need? Because people buy products to get a job done and if we can help them get it done better, they'll buy our product.
[00:28:13] Let's tie needs to statements that show how to get the job done better. It's that simple, right? The logic's there, and we can get the team to follow that path.
[00:28:27] And the funny thing is, they don't have to know how to do ODI and how to talk to customers. They just have to know that once they get that information, that's the information they should use to drive their product decisions.
[00:28:41] Melissa Perri: When it comes to needs and outcomes, I feel like everybody in every organization now is saying, we're an outcome-driven organization, we've got to be concentrating on outcomes. And I feel a lot of organizations aren't doing it well, or they're getting lost about what an outcome really means. How do you tie needs back to outcomes?
[00:28:59] Like, how do you make sure that you are actually writing good outcomes, focusing on the right outcomes and tying that back to what those unmet needs are?
[00:29:08] Tony Ulwick: See, I don't like to use the word needs; if I could eliminate the word, I would. Outcomes are the needs. People buy products to get a job done; that's the market. They go through steps along the way; that's what we call the job map. And then they use these metrics to judge success in getting the job done; those are their outcomes. You could argue that all of those are needs at some level. I don't even like to make that argument, because it doesn't matter. The point is, just focus on the outcomes tied to getting the job done better, and that's going to help you create products that will get the job done better. Vocabulary here is so important. We put a glossary of terms together that we like to use that shows how all these pieces fit together. If you stay in that mindset, you can define adjacent markets differently, you can define segments differently, and they all make sense through that lens. And once you get that thinking in your head, that mindset shift, then it's really hard to see it any other way.
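As a rough sketch of the vocabulary Tony just walked through, the relationship between a market (a job executor plus a job to be done), the steps of the job map, and the outcomes attached to each step could be modeled with a few simple data structures. The field names and the toy cooking job map below are illustrative assumptions built from examples in the episode, not Strategyn's glossary or tooling.

```python
# Illustrative data model only: a market is a group of job executors trying to get
# a job done; the job map breaks the job into steps; each step carries the desired
# outcome statements used to measure success. Names and steps are assumptions.
from dataclasses import dataclass, field

@dataclass
class JobStep:
    name: str
    outcomes: list[str] = field(default_factory=list)  # desired-outcome statements

@dataclass
class Market:
    job_executor: str                                    # who we create value for
    job_to_be_done: str                                  # the job they're trying to get done
    job_map: list[JobStep] = field(default_factory=list)

# A toy job map built from the meal-preparation outcomes mentioned in the episode.
meal_prep = Market(
    job_executor="home cook",
    job_to_be_done="prepare a meal",
    job_map=[
        JobStep("prep the ingredients",
                ["Minimize the time it takes to prep the food"]),
        JobStep("cook the food",
                ["Minimize the likelihood of overcooking the food",
                 "Minimize the likelihood of cooking the food unevenly"]),
        JobStep("plate the meal",
                ["Minimize the likelihood of creating the wrong portions"]),
    ],
)

all_outcomes = [o for step in meal_prep.job_map for o in step.outcomes]
print(f"{len(all_outcomes)} outcomes captured for '{meal_prep.job_to_be_done}'")
```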
[00:30:18] Melissa Perri: You talked a little bit, too, about how you have a very specific way of writing outcome statements. What have you found is required in a good outcome statement to make sure it's pointing in the right direction?
[00:30:30] Tony Ulwick: Yeah. So it contains four pieces or elements. One is a direction of improvement, which is always minimize. So it's going to be either minimize the time it takes to do something, or minimize the likelihood of some bad thing happening. That brings in the second part: it's a metric, either time or likelihood.
[00:30:49] Now, years ago we used many other metrics, but we've realized that these two really are the most useful in terms of describing what customers are trying to achieve, because they're either trying to get something done faster, more predictably, or without defects. Faster is tied to time; more predictably and without defects are tied to the likelihood of getting something done.
[00:31:13] That's you doing something wrong when executing the job, or the product doing something unpredictable when you're using the product. And that's it. So now we're going to talk about minimize the time it takes to do something in some context. The third element is the object of control: what are we trying to control in the process? Minimize the likelihood of overcooking the food; we're going to try to control the overcooking aspect. And then the last part is a contextual clarifier, if it's needed. So you can talk about what context you're referring to. An interventional cardiologist may want to say, I want to make my way through a tortuous vessel, or minimize the likelihood of impacting the side vessel
[00:32:03] when I'm making my way through a tortuous path, or something like that. You're supplying some context to the statement so you know the situation in which they're struggling with that particular outcome. So those are the four pieces. We tested this back in the early 2000s with Microsoft. We worked with them on about 45 projects, and in most of the surveys we would use different variations of statements to see what happened. So we tested them, and we found that the structure that we use today is the structure that gives us the best insights, gives us the best discrimination, and causes the least fatigue, all the key things that you look for when you're trying to create a good survey. We often say the process has been battle tested. It really has. We've worked with lots of very smart people from the best companies on earth over the years who've helped us refine the process to make it even more effective.
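Here is a minimal sketch of the four-element outcome statement syntax Tony describes: a direction of improvement (always minimize), a metric (time or likelihood), an object of control, and an optional contextual clarifier. The class, its field names, and the rendering logic are an illustrative reading of that structure, not Strategyn's official template.

```python
# Illustrative sketch of the four-part outcome statement syntax from the episode:
# direction ("minimize"), metric (time or likelihood), object of control, and an
# optional contextual clarifier. The details of this class are assumptions.
from dataclasses import dataclass

@dataclass
class OutcomeStatement:
    metric: str                  # "time" or "likelihood"
    object_of_control: str       # what the customer is trying to control
    context: str = ""            # optional clarifier describing the situation
    direction: str = "minimize"  # per the episode, the direction is always minimize

    def __post_init__(self) -> None:
        if self.metric not in ("time", "likelihood"):
            raise ValueError("metric must be 'time' or 'likelihood'")

    def render(self) -> str:
        if self.metric == "time":
            core = f"{self.direction} the time it takes to {self.object_of_control}"
        else:
            core = f"{self.direction} the likelihood of {self.object_of_control}"
        return f"{core} {self.context}".strip()

print(OutcomeStatement("likelihood", "overcooking the food").render())
print(OutcomeStatement("likelihood", "impacting the side vessel",
                       "when making my way through a tortuous path").render())
# minimize the likelihood of overcooking the food
# minimize the likelihood of impacting the side vessel when making my way through a tortuous path
```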
[00:33:05] Melissa Perri: When you're looking at ODI, for companies that maybe are not engaging with you but are trying to learn ODI and bring it in, what are some common mistakes that you see them make or misconceptions they have about it?
[00:33:16] Tony Ulwick: I'd say the biggest misconception is they think it's going to be easy. It sounds easy, but in practice it's not that easy, and you need different people with different skill sets to execute the process. You need someone who can manage an entire project and even know how to frame it: who is our customer, and how do we aim ODI at a market? Just thinking through that can be very challenging. We need someone who can collect all the outcomes from customers; you need good qualitative research skills to make that happen. We need people to build surveys, collect data, run segmentation analyses, and so on, so we need good quantitative researchers. And then you need people who can pull the story together, read the data, understand what it means, and turn it into a strategy. So it's not as if any one person in the organization can do all those things.
[00:34:16] So it's about having a team of people that can get it done, with divided responsibilities, and just being prepared for what it takes to get good at it. You can read the book and you can see how, logically, it makes sense, but there are just a lot of little nuances in making it happen, so you go through the learning curve. So that's probably the biggest issue, I'd say.
[00:34:39] Melissa Perri: When you're thinking about who should be doing ODI in organizations, I see this a lot in Agile transformations, where the leaders go, oh, this is the responsibility of the team, I don't need to know this, just go do it, and then I'll set these goals up here. Who should be involved in the ODI process from a leveling standpoint?
[00:34:59] We just went through roles, but are you like a VP of product overseeing this or like a C suite executive, or is this for like directors of product or?
[00:35:07] Tony Ulwick: Yeah, you should have someone oversee a team of people. So it could be a business unit director who oversees a number of businesses. I see some companies create a corporate team of ODI practitioners who get assigned to different business units. I've seen it done both ways effectively. But what you shouldn't do is have all your product managers become experts at doing ODI. That's not right, because they're not necessarily strategists and qualitative researchers and quantitative researchers and data analysts. Asking them to do that is really too much to ask. So that's why we say, set up a team. It could be under the head of the business unit, or it could be a corporate team that gets used across business units. But think of it like that, as opposed to teaching the product team, the product manager, or even all the people on the product team. They don't have to know how to do ODI.
[00:36:05] They should know what an outcome is. They should know what to do with those need statements once they see them at the end, because they're going to use them in their workflows to make the same decisions they always make. But now they're going to be focused on the customer's outcomes, not some irrelevant or misleading piece of information. So I think those are some of the important intricacies that come into play.
[00:36:30] Melissa Perri: Yeah, to me, a lot of what you're talking about feels like doing good research that informs our strategy of where to go, right, through this process,
[00:36:38] and then narrowing it down. And I've seen in organizations, especially at the director level, people who run a team of product managers, right, who are working on more innovative types of products or net new things. I've seen them run effective ODI processes, but they've had to bring in other people, like you're talking about, onto the team. Like they needed, we'd call them a product ops analyst type person, to help with the data. They need the user researchers to go out and do that.
[00:37:04] And they're helping to oversee it and run it. Or even the VPs building a little team underneath them to help figure out how to inform the strategy in product management. I see a real lack, though, of the work that you're talking about, right? Like actually going out, doing the sifting, trying to point people in the right direction before we set the objective targets that we want to hit and then just dive into the product work.
[00:37:27] Tony Ulwick: And this is what companies hire us to do, because it is a unique skill set. It's specialized. We've been through a lot of iterations of the process. What we've spent a lot of time on more recently is not making the process better, because it's pretty good, but making it easier to install in a company, if you think of it like that. And with the advent of AI, it's interesting to see the possibilities of, for example, synthetic outcome-driven customers that could be created using the data I just described. The front end could just be an interface where you could talk to this customer, who has all this information about the market based on the ODI data that we feed it, plus other information that would be part of it. So there are a lot of fun possibilities here that we're thinking about working on to help get the job done even better.
[00:38:26] Melissa Perri: That's really cool. So I like this kind of trend of looking at AI in a lot of the innovation space these days. I've heard some pushback from people talking about how AI is going to get rid of the need to do some of the stuff that we have been doing around these processes, testing, experimentation; we've talked about how the cost to build things is lower and it's faster. So people are saying, oh, you don't really need to test products, maybe you don't need to do as much research, you just build something, throw it out there, and see if it works. With all the work that you're doing, do you think that the progress we're making in software development, with the smarter AI features, the faster it is to build things, and the lower cost to build, is taking away some of the work that we need to do there? Or do you think it's only making it more possible to do more of it?
[00:39:14] Tony Ulwick: The inefficient way to develop things is to guess and test and iterate. That's not efficient. If you can do it fast, is it cost effective? It could be in software. Not all companies just develop software, though, and even software companies need services and sometimes more than that. So it's not a good across-the-board philosophy to have. You're still guessing. The way I think about this is, what are the chances of you randomly coming up with a solution that addresses the top 15 unmet needs in the market if you don't know what they are? It's zero. It's just not going to happen. But what are the chances of you creating that solution if you know exactly what those 15 unmet needs are, in priority order? And the answer is about 86 percent, because now you're just relying on your team to use their creativity to come up with solutions that address the needs that you know exist.
[00:40:11] And if you can do that, you're going to get the job done better and succeed in the market. So my question would be, why wouldn't you just do it that way? You're going to win. Companies in the medical space that have long product life cycles already know this, and a lot of technology firms know this as well. It's in the software space where they just think it's easier to guess and throw it out there and see what happens and get a response. And like I said, it may be cost efficient, but you're still wasting your time. You can still do it better by having the needs-first approach, or outcome-driven approach, as opposed to an ideas-first approach.
[00:40:49] Melissa Perri: When you are looking at the advancement of technology, we talked a little bit about the AI stuff, but what are you excited about in the field of innovation and how do you think it's going to change?
[00:40:58] Tony Ulwick: It's changing rapidly. So a few megatrends, if you will. You talk about the ability to go capture outcomes, and people don't want to capture outcomes. They don't want to do the quantitative research. What they really want is a way to query customers to get answers to questions like, should we go copy this feature that a competitor just put out, because it's so valuable to customers and without it we're going to lose market share? I would love to know the answer to that question. Now, using ODI data, you can answer that question. So we just need to learn how to take that kind of command and turn it into the query that produces the answer. Those are some of the things that we're taking a look at, and I think that's going to bring more predictability to the process. But behind that query and that database, there has to be good data, right? You still have to be following some good practices. So I think that ODI is really well suited for an
[00:42:09] AI application, because it's built with a bunch of rules. It's a rules-based discipline, and so it's a natural fit for an AI application.
[00:42:20] Melissa Perri: I'm excited to see that evolve over time. Tony, thank you so much for being on the podcast with us. If people want to learn more about you and Strategyn, where can they go?
[00:42:28] Tony Ulwick: You can head right to our website at strategyn.com or you could email me if you'd like at ulwick@strategyn.com.
[00:42:35] Melissa Perri: Awesome. And we will put those links in our show notes at the productthinkingpodcast.com. Thank you so much for listening to the Product Thinking Podcast. We'll be back next Wednesday with another amazing guest. And in the meantime, if you have any questions for me, go to dearmelissa.com and let me know what they are. We'll see you next time.