Episode 44: Testing Your Ideas with David Bland

David Bland is the founder of Precoil, a company that helps organizations find product-market fit through assessing risk and experimentation, and the co-author of Testing Business Ideas. David joins Melissa Perri on this week’s Product Thinking Podcast to talk about how to identify your assumptions, experimenting within slower feedback cycles, the importance of aligned confidence, and how product leaders have to continuously walk the walk when it comes to experimentation and de-risking.


Here are some key points you’ll hear Melissa and David talk about in this episode:

  • David talks about his professional background and how he first got started in the field of business testing. [1:49]

  • David’s framework that uses themes from design thinking to define risk and identify assumptions. Experiment in the areas where there is the least amount of evidence. [3:32]

  • Many product teams put too much emphasis on feasibility, but they also need to focus on desirability. Talk to customers to figure out if they want the product itself; if they do, figure out cost and revenue. [4:46]

  • David advises product managers to start with the business model and understand it; that will inform the plan for how the business is going to make money and how the product is going to impact their business. [6:44]

  • "What are the leading indicators that would predict that someone's going to renew? You should be able to start thinking through what are these touchpoints that would lead to somebody renewing, and how do we remove the friction from that?” David tells Melissa. [8:28]

  • The biggest hurdle to experimentation is time. If you don't have time, you are going to take the easy route. The goal is not to run experiments. The goal is to de-risk what you're working on to make better investment decisions. [13:11]

  • If a company has a check-the-box mentality, it's not in the right condition to learn experimentation. You need to think about how you're de-risking, and change your mindset and approach to processes within your organization. David talks about the way he's designed his training programs to help companies with this problem. [16:55]

  • Repetition is key as a product leader. Don't stop talking about the way you want your teams to work because you think they no longer need to hear it. "It's part of your job as leaders to keep repeating this, and showing it, and enabling it and creating a culture and environment where people can work this way," David says. [19:38]

  • David talks about experimenting around product strategy from a higher level, what types of experiments he's seen at that level and what experiments he advises product leaders to run. [20:38]

  • One of the main problems with experimentation is that companies often fall into the realm of testing on their customers as opposed to testing with their customers. It should be about co-creation instead. [32:36]

  • If you focus on customer value, you don't always have to have a finished product. It can be a service. Once you're fulfilling a need for that customer, solving a problem that's valuable to them, or performing a service, you can start charging for that service. [35:30]

  • David talks about companies that have been doing experimentation well. [38:00]

Resources

David Bland | LinkedIn | Twitter

Precoil

Transcript:

Melissa:
Hello and welcome to another episode of the Product Thinking Podcast. Today I'm joined by a very special guest, my friend David Bland, who wrote the book Testing Business Ideas. Welcome, David.
David:
Thank you, Melissa. Thank you for having me.
Melissa:
So David, can you tell our audience a little bit about what you do?
David:
Yeah. I pretty much help people test their business ideas. It's super fun. It's usually with startups and big companies going from idea to product-market fit, and in that messiness, I help them look at their risk and design experiments. So it's a lot of workshops and coaching and helping people test their ideas against reality.
Melissa:
Cool. And how'd you get started with that? What made you go specifically into the testing approach?
David:
It was a journey. I started off in startups for about 10 or 11 years, and we did a lot of testing back then, but it wasn't very rigorous. I learned a lot of things the hard way. I spent nights and weekends building things that nobody cared about, and I thought, there has to be a better way. So when I started learning about Lean Startup and more about design thinking, I thought, oh, we should bring that more upstream and see if we can help solve real problems, so I'm not wasting my time building things nobody wants. I moved out to the San Francisco Bay Area and started helping companies do that, and I realized, wow, there's something here, helping people early on. And it just stuck with me. I never could have dreamed early in my career that I'd be doing this, and I've been doing it for the last 10 years. So it's been really enjoyable.
Melissa:
What I really love about your book, too, is that it goes through so many different ways to test business ideas, especially your assumption mapping technique, where you get into identifying your risky assumptions. I even use that with my students to outline part of a case at HBS, and they were like, oh, I never really thought about it that way. Can you walk us through a little bit about how you think about identifying assumptions? How do you really define the risk around them, and what does that mean for testing?
David:
Yeah, it sort of came out of a need. I was working with a large travel company in San Francisco, and I got them really excited about experimentation, and unfortunately they would just experiment on everything, just to see what happens. When I was talking to the VP of product there, she was like, ooh, I'm not super excited that they're just running experiments. We need to really de-risk their new ideas. And so I came about this two-by-two, working with Jeff and Lean UX, and eventually customized it quite a bit, because that's what I do, and ended up pulling in a lot of these themes from design thinking: desirability, viability, and feasibility. And I thought, if we can capture all three of those when talking about risk, we should get a balanced team in the room and be able to narrow in on what are the most important things where we have the least amount of evidence, and if we're going to experiment, let's experiment there. So it's a focusing mechanism. I just use it for a conversation; it's more of a structured conversation. It's less about the map itself and more about, can you focus your team on: what's the stuff that, if proven wrong, makes the entire thing fall apart? I use that as the focusing mechanism, and then we run our experiments from there.
Melissa:
Yeah. And you identify the assumptions in three different categories: viability, feasibility, and desirability. When you work with teams to go over those three and really identify assumptions, is there any one in particular you find is harder for teams to come up with assumptions about, or something they neglect to think through?
David:
It's usually viability. I've also noticed teams really focus too much on feasibility, and pretty much you can build anything at this point. Now, granted, there's some regulatory and governance concern, you know, when I'm working in the healthcare space or financial services, just because you can build it doesn't mean you'll be allowed to do it. So that is a point. But I feel like that's what they're used to. They feel comfortable there, it's their domain, they can just work through: can we do it? But you also need to talk to customers, which is more desirability, and then you have to figure out the cost and the revenue and whether or not it's something viable for the business. And I feel like so often people go, well, that's not my problem. I just build it.

It's someone else's problem to figure out the business model. But more and more often, that needs to be a part of your problem. You need to be thinking about it. Even when I work with R&D labs in Silicon Valley, they're now being asked to figure out some of the business model before they just throw it over the wall. And so I think with viability, it's just really hard to think about how we test pricing, or how we test and keep the costs low enough. I find that viability is one of these things teams really struggle with, how to do just basic price testing and cost analysis.
Melissa:
Yeah. It's really interesting, too, because I feel like maybe we swung the pendulum a little too far. Back in the day, when product managers wanted to build things, I feel like we went crazy building these business cases that really focused on the viability of stuff, can we make money, but not necessarily the feasibility, like can we build it at all, or the desirability. But now, since it's pretty easy to build things and we all focus so much on an agile world, sometimes we leave out the entire business case piece. I've found a lot of product managers don't think about how the business is going to make money or how their product is going to impact their business. They can't draw that line from, hey, if we release this, this is what the impact will be. Do you find that with your teams?
David:
I do. And sometimes it's hard to trace the thread. I think a couple of things. One, we sort of vilified business cases and business plans, and to an extent I agree, there's somewhat of theater there. But I think if you start with the model and you understand the model, then that can inform a plan. So I don't think they're completely useless; I just think you have to understand the model first. And I think the second challenge is, when I'm working with, let's say, companies like Adobe in Silicon Valley: if we launch a mobile app that's free, well, that's great, but you're basically trying to drive Creative Cloud subscriptions. So you have to trace that thread to say, did this person use this app? Okay, did they become a subscriber? And that's hard to do in large companies, tracing that thread.

And I think a similar challenge in other corporations is: okay, if we do something for free, if we don't charge for it, how do we measure whether it actually moves the needle? That does take working with other departments and other groups and tracing that user across your company. And I think sometimes that just seems so overwhelming to people that they don't really take on the challenge. So with viability, we do have to spend more time on not just the business plan or the use case, but also how you measure that outcome, and how you trace it across your company to say, okay, we really did move the needle.
Melissa:
Yeah. One of the things I hear from people, too, and I've experienced this myself: some of the outcomes we want to measure are going to take a really long time. Say we're experimenting to increase retention or something like that; maybe that's the hypothesis we're testing inside of a bigger company, but retention is measured year over year, because like a Creative Cloud subscription, sometimes it's monthly, but sometimes people are on yearly subscriptions. What's your advice for people when they're breaking down experiments, especially in larger companies where the feedback cycles aren't super quick? They want to measure these outcomes, but it's going to take a really long time, and they want to do rapid experimentation. How do you work with them to get the information they need?
David:
Yeah, it's challenging. I look for leading indicators. So when I work with retail stores and they have an annual subscription, you can do things like checking in with your customers: are they getting value out of the store in the meantime? You can track their cart, how they're exploring the store and getting value. Seasonality matters too, right? If you only have a seasonal business, let's say you have a Halloween store and it only comes up at Halloween, how would you measure that from year to year? So I tend to look for leading indicators: what are the leading indicators that would predict that someone's going to renew? And I think that's always a fascinating conversation, because you should be able to start thinking through, okay, what are these touchpoints that would lead to somebody renewing, and how do we remove the friction from that?
It's similar when I do workshops and people say, well, we have to make a hundred million dollars or whatever in five years, and we're just going to work for five years until we hit that goal. It's more: okay, what can we do in a month that would give us a leading indicator that we're on the way to hitting that goal? So I really look for leading indicators: what are the experiments we can generate to get directional evidence that we could possibly do that thing that's really big?
Melissa:
Yeah. I get a lot of pushback on that sometimes from these, like, crazy statistics people who go, well, is that correlation or causation? How do we know that leading indicator is going to directly result in retention at the end of the day? And I'm like, yeah, you should hopefully have some data that points that way, that the more people engage with the product and the more they come back to use it, the more likely they are to retain. But I feel like everybody's looking for certainty around some of those things, and it just doesn't exist. You can be a little bit more confident than you were before, and that's the whole point.
David:
Confidence is one of these things I've been thinking about a lot. Our friend Tendayi Viki, who's an associate partner over at Strategyzer — the last time we did an in-person session together, he was teaching Testing Business Ideas in London, and I was able to hang out with him and Alex Osterwalder, and we were having this conversation about confidence. It was really about, you know, how confident should you be? And I think what we're learning is that a lot of the techniques we teach are much more closely aligned to social science than anything else. So if you look at it through the lens of the social sciences, you start to understand: how confident should we be that this is the right thing to build, or that we should scale this? And I think where we find challenges with teams is this misaligned confidence.


You have some people who interviewed five customers and they're super confident you should build and scale the whole thing. What I try to do is walk them back and say, look, that's rather light evidence. It's qualitative, it's light, it's directional. It's great, we can use the quotes from those five customers, but that's not a sure thing. There are all these steps you can take between that and building the whole thing, which I try to lay out in my book, to give you more confidence about whether or not you're on the right track. So I've been talking a lot more about confidence as something you have to be aligned on with your team, and calibrated, because just because you talked to five customers doesn't mean you should go build the whole app and launch it. And so I do think we should talk more about confidence and how we look at things, and I do think we should pull from the social sciences, because there's a lot of great work there and inspiration we can use, and I think that's closer to what we're trying to do.
Melissa:
Agreed. And I think, too, when we start thinking about things as confidence levels, it also makes it a little less daunting to experiment, right? Especially in larger companies, and you tell me your experience here, but I see a lot of companies afraid to experiment. They want to experiment, but they don't, or they do all of these experiments that are so low risk or beside the point, because they're easy to do. They're like, oh, I can run the test over here, I did an experiment, check the box, rather than really identifying what the risky assumption is. And I think it's because they don't think about the confidence level, so they run the wrong experiments. They're not thinking about it as: how do I get a little bit more sure that we're going in the right direction, or reduce the risk in that way? When you work with large companies, what do you find are the biggest hurdles to getting them to do really good experimentation around the riskiest assumption?
David:
I love that. I ask this question in every masterclass I do — these big public masterclasses with Alex, my co-author — and it's always the same answer. It's always time. Time is number one. We have a list of things: is it IP, is it branding, is it legal? It's always time. And so I still think we're kind of stuck in, I hate to say the industrial era — I'd like to think this pandemic is, if you think of it in a good light, accelerating us out of the industrial era, where we're distributing work and thinking about new ways of working. But it comes back to time. If you don't have time, you're going to do the easy thing. But what I try to stress in the book and in my work is that the point is not to run experiments.

I've given you a list, and there's plenty more than that, but that's not the goal. The goal isn't to run experiments. The goal is to de-risk what you're working on to make better investment decisions. When you look at it in that light, that fun landing page thing you can do, or the ad you can run, or that little feature you can spin up, has to be tied back to some sort of hypothesis that helps you pay down the risk in the thing you're working on. And I don't think we quite get that yet. I feel like there's energy around experimentation — it's fun, I can do it — but if you're not paying down the risk, it's kind of pointless. So just keep that top of mind: it's not about the experiments, it's about the risk, paying down the risk and making better bets.

So I really try to keep that language when I'm talking to people inside corporations. I try not to say, let's build a repeatable process for testing our hypotheses and running rapid experiments — they just tune me out. I try to speak in the language of: how do we de-risk this thing we're working on? How do we make good investment decisions? And even though I'll use all these tools and techniques I talk about and helped create, I'm really trying to keep the eye on the prize, which is: we don't want to build stuff that nobody cares about and waste time, money, and people's energy doing so. Can we place smaller bets along the way? I really changed my language over the last couple of years. I kind of learned that the hard way in dealing with more board-level and C-level folks; it just takes talking in words they understand and care about. As soon as I throw a canvas in front of them, they feel like I'm more excited about the canvas than about helping them out. That's not the case, but that's how they interpret it. So I really try to watch my language and the words I use when I'm dealing with folks in these companies.
Melissa:
Yeah, I agree. That was a lesson I learned as well, especially connecting things back to the outcomes they want to achieve from a financial perspective and drawing the line back to experimentation that way; I find that incredibly helpful. One thing I've noticed, though, with larger companies — and I want to hear how you've successfully implemented this; this happened to me quite a few years ago, I want to say like seven years ago — when I see larger companies try to teach experimentation internally or get their teams to be more experimental, I find it doesn't stick. With one company, I can tell you why it didn't stick, but I want to hear how you've found it sticking, how you make it stick. So this one company tried to pull some teams out of normal work and turn them into what they called a lean team,

and teach them experimentation. I spent like six months coaching them, taught them how to do experimentation. But I found the things they were experimenting around were like meetup stuff from the company that had no bearing on anything whatsoever. Right? They were just, oh, here's some random stuff, go learn experimentation. And then when it came down to actually executing on the experiments they ran and following them through, they were like, oh no, you're done now — you're a lean team, move on to the next experiment — instead of turning it into a product. And this whole team got incredibly demoralized. They were like, well, we want to work this way, and we want to build great products, but you gave us junk to work on, basically. And they all dispersed back into the organization and were really upset that there was no follow-through.

And I was like, man, I just spent six months teaching these people how to do this thing, but there's no follow-through, no seeing it through. So I feel like they fell into what you were saying: experimentation for the sake of experimenting — check, we ran a bunch of experiments, we did it. What do you find is the right setup and the right conditions for a company that really does want to learn experimentation? Not just train a bunch of people on it, but actually wants to be experimental. What do they need to do to make sure it sticks?
David:
It's a big challenge. I won't call out companies by name, but the companies I've worked with where it didn't stick — and these were very high-profile, billion-dollar companies in the past couple of years — took a check-the-box mentality to it. So: I ran experiments, check, what else do I need to do to just launch the thing I already want to launch? If that's your mindset, then no amount of training is really going to help you. You really need to talk about: how are we de-risking things? How are we changing our mindset on how we approach things? But if it's just another box they have to check to get to the thing they want to build, then it's not going to stick.

It's going to be theater. So I've been really trying to work on real ideas. I don't talk about this publicly a lot, but if you don't hear from me for a while, it's because I'm behind the scenes working on real businesses and real ideas. What I've tried to do over the course of, I'd say, the last five to seven years is, every time I do a workshop, it's on real opportunities the teams are trying to figure out. And that's worked out really well for me, because I can take something they're worried about, something with high uncertainty, and take the people who know about that thing or are working on it, and say, look, here are some new tools for you to think about this a different way, and we're going to practice them on your real opportunity. The way I've designed all my stuff — and it's really been a learning experience for me — is I'll introduce a concept, I'll introduce a fun case study that's really short, and then I'm like, okay, now we're using it on your real stuff. That dual track of, yeah, I'm learning a new tool, but I'm also learning it on a real opportunity, has really helped make it stick. So I don't really do training anymore where a team is pulled out and only works on new stuff, and I don't really do training for just single functions anymore. It's almost always a cross-functional team working on a real opportunity, and I feel that's helping it stick.

Now, you still have to have leadership evangelize it, say that it matters, and actually show through action that it matters. And that takes time. There are some really prominent companies I've worked with in Silicon Valley who were the poster child of this way of working, and they stopped talking about it, and everybody in the company stopped doing it. They were really surprised. The quote they said to me, which I'll never forget, was: it was in our bloodstream, but it wasn't in our DNA. Which meant it was something that was happening, right, but it wasn't really a part of how they worked. As soon as the leadership stopped talking about it, people just stopped doing it. They went back to working the way they did before, and the company had to restart the entire program again.

So for those of you listening: if you feel like you're being too repetitive, or you think, well, people get this now, I'm just going to move on — don't stop. Keep repeating it. It's part of your job as leaders to keep repeating this, and showing it, and enabling it, and creating a culture and an environment where people can work this way. It never stops. If you stop, the teams will stop working this way and go back to what's really comfortable to them. That's something I've learned over time, watching this ebb and flow of companies over the last 10 years or so of my career: when leaders stop talking about this, teams just revert back to how they were doing it before.
Melissa:
Yeah. Oh my God, I love that quote: in our bloodstream, but not our DNA. That's amazing. And it's true — it's not embedded. I also tell leaders, when it comes to strategy and things like that: if you don't feel like you've repeated it 40,000 times, it did not sink into your team. Right? You can't just say, oh, here it is, and then walk away. You have to be so repetitive as a leader to make sure it's really sinking in and people are doing it. So when it comes to leaders, I get this question all the time.

How can we experiment around our product strategy from a higher level? Let's say we've got this whole company moving forward, teams are already out there, and we're not a giant corporation, so you can't really just pull out teams. How do we test things like: is going up market the right move, is going down market the right move over here? What types of experiments do you see happening at that level, and what would you advise for a product leader in that capacity? What types of experiments do they run?
David:
Yeah, a couple of ways. I've seen this theme in the companies I work with, where they're trying to take what they're already domain experts in, where they have a lot of expertise, and reimagine it in a different way. So when I'm working with paint companies — I work with one of the biggest paint companies in the world — they're trying to become a color company. They don't want to be known for paint anymore; they want to be known for their color expertise. When I work with automotive companies, they want to be known as mobility companies now; they don't want to be known as car companies. And when you start looking at it through that lens, and you have leadership buying into that, saying, with our strategy, we want to use our expertise but apply it in different ways and monetize it in different ways,

then what I tend to do is say, okay, let's model out what that could look like as a business model, let's look at our risk, and let's run the smallest experiments around what would have to be true for that to even begin to work. I've done that at the C-level, and it works surprisingly well, because I'm not asking them to do something completely out of their domain; I'm asking them to leverage what they're already good at, but in a different way. With car companies, you can look at the commercials and see what people are doing. Some of the car companies I work with that are now mobility companies are investing in AI startups, urban mobility startups, and robotics, to feel out what's next. What is next for mobility? It's not just about putting someone in a car and selling a car anymore. It's about getting somebody from A to B, or even helping somebody become more mobile in their home. When I work with leaders like that, it's just really refreshing, and what I try to do is say: how do we test our way through that strategy? Because even though you're an expert in it, it's not a sure thing to just go execute on that strategy and now you're making all this money in quite a different space. So I use the same framework for them.
Melissa:
Cool, I like that. And I find it's the same type of concept as de-risking a feature: you're de-risking your strategy to go into that area. I feel like we need to be experimenting at every single one of these levels as product people. But sometimes — and I get this argument a lot more from startup founders, I don't know why — it's like, you can't experiment around truly life-changing ideas and stuff like that. They always throw out the Steve Jobs iPhone thing: you know, you can't experiment around that. I'm sure you have a standard way to answer that question. What would you say to those people who are like, we can't just test it with users, because they can't even conceive it — we're the all-knowing, omnipotent people over here?
David:
Yeah, you were laughing when you asked that question, and I know exactly why. I just had this recently. I work with a lot of accelerators, not just in Silicon Valley but all around the world now, virtually. And I just had one where, in the first 30 seconds of the workshop, I knew this one founder was not coachable. You can tell pretty quickly: they explain everything away, they have all the answers, they often quote Steve Jobs at you, or Elon Musk — now Elon's the new Steve Jobs. And with these themes in Silicon Valley, I get even more of that. I had a group of 20 blockchain startups come through, and everything was, well, we can't test that because it's on the blockchain. Or AI or machine learning is another great one.

We can't test that because it's machine learning. But what really happens with most of the machine learning startups I chat with is that they're just doing spreadsheets behind the scenes. There's no AI or machine learning; they're totally Wizard of Oz experimenting with it, trying to drive value for customers. I've met startups that can't even hire a data scientist, because no one knows who they are and nobody wants to work for them, so they have to do it through spreadsheets. So be careful even when people quote Steve Jobs. If you look back, I think people misconstrue a lot of what he said. I don't work with Apple, but I have friends there, and over time I've known some people who worked with him. When you think about it, it was more like: when you talk to customers, you're right, it's not their job to tell you how to solve the problem, but you have to get to the need behind what they're saying, the job they're trying to do behind the request. It's your role to figure out how to solve for that; it's not their job to tell you how. And if you look back at the first iPhone, wow, we just rewrite history all the time. Look at the first iPhone, the hardware and the software, and how that was pretty much a big experiment for Apple, right? It didn't have copy and paste, or a lot of the MMS stuff you would think would be standard, that a BlackBerry would have had at the time.

And yet they put it out in the market anyway, learned from that, and iterated on it. So I do think: just be careful with how coachable people are. I have this thing where I try not to inflict help on people who really don't want the help. Not everyone is going to come around to this way of thinking. I work with a lot of startup founders, and it's somewhat endearing: they have their reality distortion field around them and believe they're the next Steve Jobs. But in reality, they have to test that vision against reality. If they get it right, they're brilliant, and you can say kudos to you. But so many get it wrong. At the first startup I joined, we thought we were B2C and we ended up being a B2B company. That was really eye-opening for me early in my career, and I'm glad I got to experience it early on. So just be mindful of how coachable they are when they quote Steve Jobs. There's some truth in it: it's not the customer's job to solve for it and design your features; it's your job to do that and get at the need behind it. But it's not like Apple didn't talk to customers. So I look for conversation stoppers, things people say that tell me they're not coachable, and then I'm like, okay, have fun, good luck to you, and I move on.
Melissa:
Yeah, I think that's all you can do at those places too. I used to spend so much energy trying to convince people otherwise when I first started doing this, and then I realized, I'll just let them fail, it's fine. Why am I wasting my time if they don't want to listen to me? Which was a good learning moment. I also find this attitude not just with startup founders, but also with large companies that do very complicated things, like healthcare or insurance: oh, we can't possibly experiment because we're healthcare, we're really important, and we've got so many regulatory issues around it. What have you seen those kinds of companies do to be successful with experimentation?
David:
Yeah, I've unintentionally spent a lot of the last couple of years, maybe because of COVID, with healthcare and pharma, and I have to say it's been really eye-opening how expensive everything is. I had a team recently that said, we're going to run a survey, and I was like, okay, well, here's Mailchimp. And they said, oh no, we're doing this survey and it costs $30,000. I thought, hmm, okay, that's a different world; explain why it costs that much. I had another team that said, we're going to do a landing page, and I was like, oh, that's great, here's Unbounce, this is going to cost like a hundred dollars. And they said, oh no, our agency is going to do that for $20,000, it's going to take six months, and they want us to send them a TIFF file. And I was like, what is this, the nineties? So much of it is not being exposed to modern tool sets for experimentation. It also comes back to patient privacy laws and things like that. It's not like you can just throw up a Typeform and have patients enter their data into it; it has to be HIPAA compliant, all these things, and it feels so overwhelming to them. So what I try to do is start unpacking it with them, be visual with it. We'll start creating canvases for the different types of customers: your user customer, your influencer, your economic buyer, your decision maker, and start unpacking all that complexity. They're carrying it all around in their heads, you know? And when you do that, yeah, it's going to feel completely overwhelming to do anything.

So what I try to do is help them visualize it and unpack it, and then we start systematically going through it: what would have to be true? If we go into a hospital trying to reach the administrator, but we can only work with, let's say, nurses, then are they willing to write a letter to their administrator recommending our product? They say they love our product, but will they write that letter? You find out really quickly in that experiment whether they're willing to put any of their reputation and social capital behind a recommendation. Some of them don't, and then you say, wait, you just said you loved it. Yeah, I loved it, but I'm not recommending you to the administrator. And some do write that letter, and you're like, okay, wow.

This is finally something where we have real evidence, you know? So there are little things you can do. I've been using letters of intent quite a bit. It's like, oh, we have to do a 300-page contract. Well, before you go from "yes, we'll do this" to a 300-page contract, can you do something in the middle? A letter of intent: a simple one-pager, not legally binding, but they have to put it in writing. So we'll do something like that, and of course we loop in legal and make sure it's okay. There's so much anxiety and so much complexity that if you don't visualize it, unpack it, and then start testing your way through it, it's going to feel completely overwhelming. So I really try to coach them through that process.
Melissa:
Yeah, that makes sense. I think if we keep in mind the whole concept of de-risking, like you said, of becoming more confident, it makes experimenting easier, or more attainable, because experimentation isn't an all-or-nothing thing. When it comes to really big, complicated things in healthcare (I've worked with tons of healthcare companies recently too), they look at experimentation the way a B2C company would: oh, I can A/B test that product and know immediately whether it's going to work or not. What I love about your model and the assumption mapping is that it's not necessarily about testing the concept itself directly. It's about getting more confident about the key assumptions that would go into making that product work. I feel like that's more attainable for some of those companies, instead of thinking, let's just put the whole product out there and see if it works. It's about figuring out whether the pieces are really going to come together, not testing the whole thing.
David:
Yeah, a lot of the experiments I've been drawn into, especially right before and during the pandemic, have been telehealth, telemedicine, teledentistry. Something that felt so overwhelming, like, we have to build an app for that right away. I have teams that just use FaceTime. They line up patients and doctors, have them FaceTime each other, and do it that way to see what insights we can gain from real people getting value out of it. What you learn from that can inform your app, if you do build one, or your platform. You can use a lot of existing tech, and so many things have gone digital now because of COVID. So yeah, you can certainly test your way through this. It's more about having leadership support, and having somebody who can coach you through it and unpack the stuff that's all jumbled around in your head and feels completely overwhelming.
Melissa:
Yeah, that makes a lot of sense. I imagine too, with the telehealth experimentation you're doing, when we get into products that have so much bearing on people's health and wellbeing, there are a lot of ethical implications to experimenting. I know this is something you're passionate about. What have you seen go awry, where experimentation borders on unethical and you think, that's not good experimentation? How can companies mitigate that risk?
David:
Yeah, this has been top of mind for me for a while. I think we've all been in situations in our careers where we were asked to do something and thought, am I going to regret doing this? As we grow older and gain more experience, we hopefully develop stronger opinions about it all. What I wrote in the book is this idea of testing with your customers, not testing on your customers. That's still my guiding principle: if you're doing something that feels like you're testing on them, take a pause and ask whether there's a better way to do it. So I've been pushing teams to do a lot more co-creation. A lot of the preference and prioritization experiments in the book, which I pulled from Luke Hohmann and Innovation Games and some other sources, are about that: doing something that feels more like co-creation versus just testing on your customers. I've been interviewed about this topic a lot over the last six months to a year, and I see other people in the industry starting to talk more about it. Ethics in product and business, of course that matters when it comes to experimentation. And again, we can pull from the social sciences, because even the social sciences have guidelines about what you should and shouldn't do, about not causing harm. There's a great foundation there, because a lot of what we do is close to social science applied to product and business. We don't have to recreate everything; we can pull from that great body of knowledge. So what I've been trying to do is help teams think through: okay, can we test with our customers here?

And I haven't talked a lot about this, but I made some explicit decisions in the book not to include experiments that I did not feel were ethical. I used to help run Lean Startup meetups in San Francisco, and I've seen a lot over the years. I'd see some teams come up and talk about stuff and I was just like, wow, you did that? I have to go take a shower now; it just feels dirty to me. So I made intentional decisions not to include those, not to put them in writing and say, this is how you do this thing that I don't think is ethical. Things like taking your competitor's site, rebranding it as your own, and testing it with customers. One, that's not morally right, and two, if you're a big company, that's a great way to get sued. So why would you do that? I've been trying to push this conversation for a while. There's a lot more work to do, but if testing with your customers, not on your customers, is your guiding principle, then you can look at the practices you use and see whether they align with the principles you're espousing.
Melissa:
Yeah, I wonder how you think about this one too. I've heard people raise it, and I can think of ways around it. When you're testing things early on, you're delivering value to your customers, but you're also taking money from them, right? Let's say you're testing a service, and you're going to totally concierge it, deliver it manually, but you want people to pay for it because you want to see whether people will actually pay for the service. I've had a lot of people say, oh, that's unethical, or larger companies say, we can't take money for that, we're not delivering them a product. Is that really unethical? Or have you found ways to test that, and what should the guidelines be to make sure you're being good to your customers and doing right by them?
David:
Yeah, I don't completely agree with that stance. A product isn't always the mechanism by which you deliver value, right? If you're really focusing on customer value, it doesn't always have to be a product; there are different ways you can deliver that value. I think we've gotten too caught up in it being all about the product, the features, how it looks, everything. When I teach things like the Business Model Canvas, what I've been doing more often is taking their product and putting it in the backstage as a key resource. Think of your product as just a resource. It's not so much about your product; it's about the value that you deliver and your value proposition to your customers.

When you think of it that way, it's similar to my earlier comment about companies re-imagining what they do and how they deliver value based on their expertise: how do they deliver that and monetize it? So it's not so unethical to think, well, I'm delivering value and they're paying for that value. That should be okay. The vehicle through which you deliver it can take many different forms. I try not to get too caught up in what I feel is a very product-centric approach to everything, where it has to be about the product. In reality it should be about the customer and the value you're delivering. What Alex and I try to do is frame this around high-value jobs, right?

So if you're solving something that's really valuable to a customer, and it's valuable to you as a company, then you should charge for that. It doesn't mean you scale it, it doesn't mean you're charging a premium for it, but that's testing viability, right? That's how you start to test viability. So no, I don't completely agree with that stance. Focus on the customer value, and customers will pay for things regardless of whether it's a product or a service or whatever it is. They don't care so much about that; they care about the value.
Melissa:
Yeah, that's always been my stance too. And also, if you don't follow through with delivering the value, give them their money back, right? I think keeping the money without delivering is the truly unethical part, but refunding is always an option: if you can't deliver, or something happens, you give them their money back, apologize, explain what happened, and move on. But if you can deliver the value, people just want their problem solved. Whether you do that well manually or with a product, it doesn't really matter; as long as their problem is solved, that's what they're paying for at the end of the day. So I totally agree with that. Now, thinking about the companies out there, there's a question I get asked a lot, and it's hard for me to pinpoint answers, but you've seen a lot too: are there any companies that stand out in your mind as doing fantastic experimentation?
David:
Yeah, I have my go-tos on this one, some I've worked with and some I haven't. I mention Adobe a lot because I think they're doing some really smart things. They're democratizing innovation over there, testing ideas with things like the Kickbox program that Mark Randall kicked off several years ago. Anybody in the company can opt into that process and test an idea they have, and they call it a box because you literally get a box. You present what you've learned, you get more funding to run more experiments, and you can get all the way to having a team. That may not result in a real new product idea, but you're leveling people up so they can take that energy and apply it to their day-to-day work.

And they're also testing ideas for new apps. If you look at the stuff they're building, it looks almost like what a startup would build. It's really focused on the needs of the customer, and they're moving really quickly. I still point to Intuit; they've been at this for a long time. They have their Design for Delight program, which is agile, lean startup, and design thinking mashed together, and there's actually a case study from them in my book. They have their Follow Me Home program, which sounds creepy but isn't; it's basically ethnographic research, and they do get permission to do it. And I look at the new wave of startups that are scaling now, that are no longer startups.

You look at companies like Stripe, and there are a lot of really cool things happening there. There isn't a bunch of companies I can point to yet that I would call models of this, but I do think there's a bunch coming, because of how startups are being built now. They are using different techniques. We've been at this lean startup thing for what, 10 years now? We've been putting in the work. So now you have this new wave of companies, and I see them come through all these accelerators I judge at and advise, and the VCs I work with, and I'm like, ooh, they are going to be super interesting when they start scaling, because of the way they're testing their way through things. So I think we're going to see a wave of these in the next five to 10 years. But I can't point to a lot of them yet, and sometimes when I do point to one, it doesn't last. Like you said earlier, the bloodstream-versus-DNA comment: when I point to a company and say, look at all the awesome things they're doing, that might fall off with a leadership change and everything else. So there are a few, hopefully there will be more soon, and maybe in the second edition of my book I can include some of the other ones as well.
Melissa:
Yeah, that's great. Intuit and Adobe: I know there are two HBS cases on them as well, and I've read them and thought, wow, that's really cool what they do with experimentation. Fantastic processes for large companies, especially if you're a large company looking to understand how people adopt experimentation. But the key part, I think, is that both started with top leadership saying, we want to work this way. It really started at the top; it wasn't a homegrown thing from the ground up. They said, we want innovation, we're going to empower people to do this, and we'll give them the space to do it. I love both of those approaches. That's fantastic advice. So if somebody is looking to get into experimentation and learn more about how to do this, what resources would you point them to, including your book?
David:
Yeah, obviously I would point them to the Testing Business Ideas book on Amazon. It's almost two years old now as we're recording this; it was released in November of 2019, so I'm very excited about that. There are also other great resources out there. My coauthor's company, Strategyzer, has a great set of tools, and I partner closely with them. Now when you go online and search for lean startup and design thinking, there is a ton of stuff out there. I gave a talk with folks at Google in San Francisco before the pandemic, and they had this timeline of how many teams are running sprints; it went from almost nobody to everybody, up and to the right. So there are so many options. I would actually point people to your stuff as well.

I think you're a great resource. I love Teresa Torres's work, Marty Cagan, all the crew that we know. I think we've stuck together over the years because we really believe in this and we really try to excel and help people. So there's a good bit out there. It shouldn't be hard to find; it's more about how you adopt it and make it real for your company and your journey. There's plenty out there, certainly much more than 10 years ago, when everyone was furiously writing blog posts about this and people said, oh, this will never work. We were sort of the early adopters here, and we're still around.
Melissa:
I'm so happy this is more mainstream now and you don't have to pull teeth to get people to think this way. It's definitely made both of our jobs easier. There's so much more out there, and people realize it's not a fad that's going to pass in three years; this whole mindset is here to stay. If people want to learn more about you, David, where should they look?
David:
Yeah, I'm still pretty active on LinkedIn, although it's mostly memes. Come for the memes, stay for the testing business ideas advice. So I'm active on LinkedIn, which I never thought I would say. I'm also really active on Twitter at @davidjbland, and my company site, precoil.com, has a ton of free resources: a lot of the templates I use for assumptions mapping, plus videos and articles I write. So check it out.
Melissa:
Great. Well, thank you so much for being on the podcast, David; it's been a pleasure having you. For those of you listening, make sure you subscribe to the podcast and leave us a review if you enjoyed this. And stay tuned next week for another episode of Dear Melissa. We'll see you then.
