Episode 73: Pivot Series, Part 1: Embracing the Unknown with Colleen Johnson
Melissa Perri welcomes Colleen Johnson to the first episode of this four-part miniseries about companies that successfully made major pivots during the pandemic. Colleen is the Co-Founder and Chief Product Officer of ScatterSpoke, a company that leverages AI to make the most out of retrospective feedback. Colleen tells Melissa how their company needed to pivot quickly to win against competitors, how she had to shift from being a subject matter expert to embracing uncertainty and curiosity, her version of a valuable MVP, and which retro data she finds to be the most valuable.
Subscribe via your favorite platform:
Here are some key points you'll hear Melissa and Colleen discuss:
Colleen talks about her professional background, what led her to found ScatterSpoke, and what services they provide. [4:31]
During the pandemic, when ScatterSpoke lost clients to major competitors, they had to determine what set them apart from other companies providing the same retrospective services – the answer was a large quantity of retro data. [6:11]
Colleen advises listeners to approach change with an open mindset and a little more curiosity. [8:56]
A friendly invite through your professional network, or even cold outreach on LinkedIn, can help a product manager launching a new product connect with engineering leaders who will review data, test the product, and offer feedback. [12:01]
In coaching teams and helping organizations adopt agile practices, Colleen finds that most people tend to focus on delivery rather than on breaking down the work. If you do not break down the work in a way that allows you to iterate and get feedback quickly, the process itself provides little benefit. [14:38]
The most valuable part of presenting small chunks to engineering leaders and customers is what you learn from their responses, positive or negative. [16:12]
To have a successful retro tool, the teams using it - rather than just scrum masters and engineering managers - must see its value to their process. [19:46]
Engineering managers and product leaders need to understand that retrospectives are important because they help pinpoint issues in the organization. [20:11]
As a person working in product and product management, Colleen says that you have to “remove yourself from the subject matter expert seat”. You have to be curious and willing to learn and understand that you are venturing into waters beyond your scope of knowledge with this new transition. [26:45]
Resources
Colleen Johnson | LinkedIn | Twitter
ScatterSpoke | Twitter | Instagram
Transcript:
Melissa:
Hello, and welcome to another episode of the Product Thinking Podcast. Today, I'm joined by Colleen Johnson, who I've known for quite a long time. She is the Chief Product Officer and Founder of ScatterSpoke, and she has been helping organizations, uh, do agile transformations for quite a while before founding ScatterSpoke as well. So definitely an expert in a lot of the things that I know you guys are gonna go through. So, uh, welcome, Colleen. Can you tell us a little bit about, uh, yourself, how you decided to found ScatterSpoke, and what is it?
Colleen:
Yeah, thanks, Melissa. I'm excited to be here talking with you today. Um, ScatterSpoke is a tool for online retrospectives and, as simple as that sounds, it's taken a couple different turns over the last couple years. Um, I founded the company with my husband, who at the time was just interested in learning some new technologies. He was a software engineer and wanted to try out some new things. And I was struggling with connecting a couple different teams I was coaching at the time in different states. And so I was like, this could solve an immediate problem for me, and you can try out this new technology you wanna learn. Um, and so we played around with it. It was all public, but we weren't really trying to start a business. We were just trying to solve a very specific problem for each of us, and we put it out there and, um, there were no signups, it was free.
It was basically just a board, right, a board to post stickies on. Um, and we went through at one point and we were like, oh, it'd be interesting if we had people sign up, like if you had to put an email in, and, you know, again, no features went in, no thought went into this. It just sat out there public for a year or so. And one day we went and looked at it and we were like, we've had 30,000 people sign up for this tool. Holy shit, like maybe other people have the same problem we have. Um, and so we started really investing a lot more time in it. And then in 2019, we joined an accelerator, a bootstrapped accelerator program, that allowed us both to focus on it more full time.
Um, I continued to do coaching and consulting on the side alongside building ScatterSpoke because it gave me an opportunity to really stay close to our users and close to the community and bring all that feedback right back into our backlog, um, which was really important for us. So I was able to use the tool with clients, see it live, watch where they were struggling. Also listen to them say, I wish it did this, or I wish you had the ability to push or pull or whatever, and bring all of that right back. So it kept us really connected to our user base. Um, and then you mentioned COVID. So, you know, at the beginning of COVID, we were like, oh my gosh, we hit the lottery here, because everyone's gonna have to go remote and everybody's gonna have this problem now.
Um, and it was great for the first couple months. We saw a lot of organic signups. Um, we were getting a bunch of new trials, new customers, exactly what you would want to have happen. And then everybody found tools like Miro and Mural, and all the big players launched ways to do retros in tools that you essentially already had licenses for. Um, so we saw kind of a big spike and then a big drop and then a flat line. And, um, for us, you know, I've always been the expert on the agile side of things, running great retros, facilitating team improvements, um, and bringing that into the tool. But all of a sudden we were faced with, um, a need to pivot and do something different. And we said, what is the thing that we have that other tools, maybe Miro or Mural or other competitors in the retrospective space, don't have?
And the answer was a shitload of retro data. So we were sitting on all of this retro data from all these different teams about the problems they were having and, you know, what was working well, what wasn't working well. And so we set out to build some really robust reporting, um, and leverage AI and machine learning that we programmed against that data set. So rather than buying off-the-shelf machine learning tools, where they're used to looking at an entire block of text and ranking it or interpreting it, we were looking at these quippy things on, um, retro cards that were like, we need better unit tests. And so we went through and classified all of these data points that we had into categories and positive or negative sentiment, um, so that we could start to offer really robust reporting to an organization on what was actually coming out of these retrospectives.
So we found this niche all of a sudden that I was not an expert in anymore, um, which, you know, I spent a lot of my career coaching agile organizations and helping them adopt agile practices. Um, and all of a sudden, you know, I think the biggest shift for me was this: now I'm stepping into, um, working and selling toward an engineering leadership view of why this data is important and how they want it to improve, and I had to, um, not be the expert anymore. And I think it's actually been far more beneficial for us as a business and as an organization to look at what kind of features we can offer through that lens. Right? I think a lot of times when we are the expert in the product we're trying to build, we're a little too close to the solution that we're trying to deliver.
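[To make the classification Colleen describes concrete: a minimal sketch of tagging short retro cards with a theme and a sentiment. The card texts, labels, and the use of scikit-learn here are illustrative assumptions for this write-up, not ScatterSpoke's actual pipeline.]

```python
# Minimal sketch: classify short retro cards by theme and sentiment.
# Not ScatterSpoke's implementation - the training cards and labels below
# are invented purely to illustrate the idea on quippy, one-line feedback.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny hand-labeled set: (card text, theme, sentiment)
labeled_cards = [
    ("we need better unit tests",               "tech",    "negative"),
    ("the ci pipeline was flaky all sprint",    "tools",   "negative"),
    ("pairing on the new service went great",   "people",  "positive"),
    ("stories were too big to finish",          "process", "negative"),
    ("great demo feedback from the customer",   "product", "positive"),
    ("retro actions from last sprint got done", "process", "positive"),
]
texts      = [card for card, _, _ in labeled_cards]
themes     = [theme for _, theme, _ in labeled_cards]
sentiments = [sentiment for _, _, sentiment in labeled_cards]

# One bag-of-words classifier per question: what is the card about,
# and is it positive or negative in tone?
theme_model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
sentiment_model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
theme_model.fit(texts, themes)
sentiment_model.fit(texts, sentiments)

new_card = "we keep rushing and adding more tech debt"
print(theme_model.predict([new_card])[0], sentiment_model.predict([new_card])[0])
```

The point of the sketch is the shape of the problem Colleen points at: very short inputs, a handful of fixed categories, and a per-card sentiment label, rather than the document-level analysis an off-the-shelf tool expects.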
Melissa:
Yeah. I think that's a really good point and a really interesting shift for you too. A lot of the product managers I've worked with and talked to in the past, especially ones that you probably ran into too, doing a lot of these digital transformations - like, we were the subject matter experts, and now all of a sudden you are the product manager and it's not quite the same. Right? You, uh, you have to start asking questions. What did you do to kind of get over the hurdle of assuming you knew all the answers, right, and going back and thinking through, you know, first of all, maybe admitting, hey, I don't know? How'd you get over that fear, cuz that could be scary. And then what did you do to try to approach this from more of an open mindset and to be a little bit more curious?
Colleen:
Um, I think the first part for me was exactly what you said, right? Admitting that I don't know the answer to this. Um, and I'm not gonna be the expert in understanding what an engineering leader wants to see out of some of this reporting, which meant we had to go find those people. We had to find the people that are excited about solving these problems for their business and want to see this kind of data from their teams and coming out of their retrospectives, um, and kind of balance that out. So, um, what that meant was reaching out to our networks, reaching out to potential customers, reaching out to voices that are not the typical voices that I hear from, right? We needed different perspectives, to be honest. And so we reached out to a lot of different people and said, you know, if you're the leader of an engineering organization with 400 scrum teams in it, what's important for you to see? What kind of data do you wanna know is coming out of this?
And then from my perspective, how do we balance that with some team safety, right? Because we don't want this to be an open box where you can go and see everything that every team has said in their retro. So we had to balance this with some agile perspective and the importance of team safety, where we can kind of scrub the data and abstract the data in a way where we're delivering what engineering leaders wanna see in reports like this, but we're also balancing that with keeping the safety of the conversation for the team. And so I knew that I had strong feelings about the team safety side of things and keeping the retrospectives, um, closed off to the team. And it meant that we just had to go find those other voices. Um, it was a lot of interviews.
It was a lot of demos of very shitty broken reports where we were like, oh no, you know, like every demo ever. I was just on the other side of it this time, where we were showing, um, reporting and saying, you know, what would this tell you? What's missing from this? Um, and we're still doing that. You know, we've got - we call it the team pulse report. Um, but every time we show it to a new engineering leader, we get different feedback about what they'd like to see there. And so it continues to evolve. But I think, you know, to answer your question, Melissa, I think the most important thing I can do in that scenario is, um, get curious, look for other voices, and get out of the way. Um, and it's not an easy thing to do. I think, you know, we do often look at ourselves as the expert of our product or the expert of what we're trying to build. And, um, I've been in that position for a long time with ScatterSpoke. And I think it was very hard for me to get out of the way, um, but very important at the same time.
Melissa:
Yeah. So one of the things that you were just talking about - because I have a lot of, um, I think, students doing this at Harvard, and I hear this from a lot of product managers who are launching new product lines - going out and finding these engineering leaders that you did not have the relationships with, you did not actually build something for them before. How did you even find them? How did you approach them? I think we get scared to go ask strangers to talk about what they need and where their problem is, but what was your process for finding the right people and getting in front of them?
Colleen:
Yeah, in some cases it was referrals. So we reached out to contacts in the agile community and said, do you have somebody in your organization that would offer feedback this way? Um, and so a friendly invite gives you a little bit of a leg up over, like, a cold outreach on LinkedIn. Um, but we did some cold outreach on LinkedIn too, and said, you know, we saw that you set up a trial, we'd like to walk through the data and tell you what we see in the data, and have you offer feedback about this reporting - you know, if it's helping, what's missing, like I said. Um, so I think, you know, it's kind of getting into the mindset too of, like, the worst they can say is no, right? So we asked a lot of people and we got a lot of nos.
Um, but we just put it out there and tried to hear from the voices that we don't normally hear from. The way we built this tool, right, we built this tool squarely from an agile perspective. And, um, now it's kind of shifting the perspective to different users, and that's trickling down into a lot of what we're doing. So we're still leveraging different persona types, right? Are you a scrum master? Are you a release train engineer? Um, but now that persona list looks a little bigger. So are you a developer out on a team with no scrum master, but you still wanna run retros? How do we make that easier? Are you an engineering leader for, like I said, an organization with 400 scrum teams - um, what is important for you to be able to see in these reports? And so we've really just expanded our personas, I think, to consider a lot of other perspectives that we didn't start with.
Melissa:
So being a chief product officer too, I find that product leaders sometimes, you know, we're trying to balance how to be curious and not know all the answers, but also how to give direction to your team, right, and not look like, I don't know what I'm doing over here. So, uh, for you, as you're going through this journey, right, and moving from subject matter expert into being more curious, learning, talking to customers - how do you maintain confidence in your team, and how do you kind of reset the path for them on this pivot, of like, hey, you know what, this didn't work, let's try something new, we're going after this direction? How'd you kind of go into that uncertain territory?
Colleen:
I think there's a couple things that, um, we try to stay grounded in and also really struggle with. And I would say the biggest one - which, um, hopefully this helps everybody understand that this isn't an easy thing to do - is getting what we're putting out as small as possible so that we can learn quickly and try something new. And, you know, when I coach teams and I help organizations adopt agile practices, we tend to spend so much time on how we deliver and not enough time on how we break the work down. And you can do scrum or kanban by the book, but if you're not breaking your work down in a way that's allowing you to iterate quickly and get feedback quickly, none of that process matters. Um, and, you know, it's one of those things where it's easy to stand in front of a room and teach that or coach that.
Um, but doing it in practice is really, really hard. And so we struggle with that a lot. And that's the thing we always try to come back to: what's the smallest thing we can put out and get feedback on quickly so we can figure out if we're on the right track? So small chunks, um, small chunks that you can learn from, and not planning out a quarter or six months' worth of work that's stacked back to back. Um, but really saying, let's throw something small up and see what we can learn, and keep going. And, um, I think that continues to be the thing that we have the most room for improvement on. You know, we'll start a release and, um, we'll be like, this is what we're gonna put out next week. And, um, then five more things get added to it, and we'll get Intercom messages that are like, we need this, or we need that, or can you add this? And we squish 'em all into the same branch, and then six weeks from now it still isn't deployed, cuz we overloaded it with things that were relevant. And so I think keeping laser focused on those small chunks helps, so that, um, we can learn as quickly as possible.
Melissa:
Yeah. The small chunks is like really important. So, you know, you're in MVP territory, and I feel like, huh, startups, MVP - this is totally all the stuff I used to preach about for such a long time. Yeah. This is my favorite question to ask: how do you know when you had enough value in your first small chunk, in your first small iteration? Cuz that is the thing that I get pushed back on all the time, even like 10 years later, on this conversation about, you know, we can't possibly put it out, it's too small, they need the whole thing. Like, you're basically building for engineering leaders at large companies, right? Like, you're dealing with really large companies. So what's your story of how you went through chunking it? Like, how'd you test it and keep it small and keep that discipline there?
Colleen:
Yeah. I mean, I think it's less about, um, their response and more about what we learn, in some ways. So sometimes the best small chunk is one that's gonna tell you you're down the wrong path, right? And if you can deliver something small and have somebody tell you, this is not what I want, or I won't pay for this - which we have had happen, right? We've put stuff out, we've showed someone and they've been like, that's cool, but I'm not gonna pay for that, like that's not enough to get my checkbook out. Um, that is a valuable chunk in my mind. Like, that's a valuable MVP, that's a valuable thing to have happen, because we didn't spend a year and a half to get there. Like, I'd rather know quickly that we're wrong. Um, so I think that response is worthwhile.
And then I think the other thing is the excitement, right? So when you demo some of these things or you show them to current customers and you're like, here's where we're headed with the reporting and here's what we're thinking next - it doesn't mean that that next thing has to be there. It's like, here's a little bit of the base. If that gets them excited about what we're talking about next, like connecting those things, then I feel like that's always kind of our indication we're on the right path. Um, and then looking at that across all those different personas to say, you know, does this scrum master get excited about the same thing the engineering manager gets excited about? I think that also helps us connect some dots to see if we're on the right path or not. Um, kind of, from a systems perspective, are we solving something for the whole organization instead of just one person being the champion for the tool? Cuz I think when you're thinking about SaaS adoption, um, if it's just one person yelling loudly that we need this thing, it's gonna be a lot harder to get traction. So we have to think more holistically about whether multiple people in the organization are getting what they want out of what we're delivering.
Melissa:
Yeah. How'd you go through the process of thinking through that as a system? Um, it's a really good conversation about, you know, buyer personas versus the stakeholders and the users in it, and trading off between them. And I feel like a lot of, um, a lot of product managers on a two-sided system, or a SaaS system like this where they might not be the buyer, uh, they struggle with how to think about the user versus the buyer versus everybody else. So what was your process, what does your ecosystem look like for the personas, and how do you tailor it to them?
Colleen:
Yeah. And I think that this one presents kind of another layer, in that the buyer and the person bringing the tool in are often at odds with what they want. Right? So the person typically bringing us into an organization is often a scrum master or an agile coach who's looking for a way to run better retrospectives, right? So they bring us in and set up a trial, usually with one team. Um, but they're often not the person writing the check. And so I think that's where we started thinking about what is the other value we bring into an organization and who is that buyer, right? The person buying it is often a CTO or, um, a director of engineering who you have to convince why a retro tool is needed. And so their lens for that is gonna be very different. And in a lot of cases, they're not a fan of retros, right?
I mean, let's be honest, so many development teams feel like it's a complete waste of time. They're not getting value from it. They're not, um, seeing improvement out of it. And so what we really wanted to focus on was that specific problem. We can create a great tool to gather data and report on data, but if your team isn't getting value out of doing that, none of that machine learning is gonna address that. So how do we make the tool really attractive to developers, so that they can add cards from Slack or make notes of things out of Teams, right? Or whatever the tools are that you're in every day. We're focusing now on pulling in integrations of data from GitLab or GitHub or Bitbucket - like, how do we suck all that stuff into your retro so that we're making your retro more productive, because at the end of the day, that's what that's for. So kind of backing up a little bit and focusing less on, um, what we would think of as features, like voting and grouping, and more on how do you make your retro more valuable. Um, and so that, I think, requires thinking more about, like, scrum masters might bring us in and engineering managers might write the check, but if the teams aren't excited to use our tool, none of that matters.
Melissa:
I mean, um, you know, retrospectives are so valuable because they start surfacing up where a lot of the issues are in the organization. And I know you're selling to engineering leaders, but I feel like product leaders need to understand this too, right? And when we talk about product operations - um, it's funny, one of my CPO students was like, oh, you never talk about product operations on your podcast, and I feel like I beat this to death - but I love this topic because, um, it is about gathering the data and trying to figure out, you know, can we deliver good software? And if not, like, what's happening? So when you are doing these retrospectives and gathering the data, what types of things are you sending to the engineering leaders? Like, what types of data are they actually looking at, and how does that tell them about whether or not they can deliver? Because I feel like that's important for a lot of the product leaders who are listening to this to understand, because they're not having the conversations they should be with engineering leadership. Like, you know, even the best ideas go through a system - if we can't deliver on it, or we don't have the capacity for it, how could we possibly, you know, even execute on our strategy? So what do you do to help engineering leaders understand their data? And what types of data have you found really resonate with them?
Colleen:
Yeah, so we break our data kind of into two separate categories on the reporting side of things. We look at action item data. So it's a cumulative flow diagram of, are you closing out your action items, or are you just ratcheting up open ones that you're never gonna do anything with? Cuz that's not gonna help anybody improve. Um, we also look at average time to close action items, which I think is interesting data. Depending on if you're doing your retros every two weeks or monthly, are you closing them out in kind of a similar amount of time, or does it take you six months to close one? Um, so that's always an interesting data point. But then the other data, in the team pulse report, is really the context of what the teams are discussing. And so, like you said, we do a couple different things with that.
We tag themes. And so you can do that automatically, or you can override the tags that we attribute to the cards. Um, one of the categories in there is product. So we have product, people, process, tools, and tech. Um, and so you can go through and say, like, we have a really high volume of cards that are talking about product, or maybe talking about process, and are they more negative or positive in tone? Um, what's really interesting, I think - and when I'm running retros in the tool, I often will look at the sentiment and the themes and ask the team to pull up calendars, or I'll look at my calendar for that team and say, hey, it looks like, um, at the beginning of March our sentiment was really negative, and our conversations around process were really negative.
What was happening for us that first week of March that might have contributed to that? Um, so you can start to connect some of the dots between what was going on for the team and what the conversation looked like. Um, but I find that the sentiment is really what engineering leaders latch onto. And I think if you think about the world we're living in now, where everybody is remote, the ability to walk down the hallway or walk over to an area on your engineering floor and hear your own team say, you know, oh, we're struggling with our unit tests, and, um, we can't seem to make time for these, we're always rushed and we're adding more tech debt into the system - and then hear that same conversation when you walk another 20 feet down the hall - um, that is gone.
So everything has to be very scheduled and purposeful. Like, you have to have a purpose to get everybody to talk about tech debt or unit tests or whatever the case may be. What we're trying to do is give engineering leaders a little bit of visibility into that before shit hits the fan. So how do you start to get - like, you know, eventually I would love to have predictive analytics in the tool, so it's saying, based on your team's output and these keywords and this sentiment, you're at risk of losing an engineer, you're at risk of whatever. But, um, right now what we're trying to do is just help connect those dots, because, um, there isn't really a way to do that if we're not physically in the same place anymore. And so I think you're missing some of that, um, overlapping conversation and organic opportunities to identify those patterns, cuz we're not all in the same place.
And so that's really what we've been able to help, um, engineering leaders latch onto. And same thing, you know, from a product perspective: we have the ability in the tool to scale retros, so you can do, like, program or portfolio retros. Um, I find that product folks really love that feature the most, because if you're a program manager or a product manager with multiple teams that you're working with, you can then look at reporting for three teams at once and say, all three of my teams are struggling with this same thing, you know, around work breakdown or about getting more vertically sliced user stories or whatever the case may be. But you can see that all in one place instead of trying to connect five different retro outputs together to figure that out.
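[As a rough illustration of the action-item metrics Colleen describes above - the cumulative-flow view of open versus closed items and the average time to close - here is a small sketch on invented data. It is not ScatterSpoke's reporting code, just the arithmetic that kind of report is built on.]

```python
# Sketch of retro action-item metrics: open/closed counts over time and
# average days to close. The data is invented for illustration only.
from datetime import date, timedelta

# Each action item records when it was created and when (if ever) it was closed.
action_items = [
    {"created": date(2022, 3, 1),  "closed": date(2022, 3, 10)},
    {"created": date(2022, 3, 1),  "closed": date(2022, 4, 20)},
    {"created": date(2022, 3, 15), "closed": None},               # still open
    {"created": date(2022, 4, 1),  "closed": date(2022, 4, 8)},
]

def counts_on(day):
    """Cumulative-flow style snapshot for one day: how many items are open vs. closed."""
    created = sum(1 for item in action_items if item["created"] <= day)
    closed = sum(1 for item in action_items if item["closed"] and item["closed"] <= day)
    return {"open": created - closed, "closed": closed}

def avg_days_to_close():
    """Average time to close, counting only items that have actually been closed."""
    durations = [(item["closed"] - item["created"]).days
                 for item in action_items if item["closed"]]
    return sum(durations) / len(durations) if durations else None

for week in range(8):
    day = date(2022, 3, 1) + timedelta(weeks=week)
    print(day, counts_on(day))
print("average days to close:", avg_days_to_close())
```

A growing "open" count with a flat "closed" count is the signal Colleen mentions: action items piling up that nobody is ever going to act on.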
Melissa:
Cool. Yeah, I could see that definitely being helpful, um, for, you know, chief product officers and engineering leaders alike. Um, you know, the data I find that we can get from a lot of the tech processes and what we do there is just so valuable, uh, but we take it for granted a lot with products. So I love that you're bridging that gap into what I see as the engineering side of product operations and things that we could learn from there. So with, uh, ScatterSpoke and you going forward, right, as a chief product officer, um, what's next? Like, now that you've kind of gone out there, got a little bit more curious, started talking to, you know, your customers and finding that it's engineering leadership, uh, what are you really concentrating on going forward to make sure that you don't fall back into the trap of being the subject matter expert, and what would be your advice for people, you know, moving into this new role?
Colleen:
Yeah. Um, well, I think the future, at least where we are right now, looks like even more things that I don't know anything about. And I am trying to look at that with excitement, right? We're talking about integrations with a lot of different development tools, security tools, um, DevOps tools, where, you know, maybe it's triggering a retro from a security breach or an outage, right, to talk about what went wrong. Um, these are all things that are very outside my wheelhouse, and I'm excited, I'm nervous, but I'm trying to approach all of them from a place of curiosity. And what can I learn about that problem that helps me understand what other problems could be solved? So, you know, um, we were talking a little bit before the call and I was mentioning doing a bunch of integrations with Atlassian. I know Atlassian's tool suite well, I've used it for 20 years.
So it's like, that is very close to me. Um, I'm trying to go into all of the integration conversations around different places and touchpoints with JIRA looking at it less from, I'm an expert and I know what JIRA data we need to bring over, and instead, how do we use this as a model to go integrate with all these other tools, which I know nothing about, right, where I don't know what data we need to bring back? But looking at it as kind of that small-chunk learning opportunity: I have an opportunity to try this maybe with something more familiar, but how do I use this as a way to, um, understand what problems exist with other tools and what data's gonna be important to bring in? Um, and I think that's my recommendation to every product person out there: try to get out of that expert seat. Um, you know, it's comfortable.
I'm not gonna lie, it's easier to be the expert than it is to try to go learn all of this. And I think it's almost less about learning all of it and more about being curious, um, and trying to find different angles and different perspectives from your own, um, and getting really, um, comfortable not being the expert, so that you're out there trying to find, you know, different voices. Um, you know, it's easier to have the answer than it is, like you said, to go to a team and say, you know what, we talked to 20 different organizations and here's all the things we heard, let's review that data, instead of me telling you what we should build. Um, and I think that gets the team more excited too. It gets them more bought in. They wanna have those conversations. You spend a lot of time hiring really smart people - let them be part of that conversation too. Cause I think you're gonna have a more engaged team if you get them, um, involved in that instead of just telling them what to build.
Melissa:
Yeah. I love that. And so one question, uh, for the startup founders on here too. I know we don't talk about startups that much on this podcast just because it's not really where I specialize - I specialize more at scaling - so it'll be good to pick your brain on it as a founder too. Um, you know a bunch about product, you know a bunch about technology, uh, you are the chief product officer right now. Uh, biggest question we ask: when should you consider hiring another product manager? So for you, as you're looking at the trajectory and, you know, what's going forward, um, what would be the signs that you're gonna look for when you say, I need some help and I wanna actually bring somebody in?
Colleen:
Yeah, I think it's gonna be more about personality than about expertise, right? And so what I don't wanna do is say, oh, we need somebody with an engineering management background to come be a product manager for us, because I think that puts them in the same boat as me, where they are gonna feel like they need to have all the answers for what we're solving for that persona. Um, I think what I look for more is that curiosity, um, the process that you go through to take something big and break it down, and understand how do we put those small chunks out that we can learn from. Um, that, to me, when I'm hiring for product folks, is far more important than expertise in a specific area: how you approach that process. And then, um, I think willingness to be wrong, right?
I think when we first talked, we said, what we all need as a product manager is to get really great at being wrong, and get comfortable with it. You know, same thing of, we can ask a million people for feedback and they might tell you no, but the five people that give you feedback are gonna help you shape the direction of where you're going. And I think the same thing is true from, um, a product management perspective: how do we get really great at being wrong, or not having the answers, so that we're going out there to learn from other people and ask questions and be really open-minded about what we're hearing, instead of feeling like I've got a solution and I need you to prove my solution wrong? Um, I wanna hear what things you're doing now and really be open-minded to where that could take the product next.
Melissa:
Yeah. When you were just talking, I started thinking too, um, these are all things I know that you talk about to large corporations, and I've talked about in large corporations. Um, and I get a lot of pushback sometimes, like, hey, we're a large corporation, we're not a startup, we can't think this way. Um, and it's just dawning on me - you are literally across both ends of the spectrum now, right? Like, you have done the Fortune 100 companies, you've done all the transformations inside there. You've worked with their CEOs, you've worked with their exec teams and their product managers. And now you've got a small startup on this side. What similarities have you found between how people should be doing product management in larger corporations like that? Like, what would you bring into that to help? And maybe, where are the differences? Like, how do you kind of compare and contrast those experiences as it comes to really building product?
Colleen:
Yeah. Um, you know, starting with what can be the same, whether you're a five-person company or a 5,000-person company, right - I think, um, trying to start every product with what problem you're trying to solve instead of a solution is super important. And it's super hard to do when you have a roadmap of solutions in front of you. And I would say that's the thing I think every organization can get a little better at. You know, I feel like there's this false sense of reality when we build out a roadmap for the year of everything we're gonna build. Um, everybody feels better having that in front of them, but everybody knows it's wrong. Everybody knows, as soon as we start working, we're gonna be off schedule.
We're gonna get feedback, we're gonna change direction. But for some reason, we still go through that process. And I think that's one of the things that every big company can learn from the startup mindset: um, you know, let's start with a problem to solve, but let's not try to have six months' worth of features slotted out to deliver, because the feedback we get from the teams may be very different than what we think we need to build next. And so I think building that flexibility into your organization, and understanding that being able to, um, switch direction is far more valuable to you as a business than having a roadmap planned out for the year. And I think, you know, when we say that, people nod, people smile, but everybody just goes back to doing the same thing, planning out, you know, a year or two years' worth of work. And, um, that's the thing I see so many big, um, organizations really get stuck in. And then you're just chipping away at a roadmap that's never gonna change, even though we know it might not be the right thing for us to be working on.
Melissa:
Yeah. I think that's huge. Like, what you just said is, to me, one of the best ways to tell if you have an effective strategy: can you change it? Can you go in a different direction? What, um, types of things would you recommend for these larger organizations to really bake that into their planning processes or strategy processes? Like, what are some of the tenets that you've seen work, um, at scale there?
Colleen:
Yeah. I mean, I think we're seeing a great shift in the industry already, just focusing on the whole concept of outcomes over outputs. And I think that's a step in the right direction. Um, but you can still, I think, have a very rigid plan for what you're gonna deliver, even focusing on outcomes. And so it's kind of lip service for a lot of businesses that are like, oh, we're gonna focus on outcomes, but here's our roadmap to deliver those outcomes. Um, and so I think, you know, that's one piece of it: trying to focus on the outcomes and say, when we've achieved this, then we know we're done with this piece. But I think the other part of it is, um, leveraging things like kanban for product and saying, you know, at a portfolio level, here are options of things we might consider next, but we're not gonna commit to one of those until we've delivered something and learned something.
So really creating this giant feedback loop to say, as these small chunks of things go out and we can get feedback, we're gonna use that feedback to pick what's next. And it's not to say that you're not planning or figuring out what those options could be next - that still has to happen. We still have to be doing some upfront work so there's not these long lulls in between deploying and starting the next thing. But it's saying, we're gonna have enough options available that we can learn from what we've deployed, look at those options, and pick the best thing at the moment. Um, because I think when we try to stack all of that in, um, we're not giving ourselves that flexibility. And I love the way you said that, Melissa. I wish more organizations would, um, weigh or grade their agility, or the health of their product system, based more on their ability to change direction and less on their ability to follow a process.
Melissa:
Yeah. That's something for a lot of the organizations out there to listen to. Well, thank you so much for being on the podcast, Colleen. Uh, where can people go to find out more about ScatterSpoke and yourself?
Colleen:
Well, we'll be in Nashville, I think with you. Yeah.
Melissa:
Agile 2022.
Colleen:
Yeah, definitely. I'm looking forward to it. Um, so we'll be there at the end of July, but you can always learn more about ScatterSpoke at scatterspoke.com. Um, we're on Twitter, we're on Instagram, and, um, we're always looking for feedback. I mean, we learn best from - even the angriest feedback helps us grow, as you know, as a product person. Sometimes it stings, man, but...
Melissa:
It does, but it's helpful.
Colleen:
It's helpful. Um, and that's the best way we can learn. So, um, we welcome all the feedback on what you see there and what you wish was there. And, um, we really do take it all very seriously. So, um, check it out and let me know.
Melissa:
Great. Well, thank you so much for joining us. And for those of you listening, um, you can learn more at productthinkingpodcast.com. We will have all of Colleen's links on there, so you can go find out more about, uh, her and ScatterSpoke. And make sure that you subscribe to this podcast if you liked it. Every Wednesday we do another episode. Next week, we'll follow up with your questions, um, on the Dear Melissa segment. So if you have any questions for us, please go submit them to dearmelissa.com, and we'll see you next Wednesday.