Episode 182: Balancing Stakeholder Needs in Educational Products with Amir Kabbara
I recently spoke with Amir Kabbara, a product leader with experience at Microsoft, Amazon, and Shopify, who currently serves as Advisor to the CEO at Paper. Amir is an expert in product management, and he and I discussed at length the ethical concerns surrounding the use of artificial intelligence (AI) in the education sector, among other topics.
We also discussed hard-won lessons about using data cautiously at Microsoft, the significance of customer conversations at Amazon, and the application of Shopify's frameworks in shaping product strategies.
Read on to discover more of what we chatted about in the recent episode of the Product Thinking podcast.
You’ll hear us talk about:
15:40 - Balancing Stakeholder Needs in Educational Products
During this conversation, Amir addresses the challenge of balancing the diverse needs of multiple stakeholders in educational technology. When the end-users (in this case, students, teachers, parents and administrators) all have different and distinct needs, it’s crucial to have some aligning North Star that interconnects them all, and to have real clarity on what that is. Amir references a lesson he learned in his time at Amazon during a meeting with Jeff Bezos, where the focus was redirected to the primary customer – the children using a particular product. In the context of Amir’s work with Paper, the principle was applied by recognizing that while students are the central focus, their success is influenced by teachers, and in turn, administrators.
17:41 - Integrating Academic Learning into Product Development
At Paper, Amir integrated the Academic Learning Team into the product organization, recognizing that the core product was not just the software, but the tutoring experience facilitated by a large team of tutors. Initially focused on building gamified features to engage students, Amir realized that true engagement and outcomes required an academic foundation. By incorporating academic principles into the product strategy, he ensured that the features built would not only engage students, but also drive educational outcomes.
30:20 - Balancing AI Automation with Human Interaction
While discussing the subject of balancing AI automation and human interaction, Amir outlines a framework for determining when to use AI and when to involve human tutors. For straightforward, one-way interactions, such as providing hints during problem-solving, AI can be used effectively. However, for the more complex, two-way conversations that are a huge part of learning and growth, human interaction and tutoring remain vital. Amir shares an example where students using Paper’s chat service often reveal personal issues, such as mental health concerns or unsafe home environments. In these cases, tutors are trained to identify and escalate these issues, providing crucial support beyond academic assistance.
Intro - 00:00:01: Creating great products isn't just about product managers and their day-to-day interactions with developers. It's about how an organization supports products as a whole. The systems, the processes, and cultures in place that help companies deliver value to their customers. With the help of some boundary-pushing guests and inspiration from your most pressing product questions, we'll dive into this system from every angle and help you think like a great product leader. This is the Product Thinking Podcast. Here's your host, Melissa Perri.
Melissa - 00:00:37: Hello, and welcome to another episode of the Product Thinking Podcast. Today, we're talking about building complex products for multiple personas and the power of AI in education. We're joined by Amir Kabbara, who's the Chief Product Officer at Paper, the K-12 learning platform aiming to increase student confidence through their educational journeys. Amir joined Paper towards the end of 2022 after almost two years at Shopify, eight years at Amazon, and five years at Microsoft. I'm excited to dive into everything about the education space with Amir. But before we do that, it's time for Dear Melissa. So this is a segment where you can ask me any of your burning product management questions. Go to dearmelissa.com and let me know what you're thinking. Here's today's question.
Dear Melissa. I am looking for a good framework to validate jobs to be done. We do a good job of using interviews to draft our jobs to be done, and then do follow-up interviews to refine them. But I'm looking for a quantifiable way to compare the results of these interviews.
Okay, so when I read this, how I'm interpreting it is not that you're looking to validate the user problem here, but looking to validate that it's actually an opportunity for the business. In a lot of companies, we're using jobs to be done as the primary focus for how we build our products. So we're out there, we're doing the user interviews, we're discovering the problems, and we're saying that our main jobs to be done for this feature or the software are X, Y, and Z. If you're doing user interviews and you're coming back and finding out that, yes, this job to be done is a problem (I validated it through a bunch of different user interviews by talking to them, by seeing it's a problem for multiple people, by interviewing a lot of these different personas), that's telling me that you've done it in a qualitative way and that you know it's a problem. But the second part of that is making sure that it's something that you should be solving with your business, that it's an actual problem from a quantifiable, objective standpoint. And I believe that's what you're asking for. So that's how I'm going to answer this. When we go out and look at the jobs to be done and identify them in a persona, we then have to pull them back and say, is this worth actually doing from a business perspective? And what you're doing there is looking for data that tells you that this is going to be a big enough problem to solve to reach your business metrics. Usually those business metrics are in the form of closing sales or retention or NPS or anything that really drives your business forward from that overall metric standpoint. So what I would do in your situation is look at how many people that job to be done will serve. So how many of your customers will it serve? What does that segment actually look like?
And by solving that job to be done, what is it going to drive? Is it going to drive NPS metrics, right? Is it a happiness metric? Is it a workflow efficiency metric? Is it an acquisition metric? Because you have a whole pipeline of personas that are coming in from sales, but you haven't been able to close them. What you need to do is connect that job to be done back into the driver metrics, and you have to identify what your driver metrics are. So ask yourself: by solving this job to be done, what do I think will happen for the user? Will it make them happier? Will it change how they work? And then what does that do for our business? If they can do X, Y, and Z, what do we get? Then you want to go and pull data to see if you can justify that. So I just gave you an example with sales. That's a pretty common one. If you have a job to be done that really affects a certain type of persona or a certain type of customer, or maybe you have one customer, one type of persona, it doesn't matter. And that's a gap in your product and you haven't been able to close sales for it. I'd expect an acquisition increase. You want to say, how many more sales could we close in the next year? Or how many more people can we attract with this, if we solve this job to be done, if we actually concentrate here? You can go get data out of sales in your pipeline to justify that. If it's a retention issue, you can do the same thing. How many people are churning? How many people's contracts are up for renewal, for example? And you might be able to model it out and say, if we solve this problem, we expect to retain X percentage of users. This is how many are going to churn. So this is the prospective uptick in our metrics that we want to drive here. That's how you pull in the quantifiable piece.
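To make that churn math concrete, here is a minimal sketch in Python of the kind of back-of-the-envelope retention model Melissa describes. All the numbers are hypothetical, invented for the example; they only illustrate how a job to be done gets tied back to a driver metric.

```python
# Hypothetical back-of-the-envelope retention model. Every figure below is
# illustrative, not from the episode.
renewals_due = 400           # customers up for renewal this year
current_churn_rate = 0.18    # share we expect to lose if we do nothing
expected_churn_rate = 0.12   # projected churn if we solve this job to be done
avg_contract_value = 25_000  # annual value of one retained customer

retained_today = renewals_due * (1 - current_churn_rate)
retained_after = renewals_due * (1 - expected_churn_rate)
incremental_customers = retained_after - retained_today
incremental_revenue = incremental_customers * avg_contract_value

print(f"Extra customers retained: {incremental_customers:.0f}")    # 24
print(f"Projected retention uplift: ${incremental_revenue:,.0f}")  # $600,000
```

The output of a model like this is the quantifiable piece: a projected uplift you can put next to the qualitative interview evidence when you make the case to the business.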
When it comes to jobs to be done, you want to go out and make sure that it's validated from a perspective of how many users actually have this problem, with as much of a qualitative perspective on it as you can get. You want to go out, you want to do your interviews, you want to confirm it. But then, you've got to go say, is it worth it to do this? Does it serve a few people? Does it serve a lot of people? That's how you bring that business side and that business case back into the jobs to be done framework. And that's going to help you actually put it forward to the rest of the business and say, hey, here's our jobs to be done. This is what we should double down on. This is how we concentrate. A lot of this will emerge out of experimentation as well. As you figure out how to solve those jobs to be done, you'll be putting it out there. You'll be getting feedback that way. But I would also look at what those driver metrics are. And that will help make sure it's the right thing to solve. So I hope that helps. And again, if you have any questions for me, go to dearmelissa.com. Let me know what they are. I'll answer them on an upcoming episode. Now, let's talk to Amir. Are you eager to dive into the world of angel investing? I was too, but I wasn't sure how to get started. I knew I could evaluate early stage companies from a product standpoint, but I didn't know much about the financial side. This is why I joined Hustle Fund's Angel Squad. They don't just bring you opportunities to invest in early stage companies, they provide an entire education on how professional investors think about which companies to fund. Product leaders make fantastic angel investors. And if you're interested in joining me at Angel Squad, you can learn more at hustlefund.vc/mp. Find the link in our show notes. Welcome to the podcast, Amir. It's great to have you.
Amir - 00:06:22: Hey, thank you. Excited to be here.
Melissa - 00:06:24: So you've had a really interesting and illustrious product career leading teams at Microsoft and then Amazon and then Shopify. And now you're at Paper. Can you tell us a little bit about what it was like to adapt your processes for strategy and leading these teams at large corporations and how you did that at a high growth scale?
Amir - 00:06:43: So in each of these large corporations, Microsoft, Amazon, Shopify, I came out of them learning a specific thing. If I think about Microsoft, the biggest thing that I learned is how over-relying on data while you're making decisions can mislead you. I remember back then I was working on MSN Messenger. I don't know if any of you used MSN Messenger or remember it, or Skype back in the day. It was actually my dream back in high school. I was on Messenger the entire time. It was my dream to go work on MSN Messenger. I went there and ended up working at Microsoft on MSN Messenger. And I ended up being the person who shut it down as well. So that's a full cycle. But the big learning from it is that all of the data told us everyone's using Messenger on the web all over the world and so on, and to ignore mobile. And yeah, when we did that, the entire thing kind of flopped. So I learned, when you look at data, to look very cautiously. And then with Amazon, the flip side: really the importance of customer conversations and going really deep. And with Shopify, using principles and frameworks to think and decide. And these are the three things I really took with me when I joined Paper, which is a smaller Series C, Series D company. A couple hundred employees versus tens of thousands in the larger companies. Those three themes I kept with me because they're really important to be able to go deep. In Paper's case, it's an education company, so to go deep with students, teachers, and institutions to really understand their needs. So that's one thing. The other thing I learned at Shopify was from one of the leaders, who leads the growth team at Shopify. His name's Archie. One of his methods for leading the team is to keep them updated. He does weekly newsletters for his team.
And they really read his newsletters and learned a ton from them. So that was one of the things I took to Paper with me. So as soon as I joined, I started this weekly newsletter thing to go through and share just what's top of my mind and what I'm thinking of. And in a larger company like Shopify or others, that becomes really, really important because there's a billion things happening. But what I found is it's also really valuable in a smaller company with a lot fewer people. Yes, you meet them, you have the one-on-ones and so on, but it's always useful to get into your leader's mind. So I really kept that going. And on the other side, the way you're making decisions. What I felt was really different at a Paper versus an Amazon is that at Paper, sales had a conversation with a customer and then came to you. Like, hey, I just had a conversation with a customer. Let's go build these five, six features. And then typically the product team went and built those five, six features because sales told them to. So it was really a sales-led organization. One of the big challenges was moving it from sales-led closer to a product-led organization and really leveraging all of the frameworks. And putting a real system in place to help people prioritize, to think through: when sales comes with feedback, how you use that, how you think of that; when you're talking to a customer, how you use that feedback, how you think of that; putting it all together.
Melissa - 00:09:56: That's a really big topic for a lot of people, and especially in a product like Paper or a lot of educational products, right? You've got these end users who are the students, and obviously we want to make sure that we help with student outcomes. We've got the teachers that we have to care about. We've got parents involved in a lot of places. But then we also have administrators who are usually the ones buying the product, right? And especially at scale or in enterprise, it's going to be people in bigger institutions, not necessarily just like a local school. It could be a whole school district. I've gotten a lot of questions from people about how do you balance the needs of administrators versus students on the end of it? I actually got this question written in a while ago that asked about, hey, like I work in this education institution and I'm a product manager on a team. I am trying to build everything that goes towards student outcomes, but the sales team is yelling at me to build these features. And our founders are pushing me to actually build all these features for administrators. But I can't tie that back to student outcomes. It doesn't look like those things directly relate. What's your approach for balancing those? And how do you make sure that you are building what is needed for that entire ecosystem?
Amir - 00:11:06: It's a really good question. And it's pretty complex when you have three, four, five different stakeholders that seem to have different needs. There are really two things that I think about and use. And this is something I learned at Amazon. First, what is your North Star? Really understanding and aligning on your North Star. And then second, this flywheel concept, especially when you have multiple stakeholders: they're all somehow connected. And understanding how they're connected, and how you want them to be connected for the flywheel to spin the right way and fast. Both of these things are really important. And I remember at Amazon, we were going through, and this was actually a meeting with Jeff Bezos. At the time, we were working with developers who were building apps and games, and then consumers who were playing apps and games on tablets, mostly kids or younger folks. And we were all about going through and helping developers monetize, which is an important concept, important topic. How developers monetize is ads. And what annoys consumers is ads. And if you're thinking about younger consumers, ads and younger consumers don't go together, because you don't want to serve them incorrect ads and so on. And we were talking about how awesome developers are and how we're going to help them monetize and use new ad formats that are great and so on. And he just stopped us: what are you guys talking about? Let's go back to the basics. Who are we here to serve? We went from there really focused on the customer, essentially the kids and their parents, and ended up creating a product that charges parents, basically with no ads, that still helps developers monetize.
But I remember from all of that discussion, it's going back to the basics, remembering who your audience is, but still serving other constituents that will help you get there. And that's the same thing I thought about in education. You have the student who is spending time on your product. At Paper, they were at the center of what we're doing, really driving student outcomes, trying to help the students grow. And what we actually learned is student motivation really varies. You have the students that will go use every single tool, will love it, and will use it by themselves very proactively. And these are usually the students that are doing pretty well, that are motivated, usually doing great in school. And you have many others that need this push. So it becomes important to go and build a feature for the teachers, to help the teachers and enable them to enable the students. And then I remember one discussion we had with sales was around an analytics product for institutions. They were going to districts, and the districts said: I have no idea who's using my products. I have a simple dashboard, but I need to spend all of this time going really, really deep to understand it at a teacher level, at a student level. It's very complex, and it's taking me a lot of time. I can't prove the outcomes are there. So sales came to us and they're like, hey, this is what we need. This is what the district needs in order for them to renew. And I remember in this specific case, it was at a time where we had a really big back-to-school launch coming up, so we were really focused on a lot of student and teacher features. And this came out of nowhere as a really high priority, pushed from the CEO down. We went back to the basics and ended up having a data person just create a dashboard for that specific district.
And then the salesperson can go and change the district name and they get a dashboard and they can send them a report, versus going and building the entire thing in the product. So in this case, we helped unblock the salesperson for the deal, because it was a pretty big, important deal. But at the same time, I truly believe keeping the team focused, especially if they're in the midst of a launch or whatnot, is really important. So you balance it. But it was valuable to go and create that report, because if the district doesn't renew, guess what? There are no student outcomes happening from your company. So it's important to balance them: really understand your flywheel, understand the purpose behind the ask, debate it, challenge it, and then step back, go back to your principles, and decide and go from there.
Melissa - 00:15:40: If somebody wanted to try to model out a flywheel for their stakeholders and all the people that are involved in their product decisions or using the product, where do you start? How do you actually do that?
Amir - 00:15:51: First, pick your core user that you're trying to revolve around. So in Paper, it was students. In Amazon, it was consumers. In Shopify, it was the merchants. And it's interesting when you look, especially Amazon and Shopify, when you look and work in both companies, tons of decisions change because of who's at the center of this flywheel. So you kind of have this person. So you kind of have the merchant. And then you ask, what makes the merchant successful? And who are the other stakeholders that make them successful? And then you're like, okay, a merchant needs customers. A merchant needs maybe other tools and features inside of their websites. And you have a bunch of app developers on Shopify that are building those. You kind of work backwards from that. And you always ask this question, who and what makes this core audience successful? And you pick the next one. It's like, okay, what's going to make this next stakeholder successful? And then you kind of go through and try to connect them to each other.
Melissa - 00:16:50: So if it's an education, right, where like, what makes the student successful? It's like, one of them could be teachers, right? I have a great teacher and teachers empowered to make great decisions. What makes a teacher successful? It's, you know, administrators, where they work, who they got trained by, their peers probably around them, the tools that they have. And then you would just ladder it up that way. Is that the way that I'm thinking of it correctly?
Amir - 00:17:11: Correct, yeah. So you'd pick who would make students most successful. It's like teachers because they're with them the entire time. And then for teachers, what will make them successful? It's like, okay, they need to have the right tools. They need to get enabled. They need to get paid. They need to be motivated. Who's doing that? It's the administrator. Then if you look at the administrator, how are they judged? They're looked at with like school scores, the growth of their students and so on. And that's the outcomes. And that's going back to the students. So that kind of completes your flywheel.
Melissa - 00:17:38: That's a really nice way of thinking of strategy that way. I like the flywheel concept. When you're looking at this whole landscape of users as well, at Paper, you oversee a lot of different aspects of getting that product right. It's not just product management. It's also user experience. It's data science. And you're overseeing the academic learning piece as well. I feel like in some organizations, I've seen them separate out these different areas where academic learning might not fall under product management. It will fall under content or data science will fall under technology. How do you think about combining all of those different facets to make a great product? And how do you oversee all those entwining strategies?
Amir - 00:18:18: Yeah, so when I actually started at Paper, I just led the product and UX teams. And we started building features that were, you know, we'd go talk to students, and students want to play games, so we're like, let's basically build game-like features. That made sense because it's fun and so on. And then what I realized, not coming from education (for someone who has been there, it's probably obvious), is that building a game or a gamified experience helps you drive engagement, but doesn't necessarily help you drive outcomes. And that's where the academic learning team came in. It was a couple of people. And one of the things that Paper has is a pretty large, 2,000-to-3,000-person human tutor organization. And the academic learning team was in that organization, helping the tutors grow and learn to be able to serve the students best. The thing I realized is the product we had is actually not the software, not the game-like software we have. It's actually the tutors and the experience that the students have with the tutors. And at the center of that, what drives the tutors and the way they teach and help the students is this academic learning team.
So it was really important, I thought, for the product organization to own the product, and the product in this case includes this academic learning team, because it's what's driving the strategy for everything downstream. And with that, we actually brought that team in, grew it, and formed a proper academic learning team with a really strong leader who was a teacher, but also went deep to really understand how you teach students, did research on that topic, and so on; he's probably one of the best in the field. And again, it's not my background, not my experience. So really leveraging his skill set to help think through the strategy, starting with: from an academic perspective, how should we think about things? And then, separately, there are these other concepts in product that you take advantage of and so on, but really starting with this academic-based principle. So that's why I thought it was really important to form this one team that had academic learning inside it.
Melissa - 00:20:42: That's a really great way to position the product there. With the academic learning team, are you using them as well to get any feedback loops in from the tutors or from the students? How does the software product management aspect of the business that you oversee interact with those academic learning people?
Amir - 00:21:02: We started with an academic learning team and an impact team, which basically measured the impact of how students are doing on student outcomes. And then there's a UX research team. And we ended up combining all three of these into one team. So typically, a lot of the deep research would happen with the UX research team. But by combining them, you make sure, like you said, that that feedback loop happens. One of the things I truly believe in is the value of a product manager is understanding the customer. So talking and having conversations with customers doesn't really stop with this academic learning and research team. For every product manager, it's important for them to go through and spend most of their time actually talking to customers. In this space, especially when you have kids and teenagers and elementary students and so on, it becomes tougher to talk to customers. And so one of the things we do is, you know, there's a public school everywhere. And we're lucky to have customers in almost every city and state. So the product managers just go and watch the students in the classroom. And that's a very different thing than trying to do surveys or doing something else, because you're able to go deep and understand how they're feeling and how they're thinking when they're using your product. It's not the only thing they're thinking about.
They have a billion other things happening. They're going in between classes and they're seeing a substitute teacher who has no idea what your product is. And there's just a lot of stuff that happens in the classroom. So I found that really valuable. Again, expensive because you have to travel there, but totally worth it. The other thing is there are a couple of tools; I'm blanking on the names, but one of them is UserTesting. Basically, they go younger as well, and they get parent consent. So we've been able to also do that, where every single week we have an ongoing thing with two students where a product manager can come in and just chat with the students about whatever they're working on. So we try to keep that loop. So in addition to being there in person, every week you can go in and chat with students. And then we expanded this program to teachers as well, where every single week you can go and chat with teachers and show them what you're working on, brainstorm, and so on. And that became really valuable and kind of a differentiator for us, because it's hard to build. It's hard to get parent consent for younger students and be able to do it that way.
Melissa - 00:23:29: Did you know I have a course for product managers that you could take? It's called Product Institute. Over the past seven years, I've been working with individuals, teams, and companies to upscale their product chops through my fully online school. We have an ever-growing list of courses to help you work through your current product dilemma. Visit productinstitute.com and learn to think like a great product manager. Use code THINKING to save $200 at checkout on our premier course, Product Management Foundations. That's really neat. Yeah, I find a lot of people have trouble with that recruiting aspect, especially something where you can keep it steady for every two weeks. Is it like a firm who does that? Or do you have somebody on your user research team who's really doing that?
Amir - 00:24:14: Yeah, one of the folks on the user research team started that program.
Melissa - 00:24:18: So they do all the recruiting and then they make sure that there's going to be somebody every two weeks and then you can plan what the research will be for that week.
Amir - 00:24:24: Correct.
Melissa - 00:24:25: Meetup used to do something like that back in the day too, I remember. They always had somebody scheduled to come in on Fridays and any team could send them, hey, this is what we want to learn this week. This is what we're trying to do. And the user researcher would go through and make the whole interview guide or whatever it was going to do to do that. And people could watch in and do you have the product managers go themselves or does a user researcher lead those conversations?
Amir - 00:24:47: The product manager does. That's what I really push on and ask for. And it was a tough lesson that I learned back at Amazon. Anecdotes are really important. So there's a difference between data, anecdotes, and real conversations. And I remember going into one of the meetings at Amazon with our Senior Vice President then. Her name is Christine Beauchamp. And her background is in fashion. She was, I think, Victoria's Secret's CEO, and worked at Ralph Lauren. It's a very different world than Amazon, which is what I'm used to. And the emotions in a customer conversation are different in that world versus at Amazon, which is a bit more, I guess, transactional for consumers. I went into a meeting about going and building a bunch of features to overhaul the product page on Amazon to add more immersive experiences, videos, and a bunch of things, with all of this data and why it's important and blah blah blah. So I'd prepared this PR, like a press release and FAQ, a long document, takes a lot of time. And her question was, did you chat with customers? I'm like, yeah, our UX team went and chatted with customers. Here are a few quotes. And then she picked one of the quotes. And she's like, how was this customer feeling when they told you this? And I'm like, I don't know. Here's the quote that the UX research team gave me. And long story short, she's like, go talk to a customer. And having this customer conversation, with them saying the same thing, it turns out that what in a quote looks like they love this product and want you to solve the problem, they'd actually mentioned for about a minute of the conversation. And the thing that really, really, really frustrated them was buying things that are white. Because when you're buying a shirt that's white, you actually don't know what shade of white it is.
And every time you get it, you're like, oh, it's not matching my pants, which are also white. And that was the thing that frustrated them the most. And that's the change they wanted in the product page: tell me what white this is. So I learned from the conversation that the emotions that come out of a sentence are different than what you see on paper and different than a data point that you see. So yeah, we ended up going and building this really complex AI feature that figures out the shade of white, makes it very clear, and tells the user: you bought these white pants, and these white pants are the same as this other white shirt, the same shade, and so on. It was kind of a fun, complex problem, but it solved a real issue that consumers had. So that's why now my mindset is: go talk to a customer yourself. The emotions you'll get out of it are very different than what you'll see in a quote on paper. So definitely UX designers and product managers. We've been trying to get engineers closer to the customer, too. We're still working on it; we're trying to figure out the best way to do it. As a side note, one of the things I'm working on and building now using AI is a customer-centric product assistant. There are a billion conversations that sales actually has, that product managers have, that UX designers and UX researchers have, and so on.
And usually you have the conversation, then a few weeks later you forget about it. And there are tons of these conversations across an organization. And they're usually hidden, especially if a B2B team's using Gong; Gong's really expensive, so not everyone has access. So basically it's a tool that goes through and picks up all of these customer conversations, gets them, and synthesizes them, so anyone in the organization is able to easily tap into a customer conversation. So imagine an engineering lead who's writing the architecture document for how they're going to build a feature. As they're writing that, they'll get customer conversations surfaced that are related to the feature they're doing, including the video snippet of the customer in a sales call mentioning the thing that's annoying them. So they're able to see it and also feel it, feel the pain of the customer, as they're doing that. And that's one of the tools I'm building and excited about to solve this problem, because, we talk about sales-led and product-led, but I truly believe we need to be customer-led and understand the customers really well, what they're saying and what they actually mean. And I feel like Gen AI can really help with that today.
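Amir doesn't describe the internals of that assistant, but the core retrieval step he's gesturing at can be sketched with off-the-shelf embeddings. Here is a minimal sketch using the sentence-transformers library; the model name, the snippets, and the draft text are all illustrative assumptions, not anything from Paper's actual system.

```python
# Hedged sketch: embed stored conversation snippets, then surface the ones
# most similar to the document a teammate is drafting. All data here is
# made up; only the retrieval pattern is the point.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

snippets = [
    "Sales call: district admin says the dashboard doesn't show teacher-level usage.",
    "UX interview: student wants hints while working through math problems.",
    "Support chat: teacher asks how to export a usage report for the principal.",
]
snippet_vecs = model.encode(snippets, convert_to_tensor=True)

draft = "Architecture doc: add per-teacher analytics to the district dashboard."
draft_vec = model.encode(draft, convert_to_tensor=True)

# Rank snippets by cosine similarity to the draft; the top hits are the
# conversations you would surface next to the document, ideally linked
# back to the original call recording or chat log.
scores = util.cos_sim(draft_vec, snippet_vecs)[0]
for score, text in sorted(zip(scores.tolist(), snippets), reverse=True):
    print(f"{score:.2f}  {text}")
```

A production version would add chunking, a vector store, and permissions, but the "surface the right conversation while you're writing" behavior Amir describes rests on a similarity lookup like this one.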
Melissa - 00:29:12: That's really cool. I've been seeing a lot of products come out like that all over and everybody's doing it in a slightly different way. I like what you're saying because it's bringing it back to the team member in the moment of what they're writing. I've seen it where some of them will aggregate all these conversations and then surface insights at a higher level. It won't be super specific to what somebody is working on, but you could query and ask it questions and it would find what conversation and what customer is doing that. I know Dragonboat is working on trying to get things out of Gong in Salesforce and other places. And they plug it back into product strategy to help you show, hey, these problems that we're solving over here are enforced by these customers. Zelta AI is doing it with sales conversations and then they're tying it back to pipelines for opportunities as well. I think this space is really interesting because now that we have these capabilities with generative AI and the LLMs, it gives us so much more power to sift through all of these conversations and you tie that back to sentiment analysis, which is what you started getting me on. Now we can actually hear tones or it's not just generic insights. You can actually see what people feel about that. So I think that's really neat.
Amir - 00:30:17: Exactly. I'm super excited about all that's to come there.
Melissa - 00:30:21: AI is a really big topic in education too, especially with the ability for people to go on ChatGPT now and ask questions or get help or break down how to solve problems when before you had to go through a textbook or talk to a teacher, it gets a lot more concrete. How are you thinking about AI in the education space and what kinds of trends are you seeing out there?
Amir - 00:30:44: One of the core products of Paper is that a student comes to Paper and chats with a tutor. Or they can submit their essay and get feedback. And Paper has 10 years' worth of data of conversations between tutors and students, chatting and learning and getting feedback on essays and so on. So we're in a really unique spot, because through all of these years, we've had a rubric for what's a good tutoring session and what's a bad tutoring session, and where are students learning most? And that's how the tutors are evaluated. They're evaluated based on this rubric that takes into account the actual topic and the accuracy of the tutor, but also things like: are they encouraging? Are they positive? Are they helping the student? Are they just giving the answer, or are they teaching them? There are seven, eight dimensions to this rubric. So as we thought about Gen AI, one of the things we really wanted to make sure of is: is the bar still the same? So we actually used that same rubric. And we went through and tested LLM responses to chat and to essay feedback based on the same rubric. And we realized how poorly it performs and that we can't just put it in front of students. So what we thought about instead is using it as a tool to help tutors be more efficient, to better and better serve the students. So the first thing I'd say about Gen AI in education is: pick your bar and maintain it. So in our case, it was the same bar that tutors had. We wanted to continue to give that to students and wanted to continue that same quality. It's pretty cool because now we've built an automated evaluator on every single session, where we're able to see what the areas are that tutors are really best at and what the areas are where AI is actually doing a good job.
And then in the areas where it's doing a good job, it's like, okay, great. You can let the tutor know that this is an area where the feedback is pretty good, so they can take a quick look and confirm. So I'll give you an example. Students submit an essay. AI takes the first stab at reviewing that essay and providing comments. And then the tutors look at it and they either approve, edit, or reject the comments. So the areas that we know AI is getting right every single time, because tutors are approving it every single time, we can give to students directly and give them feedback pretty quickly instead of waiting for the tutors. And there are other areas, take, for example, high schoolers submitting a math lab report, that AI is not great at. There you're able to note: okay, in this case, make sure the tutor goes through and reviews it in detail. But all of that feedback from tutors we're able to use to improve AI over time. So just take a balanced approach and make sure you know what your North Star is and what your bar is. And I think that's really important to keep and maintain as well.
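The approve/edit/reject loop Amir describes implies a simple routing rule. Here is a hedged sketch of that decision logic, where the rubric dimensions, approval rates, and threshold are all hypothetical stand-ins for whatever Paper actually measures:

```python
# Hypothetical per-dimension tutor approval rates for AI-drafted comments.
# Decision rule: auto-send feedback where tutors consistently approve the
# AI's work as-is; hold everything else for tutor review.
approval_rates = {
    "grammar": 0.97,
    "structure": 0.96,
    "encouragement": 0.88,
    "math_reasoning": 0.61,
}
AUTO_SEND_THRESHOLD = 0.95  # illustrative cutoff, not a real figure

def route(dimension: str) -> str:
    rate = approval_rates.get(dimension, 0.0)
    if rate >= AUTO_SEND_THRESHOLD:
        return "send AI feedback to the student directly"
    return "hold for tutor review"

for dim in approval_rates:
    print(f"{dim}: {route(dim)}")
# grammar and structure get auto-sent; the rest wait for a tutor.
```

The nice property of this design is that the threshold is earned, not assumed: a dimension only graduates to auto-send after tutors have approved the AI's drafts consistently, and their edits and rejections keep feeding the improvement loop.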
Melissa - 00:33:53: In this case too, it sounds like you first approached it by asking, hey, can we automate this and remove a human component? It sounded like that might have been the first step, but then you realized that, hey, AI is not quite there yet, or this is not really something we can remove fully. How do you think about the balance between automation when it comes to AI versus that human component, where we bring somebody in and AI can help them and enhance that person, versus what's actually a good thing that can be automated or taken away by AI? How do you gauge when you need humans as well?
Amir - 00:34:29: The framework in my mind is: if a student is coming in and having deep conversations, you want to have a human in the loop. If the student is working on a problem and they're just getting quick support, AI can quickly support them. And the easiest way to see the difference is, let's say a student is working on a math problem. As they're working on it, they're stuck or getting it wrong. You can surface a hint for them: consider this topic, consider this concept, did you learn this concept? And that's a one-way conversation. It's just AI giving you a hint; you can't have a conversation back and forth, versus the full-on conversation back and forth. And why that's important is, at Paper, one of the things we see is there are a lot of students that jump onto chat. They start with a math problem, but then, from the conversation and the way they're speaking, we realize that there's a bigger problem with the student's mental health. In numerous cases, actually, we've had students come to Paper and start with a problem.
And then tell us that something at home was happening that was really disturbing and they're not safe, as an example. And with those safety concerns, at Paper the tutor recognizes them and there are escalation paths. And in multiple cases, we've actually helped students, really saving their lives, whether from a conflict happening at home or a student who was about to commit suicide, by bringing in help. So you have a responsibility when you're in this sector. And you need to take it seriously, especially in these two-way conversations where the student is looking to talk to someone. So it becomes an opportunity not only to help them academically, but to keep an eye out for other things. With these cases, I'd love for AI to help students all the way by itself. But it's just really important to have the right escalation paths and the right bar, because it goes beyond helping the students learn math concepts into other areas that are important to keep in mind.
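Amir's one-way versus two-way framework amounts to a routing decision with a safety override. The sketch below is a deliberately naive illustration in Python: the keyword check is a toy stand-in (real safety detection needs trained classifiers and trained humans, exactly as Amir stresses), and every name in it is an assumption.

```python
# Toy routing sketch for the one-way vs. two-way framework. The keyword
# list is a naive placeholder; real safety detection needs far more care.
SAFETY_SIGNALS = {"not safe", "scared", "hurt myself", "alone at home"}

def route_interaction(message: str, expects_dialogue: bool) -> str:
    text = message.lower()
    # Safety override: anything alarming goes to a human, no matter what.
    if any(signal in text for signal in SAFETY_SIGNALS):
        return "escalate to a trained tutor immediately"
    if not expects_dialogue:
        return "AI hint"           # one-way: surface a concept or hint
    return "human tutor chat"      # two-way: a conversation needs a person

print(route_interaction("Which formula applies here?", expects_dialogue=False))
print(route_interaction("Can we talk? I'm not safe at home.", expects_dialogue=True))
```

The structural point survives the toy implementation: the safety check sits above the automation decision, so no efficiency goal can route a vulnerable student away from a human.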
Melissa - 00:36:37: I think it's so important that you bring that up and that we do talk about some of these ethical concerns. Because I think if somebody got into the field that you're in today, we'd really think just about student outcomes when it comes to an academic perspective. But you bring up this whole nuance that student outcomes aren't just due to the way that you learn or the way that somebody has taught you in the classroom. It's got to do with your whole system and how teachers and tutors can help intervene in that and really figure out what is the bigger problem here. Is it that they can't learn? They don't understand the material? Or is something bigger that's going on? So I love that you come back to that human component there to help make sure that there's a nuance, right? You can actually see the nuances between those things, which I think is really important. How do you think through all of these ethical concerns when it comes to AI, too, and especially in education? Like here, obviously, you need to be able to understand the student's perspective and their background, where they're coming from. How do you, when you're evaluating when to use AI or when to do something else, say, hey, there might be ethics involved here, or this might be something that we actually have to take a step back on and actually reevaluate? What do you do to encourage your teams to think about that?
Amir - 00:37:49: I think it really starts at a principles level, making sure that teams understand the principles that are used to build the product. So we have a chat experience, but there's also a video experience, where tutors can get on a video with students and chat with them. A lot of the same safety and ethical conversations happen there, and there are all of the checks that tutors actually need to abide by. There's a mix of robust systems, auditing, and checks that we have in place. So we're a bit lucky, because before AI, we had these things in place for the product experience we've had. So every product that launches and gets used goes through that same set of checks and, again, the same bar that we're evaluating it against. So as we think about the team, they're trained to know that these things exist and are able to use them.
Melissa - 00:38:44: Do you do anything from a goal perspective as a leader, too? I've seen a lot of situations where leaders will put out these business goals or whatever outcomes they want to hit, right? And they're either extremely financially motivated in that area or they're looking at maybe one goal in a system, right? Like increase engagement, great example. And some more junior product managers, I think people who've been in the industry for a while know not to do this, but more junior product managers or people without the experience, they might go after that at all costs and not really consider the system around it. Right? Like, what could happen? What bad thing could happen to a student, or what might not be in their best interest? What do you do as a leader to help enforce the best outcomes for the students as a whole and not get people so laser-focused on just one thing?
Amir - 00:39:29: It's a tricky question, because all I speak about is focusing on one thing, and the importance of focusing on one thing and one goal and so on, and then using the principles to help guide how you build and how you get to that goal. But at Shopify, Kaz, who's Shopify's COO and who I worked for, one of the things that he used, and that I learned from him, is a counter metric. So you basically have your North Star, and then associated with your North Star, you have a metric that you don't want to get to. It's like: this cannot happen. So you need to reach this goal without this happening. You basically say, okay, you need to drive, let's say, engagement. If that's your North Star, you need to drive engagement. And you need to do it in a way where the outcome doesn't go below X. So in our case, this essay review product that we have, actually, this was the chat product: one of the things tutors are able to do is chat with two or three students at the same time. So we have efficiency metrics as well, right? The efficiency metric tells you, how much are you spending to serve students? And we have outcome metrics and so on. But for the team that's really focused on driving efficiency, they need to figure out how to create these AI tools to drive efficiency. The counter metric for them is that they cannot have tutors just go and chat with 10 students, because those students are going to have a terrible experience. You can barely keep up with three students. The sweet spot is two students, or three if they're working on the same topic. So you have this counter metric. That's what we have, and that's what I learned at Shopify. Typically, the principles should do it. But the counter metric is an extra check that you're able to actually track and look at more quantitatively, versus principles that people might forget about and so on.
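As a concrete illustration, the counter-metric idea can be expressed as a guardrail check: progress on the North Star only counts while the counter metric holds. A minimal sketch, where the metric names and thresholds are invented for the example and are not Paper's real numbers:

```python
# North Star: tutor efficiency (students served per tutor at once).
# Counter metric: session quality must not fall below a floor.
# Both thresholds below are illustrative assumptions.
MAX_CONCURRENT_STUDENTS = 3.0   # tutors can barely keep up past this
MIN_SESSION_QUALITY = 4.2       # e.g., floor on a 1-5 rubric score

def efficiency_win(students_per_tutor: float, session_quality: float) -> bool:
    """An efficiency gain only counts if the guardrails still hold."""
    return (students_per_tutor <= MAX_CONCURRENT_STUDENTS
            and session_quality >= MIN_SESSION_QUALITY)

print(efficiency_win(students_per_tutor=2.0, session_quality=4.6))   # True
print(efficiency_win(students_per_tutor=10.0, session_quality=3.1))  # False
```

The advantage over principles alone, as Amir notes, is that a counter metric is something a team can track on a dashboard rather than something they have to remember.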
Melissa - 00:41:20: When you're looking at AI in the landscape of what's going to happen in education, what are you keeping your pulse on? What do you think is going to happen over the next couple of years?
Amir - 00:41:27: I'm excited. I think a lot of the big tech companies are loving education, or at least there's a lot of good and bad press. The easy good press that big tech organizations can get is to talk about AI and education and how it's helping and so on. So they're getting into it. For a lot of the basic scenarios, AI is going to serve students for free. So I'm excited about these basic scenarios. And that's going to get more and more advanced with time. And I think a lot of edtech companies are going to move to higher-value products. So I'm also excited about edtech companies getting pushed. We at Paper have been doing this chat tutoring for a while, and we've resisted video-based tutoring a bit. And now we're like, actually, we want to go video-based; it takes you up a level, it differentiates you. And what we realized after we tested is that the outcomes for students are much higher when you're having these sessions about a certain topic and students are getting tutored. So a lot of companies are going to get pushed upwards to create things that are higher quality, that AI can't just replicate. Otherwise, they're all going to disappear. As for AI taking over in edtech, I do think it will take time, both for the technology to get there, and it's happening really quickly, and for districts and the education sector to adopt it. When I look at a lot of districts and talk to them about AI, there's a lot of openness to try, but resistance to actually trying. There's openness in the conversation: yeah, let's test this and that. And then when it comes to it, there's just a lot of resistance. And I think it will take time, both behaviorally and for the technology to get there. But if you're building something in edtech, just assume AI is going to take over a lot of your product. So push yourself to think of things that are differentiated and higher quality, and then how you can use AI with your data to also differentiate.
Melissa - 00:43:25: Amazing. I'm excited to see that unfold over time too. Thank you so much, Amir, for being here on the podcast. If people want to learn more about you, where can they go?
Amir - 00:43:33: LinkedIn, it's Amir Kabbara. Just search for me, I guess. And then you can find me @AmirKabbara on Twitter. Email me, amirkabbara@gmail. Not many people are named Amir Kabbara, so just type it in. And on your social networks, I'll pop up.
Melissa - 00:43:47: Okay. It'll be easy to find you. And if people want to learn about Paper, where can they go?
Amir - 00:43:50: So paper.co.
Melissa - 00:43:51: All right, and we'll put all those links in our show notes at productthinkingpodcast.com. Thank you again for listening to the Product Thinking Podcast. We'll be back next Wednesday with another amazing guest. And in the meantime, if you have any product management-related questions for me, please go to dearmelissa.com and let me know what they are. In the meantime, also like and subscribe so that you never miss a future episode. We'll see you next time.