Scaling Your Product Team with a Customer-Centric Approach
How do you know when your company is no longer in ‘startup’ mode? When does your team need product management more than project management?
Craig Sturgis, VP of Product at SmarterHQ, was a part of the company’s founding team, before he left, did his own thing, and came back to take the company from startup to scale up. It was at this inflection point where Craig realized one impactful thing – in order to grow the company and improve the product, the focus needed to shift.
In this episode, Christian and Anna discuss how each team in the organization, from customer success to development, can work together to best solve customers' pain points. Craig also shares tactics on how to apply learnings cross-departmentally. Plus, you'll learn what modern product management is and how to implement it in your organization.
Craig Sturgis: We know we're not 100% right, in fact sometimes we're really wrong, but we want to be less wrong more quickly.
Anna Eaglin: Better Product, the only show that takes a behind the scenes look into how digital products are created.
Christian Beck: The business is built around them, and how you too can innovate better product. I'm Christian.
Anna Eaglin: I'm Anna.
Christian Beck: Welcome to today's show. We're all familiar with the story of the prodigal son, right?
Anna Eaglin: Where are you going with this?
Christian Beck: Okay, hear me out on this. Craig Sturgis is the VP of product at SmarterHQ. When the company was in its infancy, he was a team member.
Anna Eaglin: Is this the joke? Does the infant equal the son? Because that's lame and a major stretch.
Christian Beck: Hold on Anna, so he was part of the original team at SmarterHQ.
Anna Eaglin: Original is a better word. Or founding.
Christian Beck: He had left to start his own company, that company sold, and he found himself with the opportunity to rejoin the SmarterHQ team.
Craig Sturgis: I was going on a path towards starting my own company again when the opportunity came up from people that I used to work with, I liked working with saying, “Hey, we really need someone who has experience in product management to help us out.”
Anna Eaglin: As Christian shared, Craig was a part of the startup team, and then rejoined to help the company scale. This excited him and gave him a new challenge to take on.
Craig Sturgis: A lot of it was the ability to really develop the process to understand what our customers are trying to solve, how to best solve it, and then apply learnings that I've had elsewhere to really help a development team that was really more technically focused try to learn to be more customer focused.
Christian Beck: Before we get further into Craig's story, let's take a pause to tell our listeners why SmarterHQ exists.
Anna Eaglin: Great idea Christian.
Christian Beck: Oh, thanks Anna, now you like my ideas? SmarterHQ is...
Craig Sturgis: A personalization platform for consumer-facing marketers, so think big brands like Macy's, Bloomingdale's, Sam's Club, hotel chains like Hilton. We're helping marketing teams at those groups personalize their messaging in a way that's most relevant.
Christian Beck: Craig set out to solve specific challenges. Starting out, he had to dig into project management, not product management.
Craig Sturgis: Working with the team that was there to really put one foot in front of the other, and stay focused and aligned on just let's get all of our customers on the same tech platform which is what they had been struggling with for a couple of years.
Anna Eaglin: Once he got some initial obstacles out of the way, clearing of the forest if you will.
Christian Beck: Clever.
Anna Eaglin: He shifted the focus to what he refers to as modern product management. This is the initiative that gets Craig out of bed in the morning, let's hear why.
Craig Sturgis: A lot of what product management as I've experienced it, or at least as I'm familiar with it, and especially in B2B and enterprise focused companies, is very much more project focused. It is very much, take the ... In the Office Space sense, take the requirements from the customers to the engineers and break them down, and more of a ... I don't even want to put it as agile versus waterfall, it's very much more task oriented. It is the business wants this, [inaudible 00:03:09] refers to it as more of an IT mindset versus a product mindset, where it's like, “The business needs this, and we need to undertake these steps and build this machine, these sets of features, so that we can make money.”
Craig Sturgis: Whereas the mindset that I'm more interested in is a more customer focused mindset, it is, we need to understand and meet the needs of our customers in a way that serve the needs of our business, so that involves much more understanding and much more direct connection to your customers, your buyers, your users.
Craig Sturgis: We are an enterprise focused company, so it isn't just about our users. Often times people making the buying decision about our product don't actually use our product at all, so you're having to think about what's going to be valuable to the people who are making the buying decision, who are putting their necks on the line to buy your product and get the return of value on it, and then how do you empower the users of your product to help in that?
Craig Sturgis: It's this whole complex system of people that you have to understand and serve their very different needs in a way that they feel the value both for their day-to-day, and for when they are taking it ... I often talk about what they then take and put on the PowerPoint presentation to their boss being the primary indicator of whether they continue being our customers and we keep serving them, or whether they choose to use something else.
Anna Eaglin: In order to make your product team more modern, again, it's that change from task focus to customer focus, so what are the actual tactical things that you do to start that process?
Craig Sturgis: I work with a lot of really smart, really talented people, but a lot of the time I felt I was speaking a completely different language, or the assumptions about what product management is just didn't match their previous experience; they just weren't familiar with it. So a lot of times it was, well, this team over here talks to the customers and they give us the feedback, and then we go and we take that, and we make sense of it, and we turn it into a sprint backlog.
Craig Sturgis: A lot of times what I would try to do is just try different techniques. One of the ones that I started out of the gate, and have tried to maintain over time, is this idea of adopting a customer, so let's bring in ... We have a client success team, and they are the people who primarily own the relationship with our customers, so let's bring them in and talk about the ongoing relationship with one of our customers. Let's talk about what we talked about on our last status call. Let's talk about what we talked about at our last business review from a strategic perspective: what are they struggling with?
Craig Sturgis: We succeed when our customer succeeds, so what's the next step? What are they doing to continue to grow to continue to see success? How are they using our product to do that? How is our team using our product on behalf of our customer in order to help them do that? It's usually a mix because they are really complex systems, so how can I record a conversation or record an interview with the permission of our customer and play it back for the team or give it to them in a way that they can listen back to it and get some sense of direct connection to that customer?
Craig Sturgis: Every one of us is going to get a different insight from a conversation or an interaction with a customer, so how can we listen to the same conversation together and then discuss it to see, “What did you get from that?” Or, “You're going to bring a different perspective to this.”
Christian Beck: Craig, these are all really great tactics. I'd love to understand how you've applied this specifically on maybe any project or initiative at SmarterHQ.
Craig Sturgis: One of the things that comes to mind is our big focus right now, how do we uncover insights from all the data that we collect, activate and help automate messaging on? We're really good at that, we're helping you do really complex stuff across a bunch of different ways that you message with your customers.
Craig Sturgis: What we continually hear, either indirectly through our account team or directly from customers is like, “What does the data tell you that I can't get access to?” That's a really broad question, so how do we dig into the specifics of that? We've launched this ongoing effort that's ... We're really a cross-functional team that includes members of our client success group, our sales and solutions engineering group, myself, my team, our head of design, as well as some more technical folks and our data science folks.
Craig Sturgis: Let's have specific conversations with customers, let's recruit individual customers that have asked us these questions before, and let's really dig in and have conversations with them on an ongoing cadence that's not just about, “Hey, here's what's on the roadmap, and here's what we're doing.” It's more of a discovery conversation where we're having broad customer interviews that are meant to be recorded and played back for the team so they can draw insights from them, but saying, “Okay, great, you asked me what can you see from the data?” Well, what would you like to see? What are questions that you can't answer by working with your analytics team today, or the team that's building reports for you?
Craig Sturgis: We have all this data that we can choose to bubble up and report on, but really what we're looking to produce ad hoc, using our teams, doing things that don't scale, is a report or insight saying, “Here's what we found, does that look good to you?”
Craig Sturgis: Playing it back, it's almost like a mix of prototyping via spreadsheet and report, and PowerPoint deck, that is trying to get at in this ongoing cadence of, “We can produce this insight, how is that working for you? What's useful about that for you? What else along these lines would you like to see?”
Craig Sturgis: We're using that to drive not only more understanding about how our customers measure success or the types of data points that are valuable to them, which is related but almost totally different than what our product has done traditionally, which is just, we drive messaging, it makes you money, it helps you build better relationships with customers. But it's also this whole other side of product opportunity for us that both reinforces the primary value of, “Hey, not only did I make this much revenue from this messaging campaign, but I'm seeing this level of engagement of people who receive these messages go up.” Or, they're really loving working with this product where we're trying to really do a deep dive and insight into something we've ... It's not our core business, we want to build products that scale up, that we can do this for everybody, but doing it in a way that, in the Paul Graham sense of do things that don't scale, validates, “What's going to be really valuable across cross sections of our customers?”
Craig Sturgis: Building those relationships, having the raw conversations captured and recorded in ways that we can take back to the team, so that all the different perspectives, whether it's people who own the relationship, people who sell the new customers, or people who build the actual product can bring all the different insights and discuss together, “How are we going to make a better solution for our customers? Is that applicable broadly?"
Christian Beck: When you started this project you said you recruited customers that had been asking questions before in this ongoing fashion. Are you focusing industry by industry now or however you segment, and then looking to scale after that? Are you trying to do a cross section at the beginning of a research project like this?
Craig Sturgis: We do focus on specific industry verticals inside consumer ... Retail and hospitality are our two biggest ones, so what we look to do is try to identify five or six people in ways that we can manage that. Five or six customers, and five or six direct contacts that we have good relationships with, that are both willing to spend the time with us without necessarily needing to go overboard to incentivize them, which probably would skew their responses anyway.
Craig Sturgis: In general, we're going to produce as a result of this research something that's valuable to them in their day-to-day life. We're trying to find people both that will work with us, but that are also representative of our customer base and our target market.
Christian Beck: You mentioned a few times that you're doing ongoing research, why not just conduct a study, say over these next few weeks, we're going to conduct a study with this customer, we're going to talk to them, and then we're going to write up a report and it's going to drive our roadmap. What's the rationale behind creating ongoing research with these customers? How do you do that?
Craig Sturgis: One of the challenges that we have is that we don't have a giant sample of people to go do a big single research study against, but also our product development is ongoing. We're developing, we're shipping, we're building, we're learning in an ongoing way, so if we had the ability to continually get feedback as we're changing our product, that's going to be better feedback, better insight, better learning from what our customers are telling us over time.
Craig Sturgis: We're going to be wrong, in general, we work with smart people, they have really good ideas for solutions, but you'll never know until those solutions actually hit the market, and a customer tries to use them to get value. You're going to learn new things all the time, and have new things for them to react to, and that's going to only help you get better if you're continually following up, if you're continually asking for feedback, then asking the same questions, helping them react to things.
Anna Eaglin: You talk to customers, and then you mentioned going back to them. Tell me a little bit more about what that process looks like, and what those followups are.
Craig Sturgis: We have a regular scheduled time. We're trying to meet with them once a month, we don't always stay on that, but we're always working to get back in front of them. We build something, we put it together, and ideally we start with a spreadsheet or mock ups, and then eventually we're putting a real product in front of them.
Craig Sturgis: Even when we're doing just those static artifacts, we want to learn from what their feedback is and come back and show it to them again. In general, we know we're not 100% right, in fact sometimes we're really wrong, but we want to be less wrong more quickly.
Anna Eaglin: You mentioned utilizing both qualitative and quantitative data: you have a hypothesis, now let's confirm it. Tell me a little bit more about that and what part that plays in your research process.
Craig Sturgis: Typically, this comes up when we're choosing to focus on an area of product development for a good long time. In general, I think this is a really good way to approach product management of any product, but it's ... We have this goal for our customer ... This customer has this goal, or we believe they have this problem. We're going to focus on all the different ways we can solve that problem, so let's write it down, let's say, “Here is what our hypothesis is about this. Here are the high level solutions worth thinking about, but here is how we're going to measure success.”
Craig Sturgis: That might be a qualitative measure, it might be a quantitative measure. In an ideal world, everything is quantitative, it's measurable, not everything is though. There are stories upon stories especially in B2B and enterprise context where they are a happy customer, then they are gone. What were the indicators you could have been looking for to understand that?
Craig Sturgis: We define our hypothesis, we work with all the different people on our cross-functional team, from all the different perspectives, to think about, “Okay, what are those problems? How can we firm those up? What are possible solutions that we could tease out and measure, and then find out what's feasible and potentially go build a prototype? Go build a mock up, go actually build it for real and put it in front of customers. What can we iterate on and then see in our analytics: what are they doing? Is it what we expected them to do?”
Craig Sturgis: I do think both of those things are really important, having enough checkpoints and building a pipeline and cadence of customer conversations to go along with when you're looking at your analytics reports. We use a tool called Pendo to help us understand how people are flowing through the product, but that's only one piece of the story because as I mentioned, our customers, the people who make the buying decision are almost never the people who are actually using the software.
Craig Sturgis: Looking at our product analytics is not going to tell us whether the person who makes the final decision and who is our champion inside a really big complex organization is getting what they need out of working with us.
Anna Eaglin: Does the buyer research always have to be qualitative then if they're not in your product?
Craig Sturgis: It mostly has to be. There are ways, and honestly with the initiative that we're talking about, one of our goals is to get our buyers into our product more because we're going to produce something that's going to be more valuable in their day-to-day lives. It's selfish in that we want to find ways to get them to log in to our UI on our platform in ways that we can see quantitatively.
Craig Sturgis: Hopefully that's more scalable because there are only so many of us and scheduling busy people is hard, that's something that helps us get another window into the life of our buyer versus our day-to-day users.
Anna Eaglin: You said you wanted your product team, your developers to have a direct contact with customers, why is that important?
Craig Sturgis: Each of us has different perspectives, different insights, we have different tools, we all know something, come at it with a unique perspective. When you have a conversation together, when you play back an interview, ideally when you are in the room together with a customer or observing them trying to solve a problem that your product is meant to solve, the developer is going to get an entirely different perspective on it, and they are going to be able to tell something and then act on it in a way that I can't.
Craig Sturgis: If we're working on that and our client success director who owns the relationship is also watching the same interaction, we can discuss it afterwards and all come to a better common understanding than any of us could on our own. If we're trying to play telephone and say, “I just got off the phone with this customer and they said this.” That might be true, but you're missing out on maybe other context that they also said that the person who owns the relationship might not think to relay.
Craig Sturgis: I feel there's this philosophical divide in technology, especially in software, where to many people the folks building the product are worker bees: you give them a set of tasks and they just go and build, they are there to build the spec. But realistically the people who build the product are not worker bees, they are problem solvers, they hunger and crave to be given a problem to solve, and they can't really solve it well unless they have some amount of direct contact and insight about the person they are solving the problem for.
Anna Eaglin: Do you think your experience as a developer moving into product management helps you have this perspective? How else have you come to this philosophical understanding?
Craig Sturgis: Sure, it's a really common thing, and there are other people of my tribe, software developers who really want to solve problems, and solve problems for customers; there are many of us, so I do think that drives a lot of my perspective. One of the reasons I moved into a product role from an engineering role was I wanted to empower more of that. The really big insight that drove it was when I started my own company with my partners, I was really focused on ... I did the product role because there were only a few of us, and somebody had to do it, so I was stacking things up and saying, "Okay, well what are the things we should do, in which order?"
Craig Sturgis: I was really focused on process and efficiency, and shipping more product more quickly, but then what I learned over time is it doesn't matter how efficient you are at shipping the product if it's not the right product, so that's what really pushed me fully over the edge into, “Wow! How do I solve these sets of problems so that the people who are really ... Honestly way better than me at writing code can be more effective at doing it, and we can win together.”
Christian Beck: Craig, let me put you on the spot. You've been doing this for a year and a half, you've implemented a lot of these modern product changes, how do you know whether anything is successful or not?
Craig Sturgis: Are our customers successful? Are they more successful than they were before? It's not dramatically different, we're still paying down a lot of the cost upfront to really realize the benefit, but our odds are much better of realizing that than they would be if we were not putting in the work and grinding through a lot of this stuff that some people don't feel is necessary, but I do think our odds are much better than if we weren't doing it.
Christian Beck: It seems what you're implementing with this modern product management approach is not something that you just implement for quick wins, it seems it's more foundational, and building up over time, would you agree with that? Am I characterizing that right?
Craig Sturgis: Yeah, it's a lot of quick wins on the way to big wins. You do have to be satisfied with small wins at first, but then it basically makes it more likely you're going to get the big win later.
Anna Eaglin: What does better product mean to you?
Craig Sturgis: Better product means you deliver more value for customers in a way that helps you build a viable sustainable growing business.
Christian Beck: Craig gave us a lot of great details around the difference between old school or old fashioned product management and new product management, but the one thing that I would love to focus with you on is his description of why they do ongoing research rather than these big research studies.
Anna Eaglin: Can the modern PM have it all?
Christian Beck: Find out, right now. First off, what do we mean by ongoing research? You've talked about this a lot even in our own work, and you've used the word ... Shoot, what is the word-
Anna Eaglin: Progressive synthesis.
Christian Beck: Progressive synthesis and ongoing research, what does that mean?
Anna Eaglin: It's our agile approach to research, and it's similar to what Craig talked about. What he said is that they have a monthly cadence with customers: they find their friendlies, the customers that are finding success, and they just talk with them once a month. His rationale for doing that is that product development is continuing to move, and they always have questions to be answered, they always want to be checking in. We do that a lot too. Again, we consider ourselves an agile agency, and I feel user research a lot of times is one of the last true waterfall processes really.
Anna Eaglin: It's one of those things where you line up your users, you do a big study, you learn from them, you have output, you share the findings, and then you make decisions based on it, and we prefer to be much more agile in that way. Again, as Craig said, there is value in those big studies: you want to go really deep, you want a lot of people, you want to learn all about a new market. Taking the time to do that can be really valuable, but if you're in an agile product process, you want to get those insights and act on them and keep going.
Anna Eaglin: You're acting on these things, but you're also gathering too, you're gathering ... We call it progressive synthesis because we look for themes and we build on those themes as we continue and talk to more people. We'll start to see a trend form, and if other people say it, we'll build on the trend, or maybe the trend will die, so we just go with what we're hearing and try to make decisions as we do.
Christian Beck: Yeah, that reminds me of that great quote that he had of trying to be less wrong more quickly.
Anna Eaglin: Exactly.
Christian Beck: This is a personal battle I've had with my wife, who is a PhD candidate in anthropology. She looked at the way that we conducted research and was like, “That's like discount research.” I was like, “Yeah, it kind of is.” The way that he put it, that's why our research methods are slightly different when it's not for science, because you're not actually trying to conduct research to figure out, “Okay, what is the one true answer here?”
Anna Eaglin: Exactly.
Christian Beck: We're building product, you're just trying to reduce uncertainty.
Anna Eaglin: Right, we always say, and this would really ... This is the philosophy that at least I was exposed to via Erik Stolterman's book, The Design Way. He talks about the difference between what is true and what is real. True is something that's true in all environments: gravity is always true, and unless I go to the moon, I'm basically going to experience it the same way everywhere. That's not the type of research we're doing. We want to know what's real to people, and real can change, but what we want to do is capture what's real right now, and that's why we don't do a set number of participants, and that's why we don't have variables. I don't even know the science words to-
Christian Beck: We don't even know the science words.
Anna Eaglin: I don't even know the science word. We do research to the point where it can inform the product, and we learn what we need to learn, and then we just continue on.
Christian Beck: Yeah, and maybe that act of doing it ongoing reduces the confidence you put into it, which sounds bad, but it's more like, “We're not doing this research to get rid of all the fears or the anxieties that we've got it all right.” It's like, “We're accepting the fact that things do change, and we're just going to chip away at it.” We're not trying to do this to come ... To what you said, to come out with the truth, we're just doing it on an ongoing basis.
Christian Beck: Another aspect he mentioned when we asked him why they do ongoing research is, they don't have a giant sample size. Even if you're not starting something from scratch right now, you may face similar issues as Craig which is that you don't have a really diverse customer base. You may have 15 customers that have 40 users each, but they are not really diverse, so if you're going to do a real study, that might not be a big sample size.
Anna Eaglin: When we build product, again, it's not that we want to be 100% correct, because you can't be. Again, you're bringing people a vision that they may not even know they want, but you want to lessen what you don't know. Again, that does go to less wrong more quickly. You get some information, you have a hypothesis, you test the hypothesis, and you find out if it worked or not.
Anna Eaglin: That's also where Craig mentioned this a bit: mixing your qualitative and your quantitative data. Here it sounds like they're using the qualitative information they're getting from their customers in these once-a-month recurring interviews, but then also watching their quantitative analytics, and then maybe looking for cross sections there.
Christian Beck: That also reminds me of something he said, that without the qualitative aspect, you can just see that they seem like a happy customer, and then one day they are gone, and you don't know why. The analytics in the product might not hint at them leaving you; they might tell you where there are opportunities, but they may not hint at that. If he supplements that with ongoing conversations and understands how those two things merge, he can get a much better picture of where there are opportunities for unmet needs.
Anna Eaglin: We shared an article around here that was done by the Spotify research team, and they have data scientists on their research team. They use their qualitative information to find really interesting hypotheses and they'll test it with the quantitative side. Or they'll use the quantitative side to say, "Hey, we're seeing something that's weird, let's do some interviews to follow up with it."
Anna Eaglin: Really using those complementary methods, and again, maybe that's why we're pulling more from the science field and we're trying to triangulate.
Christian Beck: What you're saying is right, and the idea of triangulation is important here. There is not just one source of data, it's also ... You're bringing up another point that's come up a few times on this podcast of being data informed. We've seen this theme pop up in a few ways, customer informed versus customer driven, or data informed versus data led decision making. Maybe Jeremy Leventhal from Springbuk will talk a little bit about it, but the idea being that even ... Like you said, having data science on your team, or even just looking at the data, the answer is not just self evident. You don't just look at the data and say, “Okay, I clearly know what we need to do with the product.”
Christian Beck: I look at Pendo, I look at these things, but you do have to triangulate. You have to be talking to customers, you've got to be working with your customer support team, understanding what your team is building to really form better hypotheses, and doing it on an ongoing basis rather than just saying, “This is the answer, the data tells me that.” The data is hinting at something; we've got to fill in the blanks with other sources.
Anna Eaglin: I want to caveat this too to say, it's not just a matter of doing research, you have to be somewhat thoughtful in your methodology and your approach to the research. It's not just a matter of having data, you have to have the right data; it's asking the right questions too. It's not just a 10 minute call to users where you ask them how much they like these features, it's beyond ... That's a great first step, get out there, get talking to people, but there are correct methodologies to use as well.
Christian Beck: You're saying that you don't just pick up the phone and say, “How much do you love this feature? A little bit or a lot?” When we think about ongoing research, we've talked a little bit about conducting it and what you're doing afterwards, and now you're saying, “Hold on, let's also make sure in the beginning that we're prepping it right.” Do you think that you have to be revisiting the way that you prepare research on an ongoing basis, or can you ... You have a methodology and say, “Look, these are the types of questions we're going to ask.” Do those need to change as well, or can you keep asking those same things and just change the people that you're asking them of?
Anna Eaglin: There's a really great article written by Erika Hall, who is a really smart researcher; I believe she works for Mule Design. She has a great article called, Research Questions Are Not Interview Questions. Basically, it breaks down the difference between what is a research question versus an interview question. A research question is something you set out to answer, and there are a couple of aspects to a research question. It has to be something you can actually answer. It has to be something that fits what you want to learn more about.
Anna Eaglin: It's a good way to think about how to get at what you want with people without just, “How much do you like feature X?"
Christian Beck: Right, so you get real feedback. To recap the ongoing research aspect: it's a great methodology to use pretty much anytime in product, there's no-
Anna Eaglin: Definitely, the more contact with your customers, the better.
Christian Beck: Right, and it's good if you don't have a giant sample, and even if you do, it helps you be ... To steal Craig's quote again, to be less wrong more quickly.
Anna Eaglin: Thanks so much for listening to the show this week, if you haven't yet be sure to subscribe, rate and review this podcast. Until then, visit Innovatemap.com/podcast and subscribe to learn how you can take your product to the next level. As always we're curious, what does better product mean to you? Hit us up on Twitter at Innovatemap, or shoot us an email at email@example.com.
Christian Beck: I'm Christian.
Anna Eaglin: I'm Anna, and you've been listening to Better Product.
Christian Beck: Better Product.
Anna Eaglin: Drop mic.