
15 / Test Assumptions to Achieve Product-Market Fit

Hosted by Sean Flaherty


About

Dan Olsen

Olsen Solutions

Dan Olsen is an entrepreneur, consultant, author, speaker, and expert in product management and Lean Startup. At Olsen Solutions, he works with CEOs and product leaders to help them build great products and strong product teams, often as interim VP of Product.

Dan has worked with a range of businesses, from small, early-stage startups to large public companies, on a wide variety of web and mobile products. His clients include Facebook, Box, Microsoft, YouSendIt (now Hightail), Epocrates, Medallia, XING, Financial Engines, and One Medical Group.

Prior to consulting, Dan worked at Intuit, where he led the Quicken product team to record sales and profit. Dan began his career designing nuclear-powered submarines in the United States Navy. 

Dan earned a BS in electrical engineering from Northwestern University and an MBA from Stanford University. He also earned a master’s degree in industrial engineering from Virginia Tech, where he studied the Lean manufacturing principles that inspired Lean Startup. 

Dan wrote the bestseller The Lean Product Playbook. He lives in Silicon Valley, where he hosts the monthly Lean Product & Lean UX Meetup. Dan enjoys sharing ideas and comparing notes with as many people as he can.  

Software product development is hard enough. It’s harder still when our investment of resources is based on a set of untested assumptions. The probability that we perfectly address each of the hundreds or thousands (millions?) of assumptions, hypotheses, and decisions is super low. Once we get comfortable with the idea that many of our assumptions are wrong, we can embrace the uncertainty and confront the anxiety that comes with it, says Dan Olsen.

In this episode, Sean and Joe chat with Dan Olsen, Silicon Valley-based consultant, author, speaker, and proponent of the Lean Startup approach to software product development. He also hosts the Lean Product & Lean UX Meetup – a monthly gathering of nearly 8,000 members who come together to learn from industry experts and one another about product management, UX design, Lean Startup, growth hacking, and Agile development principles. Dan reminds us that the surest way to eliminate anxiety is to confront its causes. Articulate your hypotheses and test them. Whatever the outcome, the evidence you gather from user testing will boost your confidence and increase product-market fit as your anxiety fades. Dan’s unique insights on product-market fit – a perspective that serves as a melting pot where all the best ideas come together – are sure to be useful to you no matter your role or the product you work on.

Read the full blog post here.

Joe [00:01:45] Well Sean, today we have a good friend of ours, Dan Olsen, and we’ve been coming to know Dan through his work and meeting him and understanding what he does to work with companies and product teams and help them build better products. And so Dan Olsen, hello! How are you?

Dan [00:02:04] I’m great Joe and Sean, it’s great to be here with you all today.

Joe [00:02:07] Excellent. So would you mind introducing yourself and just telling us a little bit about what you do, what’s your role, what are you doing with people in this world?

Dan [00:02:16] Yeah, I do several things, but all of them relate to product management and building great products. What I largely do these days is speak at conferences and events – it was a pleasure to speak at your Product Momentum event, that was a lot of fun – and more generally provide training. So I do a lot of public and private training workshops to train product teams on modern practices in product management and product development. I also coach product leaders and teams, and I still do some hands-on consulting, which is a lot of fun, applying some of the concepts that we’re gonna be talking about today: actually defining MVPs, actually doing the wireframes myself, writing the screeners and interviewing users. And I wrote the book The Lean Product Playbook, which captures a lot of the advice and my perspective on how I think about building great products. In addition to that I run a meetup; I live in Silicon Valley and I just love bringing together fellow leaders and having a community of people here, so for five years now I’ve been hosting a monthly speaker series called Lean Product. And I think that’s most of what I do.

Joe [00:03:20] That’s all?

Dan [00:03:21] Yeah. Well also, actually, there is one more thing. Once a year I help co-organize a product leader summit, which is coming up on the 19th of September.

Sean [00:03:29] Uh-huh.

Joe [00:03:29] All right.

Sean [00:03:31] How long have you been doing that?

Dan [00:03:32] This is our fourth Product Leader Summit.

Sean [00:03:34] Fantastic.

Dan [00:03:35] It’s a lot of fun. Yeah, luckily we have a team of people, because it’s a lot of work, as you know, to put on an event, so yeah.

Sean [00:03:40] As we know…

Joe [00:03:42] Cool. Good luck with that.

Sean [00:03:43] So when we had dinner a few weeks back, I remember you telling me that you worked in submarine design for the Navy.

Dan [00:03:49] That’s right.

Sean [00:03:50] So first of all, publicly, thank you for your service.

Dan [00:03:53] Yeah, thanks.

Sean [00:03:54] That’s a big deal. But I’m interested to start off our conversation with this question: how do you equate that experience to the development of software products?

Dan [00:04:03] Yeah, it’s funny because, uh, we talked about it, and it was the coolest job right out of college. So I was an electrical engineering major during college. I was Navy ROTC, so they had paid for my scholarship, so I had to find, you know, the part of the Navy that I thought was gonna be the best. I interviewed at Naval Reactors and I was lucky to get there. The way I describe it is like NASA for submarines. My title was technically engineer and then lead engineer, so it was a very technical job. They sent us to six months of training in nuclear and mechanical engineering, and you had to have a tech background… All this stuff.

Dan [00:04:35] Then when my five years were over, I went to business school. That’s what brought me to Silicon Valley, and that’s where I learned about product management, and I was like, “that’s what I want to do – you get to work on product but not actually build it, interact with customers, think about the business.” And then I was lucky to get my first product management job at Intuit, and as I started doing the job I’m like, “oh, this isn’t too different from what I was doing on the submarines.” The commonalities, I think, are, one, we’re working on a complex product. A submarine, or the Quicken application that I was working on – it had been around for a long time, had tons of functionality, was relatively complex. Maybe not quite as complex as a submarine, but still very complex. And because of that complexity, it requires, you know, a large group of cross-functional skills to build and execute that product, just like we did at Naval Reactors. At Naval Reactors we divided up the responsibilities along certain lines and you would basically have to get alignment, or agreement – we actually called it concurrence – on anything you recommended. So if there was a certain system in the submarine that I said, “yep, this looks like a good design,” I’d have to go get official signoff from these other people. So I was totally used to this idea of, “hey, we’re developing this product in a cross-functional team manner, and we’ve got to get buy-in from everybody and sign off that these requirements are good and this design meets the requirements.” Obviously it wasn’t software and it wasn’t in the commercial sector, but those aspects of building a complex product cross-functionally were very, very similar, and so it felt very natural to me even though the domain was different.

Joe [00:06:03] Very cool, and that’s a great little segue. So we always have a focus for every episode, and what we wanted to talk about with you is user testing and user research – specifically, how you can do as much of that as possible before you build anything when creating a new product, or do as much as you can as you build a very small minimum viable product, as they say. So can you tell us a little bit about your philosophy on all of that and maybe set that up for the rest of the conversation?

Dan [00:06:33] Yeah, definitely. I think, you know, whenever you’re building a product, you either explicitly or implicitly are making tons of assumptions and hypotheses, and there’s just a million different decisions that you have to make whether you realize it or not. And so if you just rush in and build a product, there’s a lot of uncertainty and risk in all those assumptions, and the probability that you got all of those just perfect when you build it is super, super low, right. Take two scenarios: scenario one, we envision a product, we build it, we launch it, and then we realize all the things that we didn’t get right; versus scenario two, where we envision the product, we come up with a prototype, we test it, we figure out what’s not right, we fix those things, and then we build it and launch it.

Dan [00:07:13] You know, obviously the second way is a lot smarter; it’s less risky, it’s higher ROI. The one thing that comes up is, “well, but it’s not as fast because you’ve got to test.” But I would argue in the grand scheme of things it’s actually faster, because building is expensive, it’s slower, and it’s harder to change. It’s funny because with a lot of my larger clients, tech debt is a big thing that comes up. Tech debt starts accruing the second you write your first line of code; that silent tech debt is building up, and you don’t realize you’re locking yourself in with certain things, right. And the way that plays out is, if you build it and then you slowly realize, “wow, we didn’t get the sign-up we wanted, maybe we should go talk to…” And you start to realize, “wow, this isn’t quite right.” And you go to the engineering team to change stuff, and they’re like, “well, we’ve already built the database this way, we’ve got these API calls this way…” It’s just a lot harder to change, right.

Dan [00:08:00] So that’s my general idea: articulate your hypotheses. Ideally, try to figure out which ones are the riskiest, because those are what we want to test, so that we’re being more mindful and spending our resources wisely. That’s what the essence of Lean is all about: smart use of resources. It’s not about being skimpy or cheap or cutting corners. It’s about being very mindful of your resources, and the way I like to generalize it is, you should invest your dev resources, which are usually your scarcest resource, in proportion to your level of confidence – and that confidence comes from evidence from user testing, either qualitative or quantitative data, that gives us confidence and should inform how we invest our resources, basically.
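Dan’s heuristic lends itself to a quick worked example. Here’s a minimal sketch in Python of allocating dev capacity in proportion to evidence-backed confidence; the bets, confidence scores, and capacity figure are hypothetical illustrations, not from the episode.

```python
# A rough sketch of the heuristic: invest dev resources in proportion to
# your evidence-backed confidence in each bet. All numbers are made up.

bets = {
    "redesign onboarding flow": 0.8,  # validated across two waves of user tests
    "Salesforce integration": 0.5,    # requested by sales, untested with users
    "AI-powered suggestions": 0.2,    # pure assumption, no evidence yet
}

sprint_capacity_days = 40  # total dev-days available this cycle

total_confidence = sum(bets.values())
for bet, confidence in bets.items():
    days = sprint_capacity_days * confidence / total_confidence
    print(f"{bet}: {days:.1f} dev-days")
```

The exact weighting matters less than the discipline: an untested assumption gets a thin slice of capacity until user evidence raises its score.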

Joe [00:08:44] Yeah I mean it just makes so much sense when you say it out loud and you just talk about the math of it, like your assumptions are probably wrong. So, you know, minimum viable product is one of these terms that is both overused and super ambiguous because no one seems to have a single definition. What’s your definition of it?

Dan [00:08:59] Yeah, it’s funny because it’s a buzzword, right. And actually, you know, the term existed before Lean Startup, but it became much more popular with Lean Startup. Anytime you have one of these buzzwords, it’s potentially going to be interpreted differently and misunderstood by different people, and MVP is the poster child for that. You can go online and find quotes that say, “hey, a landing page is an MVP,” and I remember seeing this and then in all the comments people were flaming the person and saying, “no way, are you crazy? That’s not an MVP.” And it’s gotten to the point where a lot of companies are like, “well, we actually believe in a minimum lovable product, MLP. We believe in minimum viable feature, or minimum sellable product.” They’re coming up with these alternatives to basically make it match what their expectations or definitions are.

Dan [00:09:48] So I think it is one of these terms that’s not well understood. For me, the crux of the difference of opinion – and I do this in my workshops a bit – I’ll be like, “who thinks a landing page is an MVP?” And a third to a half of the room will raise their hands. “Who’s like, heck no, there’s no way that’s an MVP?” Then the rest of the room raises their hands. The key thing that trips people up is that last word: product. The hardcore people are like, “how can a landing page be a product? It’s not a product. I could just put whatever up whether or not I can build it.” You know what I mean? And the landing page people will go, “well, you know, you’re learning from it; you’re getting some learning, you’re testing your assumptions and your hypotheses, so that’s valuable.”

Dan [00:10:27] So the way I get out of that trap is to elevate it up one level and call them all MVP tests, basically. An MVP test is basically anything you use to either validate or invalidate one of your hypotheses. And then MVP, without the word test, we can reserve for the things that are prototypes of your product; they’re actually testing the product experience and the product concept. So obviously a live product is a true MVP. Interactive prototypes are a true MVP, you know, Wizard of Oz, concierge-type MVPs. That’s what I reserve for true MVPs, where you’re actually testing the hypotheses about the actual product concept itself. Both are useful, but they each have their different role, and in the book I have a framework dividing up the true MVP versus the MVP test, product versus marketing, and quant versus qual, basically as a way to divide up all the different ways you can test hypotheses.

Joe [00:11:22] Yeah, that’s a good point you make about people getting stuck on the P. Like when we talk to clients, they’re always wary of making it too small – you know, is it gonna provide enough value, and all these other things. And there’s that quote by Reid Hoffman, who founded LinkedIn: if you’re not embarrassed by the first version of your product, you’ve launched too late. What’s your take on that?

Dan [00:11:40] Yeah I think it’s a good quote. I think people use it, like a lot of quotes, people use it for their own agenda, right.

Joe [00:11:45] Oh, for sure.

Dan [00:11:45] Or to further their own devices, right. I think the intent of it was – and some people take it literally – the intent of it was, at the time and still today, people have this idea, kind of a waterfall mindset, of “well, I’ve got to get this thing perfect before I launch, and the more I work on it, the more features I add, you know, the better it’s going to be.” So that’s kind of the mindset, right. And I’ve been there. I mean, I’ve been in that boat. Every time you launch your product, one of the toughest decisions to make is: is it good enough or not? And we can break that down according to my product/market fit pyramid. We can start out with the simplest thing: functionality. Does it have the right sort of functionality? Is it missing some key piece of functionality? That’s the top debate that people just get wrapped around, and, you know, someone’s like, “oh my gosh, I can’t believe we’re launching without feature X.” And so then what happens? You put feature X in it, and someone else, some key stakeholder or key customer, says, “oh, I can’t believe it won’t have feature Y.” And usually it’s not actually the end customer; it’s people within the company who are worried about the end customer and trying to act on their behalf, and you get the slippery slope where the next thing you know it’s the kitchen sink, it’s no longer an MVP, and your time to market has been pushed out, right. So this is one of the toughest decisions that teams need to make: what really needs to be in that MVP? So I take his quote as an antidote to the bias of people trying to make their MVP too perfect. That’s why he said it the way that he said it, and it is one of the toughest judgments out there in the product world.

Sean [00:13:10] Yeah. Perfection is the enemy of learning.

Dan [00:13:14] Yeah, yeah.

Sean [00:13:15] If you wait to get the perfect product out, you have no opportunity to learn, and the longer you wait, the longer you go between actual learnings, right.

Dan [00:13:22] Yeah. And I’ve actually been there. Early on, when I was at Intuit, we were working on connected software, and we would write these thick MRDs – market requirements documents, right. I walked into a product management machine, and as a young eager-beaver PM you’d spend all this time trying to write it out, write it up, and “what about this? What about this?” And at the end of the day, a lot of people don’t read that thing. The analogy I think of is an archery bull’s eye, like a target, right. And the metaphor is, “well, if I just spend a little more time and think about it some more and write it up more, I’ll get closer and closer to that bull’s eye.” And the reality is you don’t even know what you don’t know yet. So it’s an illusion that spending more time gets it closer to perfect. I don’t think anybody can launch a perfect product, because there’s no way to know all the things you don’t know until you get it in the market. So these techniques of testing before you build are not about getting it perfect; they’re about eliminating the big obvious risks and uncertainties to get closer to the proverbial bull’s eye. But you don’t even know. It’s kind of like the Matrix – there is no spoon; there is no target. You don’t even know what you’re shooting for. You may have your target customer wrong, right. There are so many things you could be off on. Until you get it to market, you don’t really know.

Sean [00:14:29] Right. I struggle with MVP a little bit when we haven’t first defined the minimum viable audience.

Dan [00:14:36] Right, exactly.

Sean [00:14:36] Like I think you have to start with your MVA before you can come up with a good MVP for that MVA or otherwise you’re building for whom, right?

Dan [00:14:45] No, totally. That’s why the foundation of the product/market fit pyramid is target customer. You’ve got to define: who is this for? And sometimes in workshops someone will be like, “why don’t we just start with customer needs and problems?” – which is the second layer. I’m like, “well, solving a problem in a vacuum doesn’t make as much sense, because each target segment is going to have distinct needs and distinct preferences.” So I agree with you. You should be rooted in a distinct target customer audience, and so should your MVP, right.

Dan [00:15:09] So you’ve got several steps to go through. First step: who’s my target customer or audience? Second step: what do I think their needs are, and then a pass on “what do I think the underserved needs are – which ones are under-met, under-delivered on?” Then you get up to a value proposition, which is, “how are we going to meet those needs in a way that’s better or different than the competition?” And then you get to feature set, and that’s where MVP comes in. MVP is saying, “okay, given all that pre-work and those hypotheses, what’s the feature set or functionality that we need?” So I agree that you should have a handle on all three of those things: who your target customer audience is, what their underserved needs are that you’re going to deliver on, and how you’re going to meet those needs in a way that’s better than the competition.

Sean [00:15:47] Right and then once you achieve that product/market fit with that viable audience you can then decide, “do I continue to expand on the feature set or do I go for a larger audience.”

Dan [00:15:57] Yeah, that’s right. The other quote I love is “don’t let the perfect be the enemy of the good,” or “don’t let the great be the enemy of the good.” I use it all the time when I see product teams trying to perfect their product past the point that, in my mind, is optimal.

Dan [00:16:12] Yeah. And then once you’ve launched, you can say, “alright, cool, then we need…” You’re gonna find stuff you didn’t know. The question is just how many issues there are and what the magnitude of those issues is. Once you work those through, you can say, “should we add additional value-add functionality for this segment, or should we identify an adjacent segment that we can grow into?”

Joe [00:16:29] So do you think it’s possible to launch an MVP for multiple personas or multiple audiences at once or do you recommend just kind of going at one and picking one to hone in on?

Dan [00:16:38] Well, it’s tough. I think that personas are a great way to get clarity on who our target customer is and get alignment on the team. And again, especially your V1 persona, it’s just a set of assumptions or hypotheses, right, and you need to think about the market opportunity you’re going after, and there may be multiple personas, right. So for example, take Uber or Lyft. You’ve got riders and you’ve got drivers. At the end of the day you’re gonna have to create enough value for both, because if you create this awesome app for drivers but there are no riders, it’s not going to work, and vice versa, right. I love to rank-order prioritize things. I call it ruthless prioritization. So I’m always happy to say, “who is the number one that we need to do?” And I think too many times people have too many personas. Sometimes I go into a client and they’re like, “yep, our UX team or marketing team created personas, here’s our 10 personas,” and then it’s like, “OK, that’s great, but which one are we really focusing on?” And a lot of the time there’s not the rigor of thought, which is: do we look across the 10 to find out what’s common across them and what’s distinct across them, and what’s the relative importance of each of these segments to our business? If you do that, you can help prioritize.
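That ruthless-prioritization pass can be made concrete. A minimal sketch, with hypothetical personas, fields, and weights – the scoring formula is just an illustration, not Dan’s framework:

```python
# A minimal sketch of ruthlessly prioritizing personas: score each
# segment's relative importance to the business and force a clear #1.
# Personas, fields, and weights are hypothetical illustrations.

personas = [
    {"name": "end user", "segment_size": 0.8, "revenue_impact": 0.6},
    {"name": "economic buyer", "segment_size": 0.1, "revenue_impact": 0.9},
    {"name": "admin", "segment_size": 0.1, "revenue_impact": 0.2},
]

def importance(p):
    # Any explicit weighting will do; the point is forcing a rank order.
    return 0.5 * p["segment_size"] + 0.5 * p["revenue_impact"]

for p in sorted(personas, key=importance, reverse=True):
    print(f'{p["name"]}: {importance(p):.2f}')
# end user: 0.70, economic buyer: 0.50, admin: 0.15 -> a ruthless #1
```

Writing the weights down is the point: it forces the “what’s the relative importance of each segment?” conversation instead of carrying 10 personas of equal rank.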

Dan [00:17:43] So in the case of Uber, perhaps, if I had to, I would do drivers first, because you kind of need to do that, but then I would quickly follow with, “what are we doing for riders?” Or, you know, it’d be easy to cop out and say, “let’s do both in parallel,” right. But at the end of the day, again, I like to have a ruthless number one. Or Airbnb with guests and hosts, same kind of thing. In more of a B2B context, you usually have the end user and the economic buyer as two distinct personas. Sometimes you might also have an admin use case, right. Again, there’s not always a right answer, but I would generally try to start with the end user and make sure we’re meeting their needs – unless the buyer was, you know, the dominating persona, in which case you could focus on the buyer’s needs. But at the end of the day you’re going to have to get the end user experience right. Even if you get them to purchase it, to get them to renew you’ve got to get the end user experience right. So I do like to focus on one. It’s OK to have an opinion about what the other ones might be. But I do like to focus on one at the beginning, if you can.

Joe [00:18:39] And I think, you know, the traditional thought about MVP, going back years and years, is “just get it out there, don’t worry about how it looks, just make it work.” And you’ve got another pyramid that I think is great because it’s just so simple; people can look at it and go, “are we doing this or not?” You know, an MVP has gotta be a little bit functional, reliable, useful, and delightful.

Dan [00:18:59] Yeah, and that pyramid I adapted from someone who adapted it from Aarron Walter. Aarron Walter was the head of UX design for MailChimp, and I talk about this in the book. I remember the first time I used MailChimp I actually smiled and maybe even laughed out loud, because it was such a fun user experience. The tone was funny, the UX was great, the visual design was great, and they had a couple of delighter things in there you just didn’t expect. So when I think of delight I think of Aarron Walter and what he did with MailChimp, and so, not surprisingly, he’s the one that created this framework. The whole point of it, like you said – the bottom of the pyramid is functional, and I modified it, so these are my words now. Functional at the bottom. Next layer up is reliable. Next layer up is useful – or actually, usable, I think, is my word – and the next level up is delightful. And my take on the whole point of this pyramid: one main point is to elevate the discussion, because, you know, there are still some people that aren’t quite focused on usability, but for the most part that ship sailed like 15 years ago; everybody knows that even if you’re a B2B product, if it’s not usable it’s not going to work out.

Dan [00:20:02] The whole point of this framework, in my mind, was to elevate the discussion beyond that and talk about delight. So if usability answers the question “can users use your product?”, delight answers the questions “do they want to use your product? How do they feel when they’re using your product?”, right? So that’s one point of it. The other application of it was, as you said, Joe, most people misuse MVP. They don’t understand it, and they say, “OK, OK, we’re not going to launch all the functionality, I get that it needs to be a subset,” but then they launch only a subset of functionality, and they ignore reliability, they ignore usability, they ignore delight. They say, “oh, you know, it’s okay if it’s buggy. It’s an MVP, we’ll fix that later. It’s OK, we’ll do the UX later; it’s just an MVP.” And then what happens when you test that? If all you’ve got is a subset of functionality that may or may not even be right, plus poor UX, poor reliability, and no delight, there’s no way that MVP is going to test well. So then you test it, and it’s going to bomb and crash and burn, and then those same people that misused and didn’t understand how to do an MVP will turn around and say, “see, this whole Lean Agile stuff, it’s baloney, it doesn’t work, let’s go back to doing waterfall.” Right, that’s what happens. That general idea of blaming the process or the tool after a sub-optimal implementation or application of it happens a lot.

Dan [00:21:16] Anyway, so basically – and I’m not trying to say it’s going to be perfect, and this in a way goes against that Reid Hoffman advice, right. Obviously, if we could not be completely embarrassed, that’d be great. Embarrassed – what does that mean? Embarrassed by missing functionality, embarrassed by bugs, embarrassed by poor UX, embarrassed by lack of delight? You know, it’s OK to be embarrassed about some of that stuff, but we’ve got to get enough of it right, I would argue, before you build, because otherwise you’re wasting those resources. So back to this idea: people run into this trap, a slippery slope, of “oh well, Customer X is going to complain if we don’t put feature A in there.” If we just say, “OK, we’re worried about that too, let’s just put the feature in,” you’re not really testing anything. You’re biting the bullet, you’re writing the check either way, right?

Joe [00:22:04] Yup.

Dan [00:22:04] And so what I love to do – this is where especially low fidelity can come into play, early, early in your process – is say, “you know what, it sounds like you have a hypothesis that if the MVP doesn’t include feature A, Customer X will be really upset. Did I capture your hypothesis properly?” And then they kind of look at you like, “are you Spock? Sure, yeah, that’s my hypothesis, right.” Then it’s like, “great, how might we test that hypothesis?” Because if we just put the feature in there, we’re not testing a hypothesis, right? So the main way to test the hypothesis is actually to show them a wireframe or prototype without the feature and see if anybody cries uncle and says, “what the heck’s going on here?” And people are surprised time and again that they don’t even notice it. When you do an MVP, if there are extra features in there, customers don’t go, “hey, why did you put this extra feature X in there? That doesn’t make any sense, I would never use it.” They don’t bother, they don’t care, it doesn’t bother them. It’s no skin off their back; they don’t need to comment on that. But if you’re missing a key feature, they will certainly scream and say, “hey, where’s my Salesforce? I don’t see Salesforce integration here; there’s no way I could use this without Salesforce integration,” or whatever it is, right? So that’s why I like to do it early in the process, even in wireframes. Wireframes are a great way to test the absence of functionality. Did you get the right feature set? Did you get the right overall information architecture and UX design?

Joe [00:23:19] Cool, so you mentioned doing user testing and we’ll definitely talk about that, but I love asking guests about these buzzwords like we talked about earlier. So product/market fit…

Dan [00:23:27] Yes.

Joe [00:23:27] What does that mean to you and how do you know like when you’re approaching it or if you’ve met it and like, do you pivot, what do you do?

Dan [00:23:33] Yeah. So that’s another buzzword, right. It was actually coined by Marc Andreessen back in 2007 and then became popular within Lean Startup. And, you know, unlike MVP, people talk about product/market fit pretty simplistically. At least they used to before I wrote my book; that’s part of why I wrote the book. They’d be like, “oh, you know, Box succeeded because they had product/market fit.” “Oh, sadly, startup X failed because they did not have product/market fit.” They just talked about it like this big true-false condition, where either you were gangbusters successful or you weren’t, and that’s not really that helpful of a definition. We already have a word called profitable, so this must mean something else besides profitable, right? So for me, the essence of what it means is that customers basically are agreeing that your product is providing more value to them than the other alternatives that are out there. That’s the essence of product/market fit.

Dan [00:24:21] And that’s why, in the product/market fit pyramid, I guide people through the hypotheses to try to ensure that’s the case. The way I think about it is there are five key hypotheses you need to get right enough in order to achieve product/market fit. You need to get your target customer right enough, you need to get the underserved needs that they have right enough, you need to get your value prop – which is how we’re going to meet those needs in a way that’s better than the competition – right enough, you need to get your feature set, which we’ve been talking about, right enough, and you need to get your UX design right enough. They don’t have to be perfect. It’s like a big five-term multiplication, or a logical AND, right. If any one of those is off, it can get in the way of product/market fit. And then the next question is: how do you know if you have it?
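That five-term-multiplication framing can be made concrete with a toy calculation. A minimal sketch, with hypothetical 0-to-1 scores for how “right enough” each layer of the pyramid is:

```python
# A minimal sketch of the "five-term multiplication" framing of the
# product/market fit pyramid. The scores are hypothetical illustrations;
# the point is that one weak layer drags down the whole product.

layers = {
    "target customer": 0.9,
    "underserved needs": 0.8,
    "value proposition": 0.85,
    "feature set": 0.9,
    "UX design": 0.3,  # one layer that's badly off...
}

pmf_score = 1.0
for layer, score in layers.items():
    pmf_score *= score

# ...sinks the overall result, even with four strong layers.
print(f"overall: {pmf_score:.2f}")  # ~0.17
```

Treat the numbers as a mnemonic for the logical AND, not a real metric: a zero in any one layer zeroes out the whole product.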

Dan [00:25:00] Once you’ve launched your product, the way you know you have it, I like to describe like this: imagine we have the world’s slickest marketing person or salesperson, so they create the world’s best landing page that has a hundred percent conversion, or the world’s best salesperson that always has a hundred percent win rate. So we’re able to get people in our door. They get in the door, they kick the tires on our product, but then they realize it didn’t meet their needs, right. They’re not going to come back and use it again if it doesn’t meet their needs in a way that delivers more value than other products. Conversely, if we have the world’s worst salesperson or the world’s worst landing page, but somehow a customer gets in there, and they kick the tires and realize, “wow, this thing actually is useful and valuable,” they’re probably going to come back and use it again. So the core essence of knowing you have product/market fit is that repeat usage, right. People checked out your product, they tried it out, and after using it they came back and used it again and again and again, because it’s delivering value for them, basically.

Dan [00:25:57] So once we’ve launched, the way we can check that is through retention rate. Retention rate is the formal way to track that repeat usage over time. And so the question is, “well, pre-launch, what can we do?” And that’s what these techniques are. They’re fuzzier, because we don’t have thousands of data points and things like that, but we actually need that qualitative depth; we need to get our hands dirty and really talk to people to understand, refine, and revise those hypotheses. And the way that works is you’re doing wave after wave of user tests, getting user feedback, figuring out what’s wrong: what hypotheses did we get wrong, what features are missing, what UX did we get wrong, what messaging or positioning did we get wrong? And iterating, trying to pattern-match, addressing those, coming up with the new iteration of your product, and doing it again. And the way you know pre-launch is basically you get to a point where you no longer have any major complaints about the prototype that you’ve got, and people start saying, “well, I can really see this as useful; when is this going to go live? I could really see using this.” So it’s not as black and white and as hard a number as retention, but that’s how you basically know. You can see the progress as you do wave after wave pre-launch.
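Post-launch, the retention check Dan describes comes down to simple cohort arithmetic. A minimal sketch with hypothetical usage events – of a three-user cohort, the two who come back during week one give a 67% week-1 retention rate:

```python
# A minimal sketch of cohort retention, the post-launch signal for repeat
# usage that Dan describes. The (user, active-day) events are made up.

from datetime import date

events = [  # activity for a cohort that signed up the week of Sept 2
    ("u1", date(2019, 9, 2)), ("u1", date(2019, 9, 10)),
    ("u2", date(2019, 9, 3)),
    ("u3", date(2019, 9, 2)), ("u3", date(2019, 9, 9)), ("u3", date(2019, 9, 16)),
]

cohort = {user for user, _ in events}
week1 = (date(2019, 9, 9), date(2019, 9, 15))

returned = {user for user, day in events if week1[0] <= day <= week1[1]}
print(f"week-1 retention: {len(returned) / len(cohort):.0%}")  # 67%
```

Real products would track this per weekly or monthly cohort and watch whether the curve flattens, but the core calculation is just “who came back?” divided by “who started?”.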

Sean [00:27:00] It makes sense, right; you’re scratching the itch. If your product is solving the problem they’re going to come back and use it again and then you know you’re on the right path.

Dan [00:27:07] Right.

Sean [00:27:08] We call that one of our loyalty metrics, right. They’re coming back; they’re using it. That means they’re somewhat loyal to the product.

Dan [00:27:14] Right. Yeah.

Sean [00:27:15] All right so the tagline of your book, the subtitle, is How to Innovate with Minimum Viable Products and Rapid Customer Feedback. I think we beat up the minimum viable product part of that so let’s talk about the rapid customer feedback part.

Dan [00:27:28] Right. Definitely.

Sean [00:27:29] And the innovation part, so how do you define an innovation?

Dan [00:27:32] Well, basically, I think an innovation would be what I’d call a product concept: “we think if we came up with this value prop, it’s gonna meet these underserved needs for this target customer, and we think this feature set, described in words, is going to deliver that value prop.” So then the last piece of it is, once we’ve got all that figured out – our MVP hypothesis or candidate – the next thing is we need to actually manifest it with the user experience, right. We need to do that anyway to build it. But as we’ve talked about, I’m a huge fan of using those prototypes that you have to do anyway, or the mockups and designs that you have to do anyway to tell the front-end team what to build, as a validation tool to go out and talk to customers.

Dan [00:28:15] And so all it really takes is a prototype with high enough fidelity and high enough interactivity. Prototype is a very generic word. It’s funny because, back to the submarine world, the first use of “prototype” was in the submarine world. You can imagine, when you build a submarine, if you get something wrong, it’s really hard to change after the fact, because it’s literally welded metal and expensive components. So they would actually build a full-scale wooden prototype – they called it prototype – of basically the whole thing. And then, as you know, they also have prototypes of submarines sitting parked at the dock so that you get to test-drive the real thing. So a prototype is just a representation of the real thing that lets you test it with customers, and you can do rough testing with wireframes. Wireframes are good to see, “hey, is there a key piece of functionality missing? Does the overall layout make sense? Does the navigation make sense? Does the architecture make sense?” And Balsamiq makes a great tool for that where, you know, it’s got built-in clickability or tappability. If you add a menu item, it has a place to say, “OK, if the user taps here, where is it going to go?”

Dan [00:29:21] And then the next level up – wireframes are good for rough stuff – the next level up, where you can really get a lot of great feedback and where I spend most of my time, is clickable or tappable mockups. So here now you have higher fidelity. A lot of times the wireframes are grayscale and rough and they don’t have the images, and that’s on purpose, right, because the other thing I talk about a lot is the iceberg of UX design, and everybody fixates on the visual design. The number of times I’ve taken high-fidelity mockups to a key stakeholder – the team spent all this time fretting about target customer and needs and MVP feature set and value prop, and we show the stakeholder the mockups, and the first thing they say is, “what’s this color of green you used here? I don’t really like that green.” That’s the least important thing at that point. So wireframes tend to be grayscale so you can just avoid that and kind of put horse blinders on. But then you get to high fidelity. Your designer exports some high-fidelity mockups from Sketch or Photoshop or Illustrator, and then the cool thing is, using a tool like InVision, you can create a good-enough experience where you create these rectangular hotspots so that when somebody clicks or taps there, it goes to the other screen. And you can’t string together everything; it’s obviously not a fully interactive product, but you can string together what we call the happy path.

Dan [00:30:33] And so I love testing there because you learn so much. And again, you haven’t done any coding. In the book and in my talks I discuss a case study where I did that, and we were able to pivot super quickly – we tossed out the old design, started from scratch, took everything we learned from the first round of tests – and we were able to iterate very quickly and significantly improve the product/market fit. So that’s the general technique that I like to use. You do one-on-one tests with customers. I like to do waves of, like, 5 to 10; eight’s usually a good number, but some people want more data points. It’s funny – I talk about Oprah versus Spock, where Oprah is the master of the qualitative one-on-one interview and Spock is the master of logic and analysis: the quantitative. And some people have such a Spock bias that even with 10 users they’re like, “it’s only 10 users, how do you know? It’s not statistically significant.” They get tripped up on statistical significance. And what I love to tell them is, “well, if 9 out of 10 people couldn’t figure out how to register, you don’t say, ‘stop, let me get ten thousand more data points to make sure my chi-squared calculation works out.’ You probably have a problem, right; you probably have an issue.”

Sean [00:31:38] Right.

Dan [00:31:39] You’ve got to be kind of mindful of that, and there’ll be time later. Once we launch, hopefully we have hundreds or thousands of customers, and then we can look at larger sample sizes.
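Dan’s 9-out-of-10 example actually survives Spock’s scrutiny. A minimal sketch of an exact binomial test – the 80% success benchmark is an assumed hypothesis for illustration, not a figure from the episode:

```python
# A minimal sketch of why 10 users can be decisive. Suppose the hypothesis
# is that at least 80% of users can register unaided (an assumed benchmark
# for illustration). If that were true, seeing only 1 of 10 succeed would
# be astronomically unlikely -- no need for ten thousand more data points.

from math import comb

def binom_pmf(k, n, p):
    """Probability of exactly k successes in n trials with success rate p."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

n, p = 10, 0.8          # 10 test users, hypothesized 80% success rate
successes_observed = 1  # only 1 of 10 figured out how to register

# One-sided p-value: chance of seeing this few successes if p were really 0.8
p_value = sum(binom_pmf(k, n, p) for k in range(successes_observed + 1))
print(f"p-value: {p_value:.7f}")  # ~0.0000042
```

At a p-value of roughly 4 in a million, the qualitative wave has already answered the question; the larger samples can wait until after launch.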

Sean [00:31:49] All right. So let me test my understanding of your answer to my question there. You’ll know you have an innovation when your customers agree and you’re getting engagement. You’re moving toward your ultimate goal of adoption and traction, right? That’s how you know you’ve got an innovation, and you’ve got to rapidly get customer feedback so you can continue down that path toward more innovation. It sounds to me a lot like the Steve Jobs quote that everybody overuses: “my customers don’t know what they want until I show it to them,” right.

Dan [00:32:20] Right, yeah.

Sean [00:32:20] You gotta get something out there to test.

Dan [00:32:23] Right.

Sean [00:32:24] And this plays really well with your… You’ve got a diagram out there about problem space versus solution space. So I’d love to open it up and have you talk about that a little bit.

Dan [00:32:33] Yeah, and related to that, it’s funny, because there are Steve Jobs quotes out there, and people always say, “well, Apple, they’re notorious for not doing customer research,” but if you look at some of his quotes that I use in my talks and in the book, they’re very customer-centric. It’s not technology looking for a problem, a solution looking for a problem; it’s more like, “yeah, of course we want to know what the latest technical capabilities are, but we have to figure out how those end up solving a problem for someone.” The other thing that comes up a lot is the Henry Ford quote: “if I had asked people what they wanted, they would have said a faster horse” – they never would have envisioned a car.

Sean [00:33:06] I love that quote, but I have a correction there. He didn’t actually say that.

Dan [00:33:09] Oh okay. Okay.

Sean [00:33:09] If you do the research on that quote… Some clever consultant figured out that there’s no evidence that he actually said it, but it’s a great quote anyway.

Dan [00:33:18] Yeah yeah. But the point is people like to misuse that quote to say, “well why do we need to talk to customers, they’re not going to invent a car for us.” That’s when people misuse that quote in my opinion.

Sean [00:33:26] Right.

Dan [00:33:26] And I say, “Yeah of course not because they’re not electrical engineers and mechanical engineers. They’re not software programmers or designers. It’s ludicrous to expect your customers to just tell you the next product to build,” right.

Sean [00:33:39] Right.

Dan [00:33:40] It doesn’t make any sense.

Sean [00:33:40] If it were that easy we’d all ask our customers what they wanted and we’d all be building the exact same thing.

Dan [00:33:45] Yeah, exactly, right. The whole trick is, your job is to figure out the answer to all those key questions: who’s our customer? What are their underserved needs? How can we meet them in a way that’s better or different? What functionality and UX design is it going to take? Your job is to do that, and so you’re making all these hypotheses and assumptions in the problem space about needs and customers, but you can’t test those directly. The way to test them with customers is to create a solution space artifact – ideally a prototype, or a live product – and that’s what you test with customers. And then you get the learnings from that, and when you do, you go back and revise your problem space hypotheses and assumptions, and then iterate and create a new solution space artifact.

Sean [00:34:25] I love it.

Dan [00:34:26] Yeah, so it’s like a dance. You’re doing this dance: you’re using these mockups and prototypes as a way to get feedback, but then you’re going back and revising your mental model, basically. Again, they’re not going to tell you how to improve the UX design – “oh yeah, this flow is bad, you gotta do this.” You’re just going to see they’re getting stuck at point X, and now it’s up to you to figure out the best way to fix that. So along those lines, the thing about the car and horse quote is that a lot of times people say, “well, we’re doing disruptive innovation; customers can’t even have a point of view about this thing at all.” And I’m like, “that’s baloney,” right. It’s baloney. One, so many people think they’re doing disruptive innovation when they’re really not. They think they are, but they don’t understand what disruptive innovation means. Basically, it means, if you look at the importance and satisfaction framework that I have: whatever the high watermark is today for satisfaction – whichever product is the best at meeting a certain need, right.

Dan [00:35:14] Disruptive innovation means you basically recalibrate that scale, and what used to be a 10 out of 10 on that scale is now like a 5 out of 10, or a 2 out of 10. That’s what true disruptive innovation means: when you achieve levels of satisfaction that are an order of magnitude higher than what the current solutions provide for that need. Because the needs in the problem space don’t change anywhere near as quickly as the solutions. Solution technology waves come and go relatively quickly; they can. The example I use in the book is the need to listen to music on the go. That’s a need a lot of people have. They’ve had it for a long time. The first solution to that was probably the portable, battery-powered FM transistor radio. And then we had the Walkman, right. Then we had CD players, then we had portable MP3 players, then we had the iPod. Now we all just use our phones. So you’ve got five or six technology waves, but the fundamental problem didn’t change at all. So the bottom line is, you should be able to articulate how your potentially disruptive innovation creates higher levels of satisfaction. What need is it doing a better job on? Everyone else does this in 10 seconds and the new technology lets us do it in 3 seconds, or everybody else’s cost per unit is 200 bucks but this is gonna let us get to a cost per unit of 40 bucks. So even if customers can’t envision what your disruptive product is going to be, you should be able to articulate what the value to them is going to be.

Sean [00:36:29] I want to pull on something that you said for the audience here and that is the point of the product team is figuring this stuff out.

Dan [00:36:36] Mhm.

Sean [00:36:36] That’s the whole point of developing a custom software product in the first place.

Dan [00:36:39] That’s right.

Sean [00:36:40] So that’s powerful.

Dan [00:36:42] Yeah. And that’s why, you know, the best products are created by the best teams.

Sean [00:36:46] That’s right.

Dan [00:36:46] And what does that mean? It means not only, you know, the first thing everybody focuses on is individual skill level. “Oh, they must have a great designer, they must have a great front end developer…” That’s part of it, but you don’t need the best. What’s really important is, how are they collaborating throughout this whole journey? Because it all starts with discovery at the beginning, right.

Sean [00:37:05] Customer empathy, right?

Dan [00:37:06] Yeah, and that’s how you discover those unmet needs. You have hypotheses about unmet needs, and then you go test them and validate or invalidate them. And specifically on problem space versus solution space: the designer should be involved in that discovery so they know the problem, but then their job is to help explore the solution space. I think that’s another thing that not enough teams do. Assuming we wave a magic wand and say, “you have a really, really good understanding of your customer and the problems and the value prop” – that’s all problem space. The next important phase is, “great, now what is the best solution to meet that?” And if all you do is come up with one mockup and pursue that, then you haven’t really explored the solution space at all. Obviously going from zero to one mockup is great, because zero mockups gets you nowhere, but what happens is people don’t do deliberate divergent thinking, and at the end of the day, for talented designers and design teams, that’s their main value add: “OK, given that we have a clear understanding of the problem, let’s explore the solution space. We could go with this kind of mental model for the UI, we could do it this way, we could have a menu, we could have a drop-down” – all those different ways of doing it. There’s value in solution exploration, and then obviously you want to have talented people developing it as well. So teams that excel get clear about their hypotheses and how to test them; they explore the problem space, explore the solution space, and explore the tech solution space as well: “OK, given this design, what are the ways we can implement it with the technology? Should we use this framework or that framework, or which back-end database should we use?” So I think with good teams, obviously they need a certain level of skill, but it’s really about how much shared vision and collaboration they have throughout that journey to build a product.

Joe [00:38:46] Makes a ton of sense. Well, we’re coming up on time, so let me ask a question I think might be fitting to close with, just before our last standard question. You know, we talk about MVPs: what do you build, what don’t you build, how far do you go, how do you do it? Can you talk a little bit about the value of saying no as a product manager?

Dan [00:39:03] Yes. Yes, yes, yes, yes, definitely. It’s really important. And the funny thing is, I like to joke about the product manager’s motto and how it’s like Spider-Man’s motto. Spider-Man’s motto, as a lot of Marvel fans know, is “with great power comes great responsibility.” So I like to jokingly say that the PM motto is similar but different: “with great responsibility comes no power.” Now, “no” may seem a little harsh for a lot of product managers, so just to be clear: even if you don’t say no, you can say “not now.”

Dan [00:39:29] “Not now” is a nice euphemism, because the reality of the life of a product manager, even outside of just defining what the feature set should be, is that there are way more ideas and things that could be done than you have the bandwidth to do. That’s true of your engineering team, right – you only have so many full-time dev equivalents – and the same thing with your own time. That’s just the reality of it, so by definition there are more ideas. So it’s not saying no forever; it’s just saying, “not now, not for this MVP, not for V1.1, not for this time frame,” right. But it’s really important – and I work with a lot of teams where they don’t say no. My favorite definition of strategy is that it means saying no.

Dan [00:40:07] And there’s another Steve Jobs quote where he’s talking about innovation and he says “innovation means saying no to a thousand things.” He’s like, “people think focus means focusing on the thing that you’re working on, but focus actually means saying no to all the other things.” And he says “all those thousands of other good ideas,” so he’s not throwing them under the bus and saying they’re bad ideas; he’s acknowledging that those other ideas are good. And so I like to take his quote and just say, “strategy means saying no.”

Dan [00:40:32] You know, if you think about the counter-example – not to pick on our prized sales friends, but they have a quota; they have an incentive, right. So if they’re out talking to a prospective client and the client goes, “well, is your new product going to have feature A?”, what does the salesperson say? They say, “oh yeah, definitely, yeah, just sign that three million dollar contract, oh yeah, it’ll definitely have feature A.” They come back to HQ: “hey guys, I signed a three million dollar contract, we gotta put feature A in there, you gotta do it.” Then the next salesperson goes to Customer B, and Customer B goes, “well, is it gonna have feature X in there?” What do you think they say? They say yes to everything, because that’s their incentive, right? So saying yes to everything is the opposite of being strategic. Being strategic means making a choice that’s not easily reversed, because if you can just change your mind the next day, then it wasn’t really strategic; it was just a decision. It wasn’t a strategic decision, right. And so, back to saying no – to tie back to something we said earlier, the MVP is one of the toughest spots to say no, right: “not now, not for the MVP; I’m not saying we’re never going to build it, but it shouldn’t be in the MVP.” And again, you can use a hack: “well, why don’t we do a test of a wireframe or mockup without it instead of just biting the bullet and building it?” So I agree saying no is important, and it’s tough. It’s no to “this is the target segment; we’re going to test this one first, this is the highest-priority one first.” It’s no to “we’re going to include this other set of needs in our value prop” – “not now, not for V1” – and it’s no to the feature set.

Sean [00:41:54] Love that. The more you’ve defined who it is you are actually serving with your product, the more you’ve defined who you aren’t serving.

Dan [00:42:00] Right.

Sean [00:42:00] And the more clear you are about that, the more focused you can be on what problems you are solving and what problems you aren’t solving. And that’s how you really can focus on building a powerful product that will be a true innovation in your market space. That’s good. Good answer. Thank you.

Dan [00:42:15] Yeah. No, I totally agree. I think the key, the trick, is: are you building the mental model internally to know when to say no and when to say yes? It’s not about arbitrarily saying no or yes, or just using your gut. It’s, “can I explain why I’m saying yes to these three features for the MVP and why I’m saying no to these three features?” That’s the key, I think: building that mental model based on the evidence and testing your hypotheses over time.

Sean [00:42:40] All right, very good. Well, we have one question we ask all of our guests, and that’s: what book are you reading now, or have you read recently, that you would recommend to our product audience – people that are passionate about software products?

Dan [00:42:51] Mhm.

Sean [00:42:51] What do you think?

Dan [00:42:52] Well, I have several. I’m not gonna say no; I’m going to have a big MVP. If people are intrigued by the ideas I’ve talked about, I would definitely recommend my book, because it goes deeper into all of that.

Sean [00:43:00] Of course.

Dan [00:43:01] To answer your question specifically, the book that I most recently read that I would recommend to other people is Jake Knapp’s book Make Time. It doesn’t have anything to do with what we talked about today, but it’s just a great productivity book. The audiobook is especially engaging – Jake and his co-author JZ read it, and it’s only five hours long. I thought I was a pretty productive person, but they definitely gave me a lot of tips to be more productive. And then two books I recommend related to the topic we talked about today: I’m a big fan of Laura Klein and her work. She’s actually written several books; her first book, UX for Lean Startups, is a really good book on this front. And then Steve Portigal has a book called Interviewing Users. He’s a user researcher who thinks a lot about, and has a lot of advice on, interviewing users – which we didn’t even get to. We didn’t even get to how to conduct great interviews, so definitely check out his stuff.

Joe [00:43:46] Wow. Thank you so much Dan. This was amazing. I got a lot out of it. Is there anything else you wanted to plug?

Dan [00:43:52] I have a website. It’s my name, dan-olsen, O-L-S-E-N, dot com. I post all of my talks up there. I don’t blog a lot, but occasionally I will. Every year I do a roundup of the top product conferences, so if you want to see the lay of the land for product conferences and find one to go to, they’re up there; my speaking schedule’s up there. And then for people in the Bay Area, or if you’re visiting the S.F. Bay Area, I mentioned that monthly product speaker series that I run called Lean Product. You can go to meetup.com/lean-product. You can join the group for free, and that way you’ll get notified when there’s a new event.

Sean [00:44:20] All right.

Joe [00:44:21] That sounds great. I mean, I’ve been following Dan for a long time; I get a lot of value out of his content, so I definitely highly recommend checking him out, and I hope everyone enjoyed this.

Dan [00:44:29] Great. Thanks a lot, Sean and Joe, it was great.

Joe [00:44:31] All right. Thanks Dan!

Joe [00:44:35] Alright, so that’s it for today. Thanks for listening. And we’re not just going to talk the talk, we’re going to walk the walk, so we would love it if you go into your podcast app and leave us a review. Sean and I are committed to reading absolutely every piece of feedback we get there, and not only that, but you’re helping other listeners: getting your feedback in there helps us move up the search rankings so other people can find the episodes. So thank you, everyone.
