About
Kate is an entrepreneur+designer and Principal at Intelleto, where she creates visual explanations that make complex ideas simple, memorable and shareable. Kate pioneered the UX learning track at Tradecraft, co-founded Luxr.co, and was Senior Practitioner at UX consultancy Adaptive Path. She co-hosts the NSFW podcast What Is Wrong With UX with Laura Klein, tweets at @katerutter and blogs at intelleto.com.
How do we know our work is working? In other words, how do product designers know their work product is solving the problem it was intended to solve? These are the kinds of questions that keep us, and Kate Rutter, up at night.
“It’s an insidious question,” says Kate, designer, tech junkie, artist, and Principal at Intelleto. In this episode of the Product Momentum Podcast, hosts Sean and Joe chat with Kate Rutter about metrics, and not just the kind that measure performance. Kate says the true power comes from our teams’ alignment around metrics as a very tangible element that people can get behind. “It gets really exciting when you…start to see metrics as human behaviors with your products stated in numerical terms,” she says.
Metrics are important for everyone from business executives to designers. Kate mentions Joshua Porter’s quote that, “your metrics will be as unique as your business.” It’s not only tracking data, but tracking the correct data, that will tell you if your work is working. Listen to hear some specific tools and examples you can use to measure the success of your product, as well as tips from Lean thinking.
Joe: [00:00:00] All right. Well on today’s episode we have Kate Rutter, and Kate, we’re going to have you introduce yourself. Today we are really going to talk and focus on metrics. What are they? What makes for a valuable one? How do you know if you’re using the right kinds? What kinds to avoid… What do you think, Sean?
Sean: [00:00:15] We’re big on metrics here at ITX. So we have our own kind of flavor for how we deploy them and we’re interested to learn more from you.
Joe: [00:00:23] Data, right?
Sean: [00:00:24] Data.
Joe: [00:00:25] Kate, hello.
Kate: [00:00:27] Hello. It’s such a pleasure to be here.
Joe: [00:00:29] We’re very happy to have you. Would you mind introducing yourself, where have you worked in the past? What are you doing now? What’s your role?
Kate: [00:00:36] Sure. So I have been a longtime UX generalist. I was director of technology for a non-profit fairly early in my career. And that was when this web thing was just starting to wake up and I saw a lot of potential there and so I started playing around with it. And since then, because I live in the Bay Area, I always thought I might leave and go somewhere else, but the world of work just kept getting more interesting and so I’ve followed that path of curiosity. A couple of things you should know is, I’m an opportunist, which means I often find myself working at the beginning of something that’s happening that’s kind of big, like the Internet, or at the time desktop publishing was even part of my background…
Joe: [00:01:20] Oh…
Kate: [00:01:20] Yeah. User experience, as a field, we evolved from hand-coded, you know, “look I can make this thing,” to, “what thing should we make and how does that affect people’s lives?” Designing more for impact. So what I have is a rich set of experiences and an extraordinary set of people that I learn from, that I’ve gotten to work with as clients and as colleagues in a variety of ways. But I’ve never worked for a big company so I’m not sure if that’s in my path or not. I’m curious about that. But I did spend quite a few years at a company called Adaptive Path, which was a consultancy really heralded for pioneering a lot of terrific thinking in the user experience design space. And that was a life-changing experience, and it really took me from thinking about this world of digital technologies as things, as interesting ways of how the technology functions, into really thinking of technology as always in service to people and how people can realize their dreams and find their aspirations and discover new things about efficiencies or being effective or connecting. And that, I think, is a life-changing shift, but it was also a shift in the practice. And so I’ve been very lucky to be able to be part of that shift to the practice. So after my time at Adaptive Path, I co-founded a startup with two colleagues, Janice Fraser and Jason Fraser, a married couple whom I’d known for years. And this startup was called Luxr and it started out as a program to help early-stage technology startup founders really integrate user experience and user-centered thinking into how they think about their products. And this was happening at the rise of Lean Startup as well. Janice and Eric Ries know each other. And there were all these different ways of maybe, not short-cutting, I don’t like that phrase, but really being more efficient and validating ideas before putting a lot of effort into the development or design of those ideas. And that was an important stake in the ground, I think, for our field as well. So we had a good couple-year run on Luxr, and then as many startups do, we gained a foothold but not scale. And so then I really was excited about finding behavior change in our field through individual practitioner behavior. So many people I know focus on scale and organizational change and team change, but I really am in the seat next to you as a practitioner. I think that change comes when we change our individual behaviors and how we think about design, and that that infectious kind of approach of design can infuse a team and then infuse a company. So since then I’ve been really working as a design educator.
Joe: [00:04:05] Interesting.
Kate: [00:04:07] Yeah.
Sean: [00:04:08] That’s great. You recently talked, in some interview I read, about unicorn hunting. I think this plays along with the individual practitioners. Anyway, do you want to talk about that a little bit?
Kate: [00:04:18] Sure. Yeah. So as part of where I spend my time, I teach at an institution called California College of the Arts. It’s a fabulous fine arts institution that about a decade ago started investing in interaction design as a practice. And they have a real focus on social impact there, which I think is important especially for designers now. So I spend my time teaching as an adjunct professor in their interaction design department. And then I spend another part of my time with a colleague, Laura Klein, who is kind of a known name in the lean user experience world. And she and I have a podcast together where we drink and we pretty much bitch about what’s wrong with UX design, but also how to make products suck slightly less. And you know, we harvest all of our past experiences and our observations and then we debate about it. We share a lot of deep fundamental values but we really disagree on how those values come to play in human behavior and in teams. And so we fight about it and we made this podcast on it. And we did one on unicorn hunting specifically for teams or managers or hiring organizations that are looking for someone who can do it all, who can, you know, deliver pixel-perfect visual designs as well as validated code that can launch and be deployed. And it’s like, well, you know, the specialization of our field over the past couple of decades has really allowed us to do many, many more magical things, so expecting one person to have that breadth of skill, there’s just not enough time for it. So we talked about how unfortunate that is when good people can’t find a good fit in an organization because of this unicorn myth that we have.
Sean: [00:05:54] Right, and it takes a balanced team with lots of different personalities and lots of diversity, this is my opinion anyway, and lots of different skill sets to really build great products.
Kate: [00:06:06] Yeah. That’s the exciting part, right, is we have so many people coming from different adjacent fields with their own experiences and expertise and the problems they want to solve and that’s amazing. And getting a group of people who are really different to work together is a total pain in the butt. It’s hard because you have to find that shared ground of why we’re here. And then, I mean, even if you can do that, which is its own challenge, you also need to find a place where compassion and empathy allow for an effective workplace where you can understand each other, listen to each other even if you disagree, use your dynamic of disagreement to make the work better but not to piss people off, but also to allow people to change over time. And that’s asking a lot. It’s asking us to be different humans than we were when we showed up, we did a job, we punched a clock, and then we left and went home to our quote “real life.” And so this is a challenge. But I do believe, I agree with you, that more breadth and more diversity of experience and diversity of perspective is more likely to help us solve the hard problems that our products are now in the position of solving.
Sean: [00:07:13] Which brings us back to goals and metrics. So here’s my theory about metrics: they’re only really useful if they align people. So if you have the right metrics and people are aligned around those metrics, then the diversity becomes more valuable because we’re all going towards the same set of goals, right.
Kate: [00:07:32] Yeah, I think that’s an astute statement. I love listening to your podcast because the way you all phrase your own perspectives and hypotheses with your guests is always very succinct.
Sean: [00:07:43] Thank you.
Kate: [00:07:43] I’m like, “oh, I would like some of that succinctness.” Sadly I do not have that gift yet. One of the things, though, about that alignment around metrics, is it’s an element, and it’s a very tangible element, that I think people can get behind, but it gets really exciting when you as a team and as a company start to realize that metrics are behavior: human behaviors with your product, stated in numerical terms. And so by committing to a metric and aligning around a metric, the hope is, and I think the deeper purpose is, really aligning around the intent of your product in your customers’ lives. And because words can be so facile and hard to wordsmith or hard to really lock into our goals and the why statements and the purpose statements, I think metrics can be a more specific and clarifying tool for teams that are trying to rally around similar goals.
Joe: [00:08:38] So as you know Sean, metrics are great to rally around. But if it’s not the right metric it can be dangerous. So Kate, what do you think makes for a valuable metric? Like how do you know where to even start with knowing what metrics to use or what not to use?
Kate: [00:08:51] I love that. You know, I’m going to back up a little bit and tell you how I came to this work because many of my colleagues would be like, “it’s odd that you’re talking about metrics,” and I wouldn’t disagree. But when I was at Luxr with my co-founders, one of the elements that was so important for early startup teams was figuring out how they would know if their work was working and if some experiments they were delivering or the product content they had was solving a problem. And we immediately looked to the world of measurements to help validate that and to figure out if there was something there. And the world of metrics, you know, there’s a ton of smart people working on that. I mean metrics and business performance have been out there for a long time, right. So I don’t in any way profess that I have a big, broad picture of all of that expertise. I think that would be impossible. But one thing that it did help us do is realize that, for every designer and every person that works on a product that doesn’t have an effective metrics program in place, I hope the question that keeps them up at night is, “how do we know our work is working?” And that’s such an insidious question. And the more I heard it from the startup founders, the more I started to internalize it for our own work, and it was a question that I had to start to reconcile with. Because delivering an interface or a design or a feature or even releasing a product was no longer sufficient. That wasn’t done. It was, “what has changed in the world and in the use by our customers as a result of that release?” And so when we talk about the right metric, it’s kind of like saying, “well what’s the right personality to be successful?” There is a wide range of things that can work for you, but the right personality to be successful is the personality where you feel you’re the most you.
Joe: [00:10:31] Right.
Kate: [00:10:31] Right, like you feel you can truly be yourself and be that best version of yourself.
Sean: [00:10:35] I love that.
Joe: [00:10:35] And it’s personalized too.
Kate: [00:10:35] It is. And so here’s a quote that I love which is from Bokardo, who is Joshua Porter, and he says, “your metrics will be as unique as your business.” And that’s the kind of metrics that I’m talking about, especially for product success. So your company can have metrics related to sales or growth or market adoption or customers et cetera. But when we’re really getting down to the product level, when you phrase a metric, it should be specific enough that just by hearing the metric that you’re measuring, people should be able to tell what your product does. And so that’s kind of more the right metric. And unfortunately metrics are so tightly held that I don’t have a lot of numerical, specific examples because they’re just not… That’s my next phase of research is finding the companies who will open up about that. People are very cagey about their metrics because they’re so tied to their business performance. But I was working with a team and their sales numbers were totally off the charts. They were doing great. They had a consumer electronics product line that was doing super, super well. What they found when they started to look at it was that people were buying the thing but they were not using the thing. And all of a sudden there was this kind of flash of cold water, they’re like, “well okay, so let’s project how that might play out in the future, everybody buys this thing, nobody really uses it, and now it’s sitting around in your house unused, what do we think the endgame is going to be on that?” And it could be just a very quick kind of collapse of that business line, right. Because at some point people get smart. Like we pay for a gym membership because we’ve got all the best aspiration and goal to go every week to the gym, and after three months you’re like, “Why am I paying for this, I haven’t been once.” And everybody cancels in March because their New Year’s resolution didn’t get them to the change in behavior. And that’s the kind of fragility that I want to help our teams avoid. So when you’re talking about the right metric, it would be the kind of metric that’s intrinsically connected to the purpose of your product in your customer’s life, and in a way that’s instrumented through the product interface or behavior so that you can start to count the use of that product in that way and then hopefully improve it intentionally over time.
Joe: [00:12:40] Got it. And so, you know, I go to these home pages a lot of times for products just to learn about them or whatever, and I see all these big numbers flashing in front of me that seem to be very impressive. And so back to your point there, how do you know when these metrics, and they’re called vanity metrics, they sound good, they look good, they make you think the company’s being really successful, so can you maybe define a vanity metric for us? Because I loved the way you stated it when we talked at a prior time.
Kate: [00:13:05] So this comes from the Lean Startup management philosophy of, there’s vanity metrics and then there’s actionable metrics. And I credit Eric Ries with that. There’s also some really nice practitioners working in Lean Analytics that lean heavily on this separation of, kind of, church and state. A vanity metric is a metric that only ever goes up over time and it doesn’t change your behavior. Like you can’t do anything about this. So when I talk to startup founders, they’re notably proud often of top-of-funnel metrics that grow over time like, “we had over 10,000 downloads,” or, “you know, our average time on our site is 30 minutes or 20 minutes,” whatever. Or, “we had 450 new updates this week.” And the thing about those numbers is they’re positive in that they help you feel good. And we as humans need to feel good because our work is hard. They can also influence stakeholders, they can influence investors, they can influence the public markets; so they have a real utility because people want to follow success. I mean they have to be honest, right. You can’t lie about your numbers, that’s fraud. But they do grow over time. What they don’t help with as design and product teams is they don’t help us understand if our product is working for our user. So you might have a real peak in adoption and get ten thousand downloads, but how many people have taken that download, opened your product, actually started to use it, and then came back to use it again to where they become a habituated user? And those are the numbers that tend to be much less sunny and so people don’t like to talk about those as much, but as product teams we have to look in that mirror and see those real actionable metrics, things that will change how we affect our behaviors for our products.
Joe: [00:14:44] Yeah I think one of the telltale ways to spot a vanity metric is when it’s the total of something with no timeframe.
Kate: [00:14:49] Right! Which brings us to, so well if that’s not a vanity metric, then what’s an actionable metric?
Joe: [00:14:56] Right.
Kate: [00:14:56] You know, one of the things that I really do in my practice is take the knowledge that I think people are exploring out there, because there’s a lot of people working on metrics, and try and make it into something that an everyday practitioner can integrate into her or his practice. And so on this one, I’m going to use the definition of actionable metrics that I learned from Lean Analytics, which is a book that was published by Alistair Croll and Ben Yoskovitz; I’ll probably mention it a couple of times. It’s a fabulous book in that it does talk about the different stages of maturity for an organization or company or a product and helps you identify some things that might be crucial at different time phases. But they were the most succinct I think in defining that actionable metric, which has some certain characteristics. It’s clear and specific. It is normalized, and I can go into more detail about that. It’s comparative, so there’s time differences, you can say, “what was it before, what is it now, what’s changed?” It’s actionable, meaning that the change in that number helps you and your product cohort make a difference in your product, actually do something about it. And then it changes your behavior, right, you actually want to do and need to do something about making that measure, that metric, move. And so those are the attributes of it, and where I think we’ve been struggling is, especially UX designers; product people I see as much more well-balanced in their metrics fluency. I think developers, and certainly data analysts and people for whom data is their creative material, they’re really well versed in this. But what I see is a real differential in designers being able to integrate metrics, and the goals that those metrics represent, into how we shape our actual product behaviors, the features we decide to work on, the purposes of those features. That is what I see as a real gap. It’s like we have expertise on one part of the market and we have expertise on the other part of the market, but what we don’t have in our companies is this consistent team commitment to metrics that change our behavior and help our products improve. And that’s the gap I’m hoping to fill.
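To make those attributes concrete, here is a minimal Python sketch, with entirely invented signup data and an assumed 100-user base, contrasting a vanity metric (a cumulative signup total) with a metric that is normalized and comparative (the percentage of the user base that is new, per week):

```python
from collections import Counter
from datetime import date

# Hypothetical signup log: (user_id, signup_date). All data invented.
signups = [
    ("u1", date(2019, 3, 4)), ("u2", date(2019, 3, 6)),
    ("u3", date(2019, 3, 11)), ("u4", date(2019, 3, 12)),
    ("u5", date(2019, 3, 13)),
]
total_user_base = 100  # assumed size of the whole user base

# Vanity metric: a cumulative total that can only ever go up.
print(f"Total signups (vanity): {len(signups)}")

# More actionable: normalized (a percentage of the user base) and
# comparative (one value per week, so change over time is visible).
new_per_week = Counter(d.isocalendar()[1] for _, d in signups)
for week, count in sorted(new_per_week.items()):
    print(f"ISO week {week}: {100 * count / total_user_base:.1f}% of the base is new")
```

The cumulative total can only grow; the weekly percentage can move in either direction, which is what gives a team something to act on.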
Sean: [00:16:50] I like what you said earlier about how your metrics should be about improving how your product is used. It’s not just increasing usage or, like we said earlier, the vanity metrics; it’s really about understanding the purpose and fit of your product and understanding how we’re going to measure that it’s actually being used to productively meet that fit, if that sums it up.
Kate: [00:17:12] It does sum it up. There’s a very small, though I think it’s growing into a huge, focus point around the nature of the metrics that people start to capture. Common metrics would often be around adoption. A good framework, especially for startup founders, is Pirate Metrics by Dave McClure at 500 Startups. And so it’s this categorization, it’s almost just categories, of types of metrics that you might need as your product starts to evolve, or if you’ve got a product already in the market, that you might try to better understand at a deeper level. And Pirate Metrics is: acquisition, activation, retention, referral, and revenue. And there’s been a lot of good play about that, you can Google it, it’s a great term. And the reason it’s called Pirate Metrics is the first letter of each of those spells AARRR.
Sean: [00:17:57] Aarrr!
Kate: [00:17:57] It’s kind of this pseudo-mnemonic thing. But of all of those, I mean to grow a product you need to increase acquisition, of course. To hook people in or to ensure that they’re connected for usage, you need activation. The real thing that I think drives overall product and long-term company success is really retention. How do we keep people? How do we ensure that what we’re delivering them is meaningful and valuable so that they will stick around? And then when that’s a positive cycle, you hope that there will be referrals. And then, I mean unless it’s a hobby, you gotta have some revenue and you want that to kick in. So really it’s that retention line. And looking across multiple frameworks for metrics, the one that most of them have is retention. And that has been where I’ve seen a whole heightened attention on retention, because that’s the element that demonstrates ongoing customer loyalty and success: trust, loyalty, advocacy, to use your own loyalty ladder that I know you use as part of your business.
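As a rough sketch of how those categories can be read as a funnel, the block below uses invented monthly counts and reports each AARRR stage as a conversion rate from the stage before it, which makes the retention drop-off easy to see:

```python
# Hypothetical monthly counts for each Pirate Metrics (AARRR) stage.
# All numbers are invented for illustration.
funnel = {
    "acquisition": 10_000,  # people who found the product
    "activation": 3_000,    # completed the key first-use moment
    "retention": 1_200,     # came back and kept using it
    "referral": 300,        # invited someone else
    "revenue": 150,         # paid
}

# Report each stage as a percentage of the stage before it.
stages = list(funnel.items())
for (prev_name, prev_count), (name, count) in zip(stages, stages[1:]):
    print(f"{prev_name} -> {name}: {100 * count / prev_count:.1f}%")
```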
Joe: [00:18:58] It’s the regulars at the bar.
Kate: [00:19:01] Right, yeah.
Sean: [00:19:01] So the loyalty ladder is very similar to the Pirate Model in terms of flow, except it’s from the user’s perspective, like how is the user exhibiting behaviors that would indicate trust, loyalty, or advocacy.
Kate: [00:19:13] I love that. I think that’s a terrific categorization and place to rally around.
Sean: [00:19:17] Yeah and in the long run if you don’t tie that back to your ultimate metric, which is profit and revenue, then you don’t have a business.
Kate: [00:19:24] Right. And that brings up the other point, which is something that, again I focus a lot of my messages on design teams, but that also means that they need to work well with product teams and with broader business units. But the business metrics can be very distinct from the product metrics. This is one of the examples that just irks me, but I’m not going to change the world in a day and I hope there are other smart people working on it, but the bestseller list. Like that is such a publisher business metric, right. But we’ve condoned this bestseller, like, “Oh I’m on the bestseller list. What’s the bestseller?” as this proxy for readership, right. We assume that if people buy it, then they will read it. And what I would really love to see instead is like the most-read list, or the most read and lent. That, I think, would be a much more meaningful metric for authorship. Maybe not so for publishers who are in the business of sales, but for authorship or for people who are convening ideas that are important and meaningful, I think readership and lending-ship is a much more astute behavioral, actionable metric. But we don’t measure those because we are so tuned towards a business metric that we haven’t really developed the same level of sophistication around usage. But now with our products and our digital products, we not only have the opportunity to do that, I think we have a mandate to.
Joe: [00:20:41] So we’re talking about metrics a lot. Just to make it a little more tangible for the audience, you know, we talk a lot about big companies like Facebook and Twitter and Snapchat. When they do their quarterly earnings calls, you can get some insight into the metrics they care about, and a lot of times you’ll hear about MAU, DAU, that is, monthly or daily active users, for example, and it’s really one of those metrics where you’re like, “yeah I get why people measure it but is it really the best metric?” So can you talk a little bit about an example, whichever one you want, of what’s a good metric versus a great metric. How do you take a metric and actually make it much better? Because something’s better than nothing in a lot of cases.
Kate: [00:21:17] Right, for sure.
Joe: [00:21:18] But then how do you make it those best ones, like you were saying before about making it normalized and all the other factors.
Kate: [00:21:23] Right. So to go back to Josh Porter, a great metric is going to be one that is unique to your business. But I actually have a step-by-step continuum that teams can use to move from basic or mundane, unhelpful metrics into something that’s truly awesome. And to do that, there’s this question of, “well, what does our product do for people?” And again, this doesn’t necessarily work up to the business metrics as effectively as it does work down towards more detailed things, like, you have a product, you have features, you have interactions. And so this level of questioning really helps drive forward building the right thing for the right reasons. And at Luxr we created this term of a key use. So what can someone do with your product that they can’t do without it? And unless you understand that, you don’t have the clarity to be able to identify a metric that would be meaningful for your product to measure. So that’s your first step, to have a hypothesis about what it is that that product, that feature, that interaction does for your customer that helps them complete something, experience something, or have a specific behavior. And interestingly enough, we call it key use, but this level of thinking is all over the place. There’s a concept of core action which has been popularized by Josh Elman at Greylock Partners. There’s critical event which some of the analytics platforms are starting to use. But there’s this concept through all of those that there’s this thing that people need to be doing in your product that you’ve gotta measure, and that’s the first step. So let’s walk through a continuum for an example. The example that I use, which is pulled from a variety of the startups I worked with at Luxr, is a consumer mobile app for task sharing: task management, sharing it, seeing things completed. And the key use for that is to share a task and to confirm that something was done. And it’s super simple. It takes a while to kind of get to that level of simplicity. So a metric that would be unhelpful for that product would be, like, sign-ups. First of all, it’s a category and it’s just, like, saying people might have given it a try, but it doesn’t actually provide any utility or use to them. A vanity metric would be total number of registered users because that’s only ever going to go up over time, it’s not comparative across time periods, it’s not normalized so it’s a pure number; it might be 50, it might be a thousand, it might be a million; but it doesn’t tell you anything about the behavior. Starting to get good would have some of the elements of the actionable metrics. So the percent of new users per week. Now that would be an acquisition metric, right, you’re trying to grow new users per week, but at least it’s a percentage, so of your entire user base, what percentage are new, and we’re going to measure that on a weekly basis. And that adds in normalized and it adds in comparable over time. Something better than that would be the percentage of users who sign in or who interact with the product three or more times a day, per week. So now you’ve added a specific behavior that your product needs to instrument or event, and you’ve got to capture it. You want to know the percentage of users who are really using your product quite a bit within this time span of a day. And then you’re gonna measure that weekly.
Now all of this, as you can tell, gets exponentially harder just to keep track of the numbers, much less the timing that the events would fire and how you visualize that, which is its own challenge. But we’re getting closer. A lot of products have habitual use, though; people use them multiple times a day, so that alone still doesn’t tell us what’s unique about our business. So something that would be awesome would be the percentage of users who share a task, three or more times a day, per week. So if you can capture the numbers at that level then you can start to say, “how do we move that number up so that we can get closer to 100 percent of our users sharing a task three or four times a day?” Now it might be that users don’t have that many tasks to share. So are there other things that they could share that would really involve them with your product and help them get more utility out of it? That again is a user research question, and user research is the place where all the best metrics come from because they’re unique to that task. So that would be a continuum, and most teams start out with, like, sign-ups or total number of users, and that’s a fine place to start. You have to start somewhere. But when you really are able to nail that core action, sometimes on a per-feature basis, and measure it, and start to adjust it and change interactions so that that number moves over time, that’s an extraordinary skill. That’s kind of the Holy Grail.
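Here is a minimal sketch of what the awesome end of that continuum could look like as a computation. The event log, the share_task event name, and the rule of counting a user who hits three or more shares on at least one day of the week are all assumptions for illustration; a real team would pull this from its analytics instrumentation and agree on the exact daily rule:

```python
from collections import defaultdict
from datetime import date

# Hypothetical event log for one week: (user_id, event_name, event_date).
events = [
    ("u1", "share_task", date(2019, 3, 4)),
    ("u1", "share_task", date(2019, 3, 4)),
    ("u1", "share_task", date(2019, 3, 4)),
    ("u2", "share_task", date(2019, 3, 5)),
    ("u2", "open_app",   date(2019, 3, 5)),
    ("u3", "open_app",   date(2019, 3, 6)),
]
weekly_active_users = {"u1", "u2", "u3"}  # everyone seen this week

# Count share_task events per user per day.
shares = defaultdict(int)  # (user_id, day) -> count
for user, event, day in events:
    if event == "share_task":
        shares[(user, day)] += 1

# Users who shared a task three or more times on at least one day this week.
qualifying = {user for (user, _), count in shares.items() if count >= 3}

pct = 100 * len(qualifying) / len(weekly_active_users)
print(f"{pct:.0f}% of this week's active users shared a task 3+ times in a day")
```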
Joe: [00:25:38] That was a great example. Thank you. And Sean, I think we’ve probably had every guest talk about this so far, about user research, we should probably just rename the podcast to Do User Research.
Kate: [00:25:47] That’s funny. You know, when Laura and I talk, the thing that we always end up with is task flows because I actually think that that is one of the elements that makes the designers so much different and better than others. Like task flows is it, and of all time our podcast on task flows is like our number one download. So yeah, you could redo the user research.
Joe: [00:26:06] It’s off brand, Sean I’m sorry.
Sean: [00:26:07] No, no worries. I agree. So one last question then we’ll begin to wrap it up here. Short term versus long term metrics. Is that something that you’ve thought a lot about? So as we’re talking a lot about technical sort of product metrics like how people are using it, but over the long run is there something different you might measure or think about measuring?
Kate: [00:26:26] I haven’t thought about it in those terms, but there was a really interesting insight that came out of work that I did when I was at Adaptive Path. I was on a project with Jesse James Garrett, who’s a fabulous practitioner. He wrote The Elements of User Experience. And we were working with a team that was looking at where they wanted the roadmap to go. I have a lot of opinions about roadmaps, but that’s off the topic so I won’t go into them, but projecting, like, what would allow this product to grow in healthy ways that would really benefit the customer, and of course enhance the business possibilities? And Jesse came up with this fascinating thing, he called it the More statement. And for that specific product, which was Gumtree, so it’s similar to, like, Craigslist for the UK, the More statement was that overall, for the product and business to be healthy, they needed more people making more trades or exchanges for more money with more other people more often. That’s a lot of mores, but what each of those allowed us to do as this workshop team was take those more statements and say, “OK, more people, what kind of more people? Who are your people?” And making more trades, “is there a cap on the number of trades or exchanges or sales people would make? How do we up that? How do we help them think about Gumtree in a way that allows them to think, ‘oh this is the first place I go anytime I need to get rid of something or acquire something.’” And for more money, their platform had some basis of the amount of money, but when you get into big-ticket items like cars or boats or big things, it didn’t feel like that was the place to do a high-level exchange like that. So how could they evolve into allowing those types of exchanges or trades? And were people mostly exchanging with people in their neighborhood or were they doing things at a longer distance, so how could they make exchanges with more people? And then how could people do it more frequently? And sometimes that’s enough of just doing some outreach and notifications to tell people your product’s still there if it’s not something that has a habituated daily use. And so each of those little more statements kind of created this population of ways that growth or invention could happen in our product, so that we could capture more of the market or better serve our market and then start up what you might consider that loyalty ladder of trust, loyalty, and advocacy. And that was an interesting way to get at that long-term thinking and to almost, in some ways, predict what kinds of metrics you might want to capture over the long haul as a comparative from now to then.
Joe: [00:28:46] And that’s where Moore’s Law came from.
Kate: [00:28:49] Well said, well said.
Joe: [00:28:51] Just kidding. That’s a joke, Google it if you need to.
Kate: [00:28:53] That’s right.
Joe: [00:28:54] So, you know, with metrics it’s all about having the data to have the metric, and what we see happening, which is on everyone’s mind, is things like machine learning and AI. How do you, or do you, see that playing a role in the metrics that companies care about, or should care about? How do you see that evolving?
Kate: [00:29:14] I do and I’m not quite sure how. I’ve been dabbling a little bit with the research around machine learning. I know there’s huge amounts of opportunity there. I’d say I’m a novice in that, but eager, and it’s on that threshold. So it’s that next level of thing that I want to explore. I think right now what we can look at is the rapidly changing landscape of analytics packages. At the minimum, you know, there’s Google Analytics, but at a much more sophisticated level they’re really starting to be more attuned towards specific product events and how we capture events, and to be more integrated, almost, with the development flow of a product. So I’ve been poking around a little bit with Amplitude, and I don’t advocate any one platform. I do know that if you just take Google Analytics out of the box, it’s all vanity all the time, right.
Joe: [00:29:56] You’re right.
Kate: [00:29:57] It can do anything. But without knowing what it can do, you don’t really get the utility out of it. But Amplitude has a really nice white paper where they define that kind of key user action as this critical event and then look at how that can be instrumented into your overall thinking to enhance retention. And I think they’re putting some nice thought leadership into it. Obviously it grows their market so it’s a good investment. But for people who are in a team where the quantitative expertise is already high, I think looking at those analytics packages, they’re all starting to say the same thing that UX teams and product teams have been saying, which is, “our product has to work for the people, and when we measure how it works for the people our products can improve much more dramatically,” and then the sales are the outcome of that. So that’s kind of the virtuous cycle I’d like to see kick in.
Sean: [00:30:44] You know, along those lines, I like to say that the right metrics can unleash creativity in all of your unicorns that you have working on your team.
Kate: [00:30:52] Oh hells yeah. Yeah. I do quite a bit of workshops, also with Laura Klein, and she has this fabulous thinking method. She calls it, “walk a metric up and walk it back down.” This is a common situation, I’m sure you all have been in it before, which is, someone who is highly influential, it might be a client if you do agency or consulting work, it might be a product person or a business person in your own company, and they come and they say, “I want this thing.” Like, “I want this feature.” Jared Spool calls it, “someone’s going to give us a big bag of money if we build this thing.” And the question is, what does that thing represent? And it’s not that people are stupid; we ask for solutions because they’re specific and they’re knowable and that’s how we think as humans. But what it is is an opportunity for any team to walk that question or that request for a feature through a set of inquiries and then come back with other new ideas that might better solve the problem that feature represents. And we all need help and guidance with this. So from the feature, like if you say, “we want product recommendations,” or “we want video for our e-commerce site.” Whatever it is, you say, “well what is the outcome, like when we have that thing, how will our product be different? Well if it’s product recommendations, we might have more sales per customer.” It’s like, “OK so if we have more sales per customer, how might we measure that in quantitative terms?” And there’s probably going to be discussion about that, but you might come up with something like, “well there’d be an increase in the average order size. OK, so in order to increase the average order size, what would that look like in our product in a very specific visual way?” And the behavior of the human is that they’re going to check out with more items in the cart. So if you’re really looking at that behavior, then you say, “how could we, in our product, help people check out with more items in the cart?” And it might be that the feature that would best be worth exploring or experimenting with doesn’t take nearly the effort or the focus that creating brand new product recommendations does, because there is a cost to that feature development: technical cost, interface cost, complexity cost for your users. So maybe it’s a free shipping threshold, maybe it’s an add-on product, maybe it’s 10 percent off a second or third or fourth item, maybe there’s some business logic that you can invest in that doesn’t take a development effort that would satisfy that behavior and therefore move that metric. And that’s the type of thinking that I think all of us on teams need to get very well versed in without being a pain in the ass about it. Like, without pushing back and saying, “well that’s just a feature, why do you need that?” Like we don’t deserve to be shirty, we deserve to have open and empathetic inquiry into the real purpose behind what people say when they speak in solutions.
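As a sketch of where walking it back down can land, the block below compares invented order data for a control group against a group shown a hypothetical free-shipping threshold, looking at the behavior the product-recommendations request was really pointing at: items per checkout and average order size.

```python
from statistics import mean

# Hypothetical orders: each tuple is (items_in_cart, order_total_in_dollars).
control_orders = [(1, 24.0), (2, 41.0), (1, 19.0), (3, 62.0)]
variant_orders = [(2, 45.0), (3, 58.0), (2, 39.0), (4, 81.0)]  # saw the threshold

def summarize(orders):
    """Average items per checkout and average order size for one group."""
    return mean(o[0] for o in orders), mean(o[1] for o in orders)

for name, orders in [("control", control_orders),
                     ("free-shipping threshold", variant_orders)]:
    items, total = summarize(orders)
    print(f"{name}: {items:.1f} items per checkout, ${total:.2f} average order size")
```

With real data you would want far more orders and a proper significance check before concluding that the cheaper change moved the metric.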
Sean: [00:33:32] Great advice. All right, last question I promise, and you can use one of the books that you’ve recommended earlier, but we always look for, like, the book that you’re reading now or you’re most likely to recommend or that you have recommended or given as a gift. Just kind of collecting data on people in our space, what you’re reading and what you’re up to so we can share that with our audience.
Kate: [00:33:52] Sure, I’d be happy to. I love me the book so I usually have a pretty full list of them that I’m reading. My favorite one on this topic is Lean Analytics. There are a few other quantitatively-focused books, most of them are about measuring usability, which is distinct from measuring the use or retention, and those might be helpful for specialty teams, but for a generalist mindset about how numbers can amplify and affect your work I think Lean Analytics by Alistair Croll and Benjamin Yoskovitz is a must buy. I give it to all the teams that I work with. I use it myself. I think that it’s great. The second book, which I would love to pump up a little bit, but full disclosure I was involved in its production, is a book by Laura Klein, Build Better Products, and for a generalist team that’s looking to really focus on growth and doing the right kinds of work instead of getting caught up in the trappings of interface details or, “is it pretty?” I think it’s a very powerful book. I provided the illustrations for it. Laura doesn’t sketch so I had a hand in that, but the reason that I participated is because I feel so strongly that she’s got a terrific take on how product teams can be more effective.
Sean: [00:35:01] All right. Well thank you for joining us. And thanks for participating in the ITX UX Conference too; you were fantastic, and we got great reviews for the conference.
Kate: [00:35:08] It was a fabulous event.
Sean: [00:35:09] Thank you.
Kate: [00:35:10] You all know how to throw a good event. I hope you do that one again because people should go.
Sean: [00:35:14] A little plug for the ARTISANworks in Rochester too, the venue was amazing, right. A neat place.
Joe: [00:35:18] Yeah.
Kate: [00:35:19] You know, having an event with a bunch of creative problem solvers in a very creative, visually enticing place was a real nice fit.
Joe: [00:35:30] Yeah it was nice. So for anyone who wants to follow your work, follow you, where can they find you? Anything you want to plug?
Kate: [00:35:36] Sure. I have a personal site, my little corner of the Internet, at intelleto, I N T E L L E T O, dot com. It’s a word that was coined by Michelangelo about the inherent intelligence of art in a material, so I really respond to that. And then I’m on Twitter at @KateRutter.
Joe: [00:35:53] Very cool. All right. Well this was a great discussion about metrics. We hope it was helpful. And thank you so much for joining us.
Kate: [00:36:00] It’s been a pleasure. Thank you.