FEB 10, 2022 - 26 MIN

Instrumenting Your Flywheels w/ Amplitude's John Cutler

Intro

In this DevGuild: Dev-First GTM presentation, Amplitude's John Cutler offers a founder-focused blueprint for enabling hyper-efficient PLG at your early-stage startup.

Outline
  • Instrumenting Your Flywheels
    • What I've Learned: The 5 Steps
    • Asking Yourself the Important Questions
    • The Importance of Focus
    • Step 1: Pumping the Brakes
    • Step 2: Clearing Your Head
    • Going Down the Rabbit Hole: Don't be Afraid!
    • Step 3: Converging on a Qualitative Model
    • Example: The North Star Statement Exercise
    • Step 4: Measurement and Tracking
    • Step 5: Prioritize Areas of Uncertainty
    • Summary and Final Thoughts
Transcript

Instrumenting Your Flywheels

My name is John Cutler. I work at a company called Amplitude as a Product Evangelist. I have a background in product management and UX research. I'm not a developer, but I've worked very closely with developers for a lot of my life. I actually tried being a developer and my nickname was Pseudo, so then they just told me to step away. That's my admission for the day. If you need to reach me after this talk, feel free to reach me on Twitter @johncutlefish or at john.cutler@amplitude.com if you have any questions. So let's jump in.

The topic for today is instrumenting your flywheels, and this is what I've learned from doing hundreds of workshops now. Many of these workshops are with developer-first tech companies or people who are thinking about developer go-to-market motions. So I'm happy to share what I've learned.

What I've Learned: The 5 Steps

In this talk, we're going to go over five key steps to thinking about instrumenting your flywheel. You'll note the talk is also about measurement, and a little bit about metrics and models, but we're going to try to get you through these particular goals.

So, an overview of the talk: first, I'm going to try to encourage you to pump the brakes. I know this is a pretty exciting time, but step back a little bit. Second, how to get the thoughts out of your head about your strategy and your assumptions. Third, converging on a qualitative model, which will help you instrument your flywheels a little later. Fourth, we get into the meat of it with measurement and tracking. And then fifth, okay, now what? How do you prioritize areas of uncertainty and where to focus?

So, having been there myself at one time, and also talking to a lot of founders at this particular stage: this is a very exciting time. Assuming there's some uptake of your product and you're getting good feedback, it's like a roller coaster where you're just starting to crest that little bit of the hill and you can really feel the momentum pulling you. Very, very exciting time.

But there's one thing that's really interesting about this time, as you probably know.

Asking Yourself the Important Questions

Now there are questions everywhere. You're asking yourself questions, your investors are asking you questions, your customers are asking you questions. The questions on this slide are very, very realistic, with only minor differences.

For example, Nice Job questions: "This customer went for 5K a month, how many other companies are out there like that? Why did they go for it? Why them? What was the magic moment of your product? How can you replicate this? How can you make it repeatable?"

My favorite, Good Job questions: "Wait, they were super early adopters, weren't they? Are you having any luck with those non-super early adopters?" That's one of my favorites. "Where are people getting stuck? Are they just playing around? Are they just kind of getting it going in their environment and not really using it? What should we actually charge? How do you compare to THE benchmarks?" We've all heard that one.

So you're getting peppered with questions all the time, and you're asking yourself these questions. And this is a very difficult spot to be in: the excitement's there, but then all these questions are coming at you and you're like, "Oh geez, what do I do now?"

The Importance of Focus

Importantly, the problem here is that you're also in a position where you really, really, really need to focus. If you get spun around trying to answer all these questions, you will not focus on your product and your company, and that could be very difficult.

And if we can't get you to

the spot where you're able to say, "I don't know right now" about a hundred times a day so that you can say, "I know X one time a day,"

you're going to have a difficult time focusing. It's like a one-to-a-hundred ratio.

So hopefully this kind of sets the stage for maybe where you're at. I've been there, a lot of the people that I do these workshops with have been there before. And so I know how it feels, and hopefully these steps will help you at this moment.

Step 1: Pumping the Brakes

So the first thing you're going to need to do is to pump the brakes and step back. Like I said, it's a pretty exciting time. And what you are going to desperately want to do is find answers immediately. And anyone who pretends to have the answers for you, you're going to gravitate towards those folks. You're going to want to know what everyone is doing. You're getting all these questions and you're going to look for answers immediately.

And the reality is, just a couple days ago, I was on a call with a VC firm and they were asking, "Well, you know a lot about this measurement stuff and you're at Amplitude. What are the magic PMF metrics? What are the magic benchmarks? You wrote the North Star Playbook." Which is something that I did. "We need exactly what this company should have. Can you figure it out for us?" And I had to break it to them that

there are no magic product-market fit metrics, no magic benchmarks.

There are some SaaS benchmarks that are reasonable but no magic sort of product behavior benchmarks.

Typically, if you're going to copy someone's north star metric, are you exactly like that business? So, that's an interesting question. So, I hate to break it to you at this part, there is no silver bullet. But, one thing I would add is,

this is why there's an opportunity in your business.

So whenever a founder comes and says, "I thought everyone had the answers for this and they don't. And I'm suffering an existential crisis at the moment." I remind them that if there were answers to all these questions, you might not have the opportunity that you have. So we're going to get a little bit into that. But I think at this moment, you need to just accept that there's no silver bullet answers. And then you need to start doing the work. And so that's what we'll talk about in some of these next sections.

Step 2: Clearing Your Head

So, step number two: you need to be able to get clear, and get your thoughts and assumptions and beliefs and hypotheses out of your head into some kind of form where you can look at them and think about them. Now, this is critical. If it was you and a co-founder, just a handful of you working on this, there were so many beliefs that were implicit in the room. And you probably functioned really well with that being the case, just a lot of implicit beliefs.

I was laughing the other day with a founder and they said, "Oh, we did a lean canvas, but I haven't looked at it now for three years because we've just been flying at this point. We didn't need to revisit all that stuff." But as you start to get to this inflection point where you're getting a lot more questions all the time, you are going to need to make them a little bit more explicit. And frankly, they may have changed since you did those original activities. So these things are important.

Now we're going to start with really basic questions. But one thing I want to let you know is that a lot of people are very resistant to going down the rabbit hole now. They feel like making it more complicated at this moment somehow indicates that maybe they don't know what they're doing or that they don't have a clear plan. At this point, I definitely encourage you to make it safe to go down the rabbit hole and expose all the beliefs and assumptions you have. So let's start with some basic ideas and basic questions.

So I'll often meet a team and they'll say, "Well, John, what should the magic metric be?" And I will say, "Well, you know your product strategy. What would you expect to observe if your product strategy was working, and if your product was working for the personas that you're building the product for?" And then they'll just jump into it immediately: "Well, we would expect to see them deploying it in these environments and doing this and being more productive. And we would expect them to be telling their friends about it. And we would expect all these particular things."

All of that is data. That is a hypothesis. And at this point, it's probably safe to say too, that you're not going to have a huge sample of people who are maybe doing all these behaviors. This is a little forward looking at this moment. But it's a very important question to ask yourself, if the product strategy was working, what would you expect to observe?

These are all like little games and tips that I use during the workshops that you might find valuable. I'm not telling you, "definitely do this activity." I'm saying these can be valuable.

So the second one I really like is quite standard at this part in the process, but someone is going to ask, "What's your ideal customer profile?" And people go through these various activities to do it.

I like to keep it really simple. I'll say to someone, "You're on LinkedIn and I don't want you to get a thousand search results back. I want you to tell me how aggressively you could target, such that you would get a group of 100 people where, if you put them into your product, you could convert maybe 30-50% of them and they would become passionate customers for what you're doing."

The reason why I take that tactic is that teams tend to want to keep their available market really open at this point, and they are very cautious about hyper focus. But when you're trying to surface your beliefs about what your ideal customer looks like, or what the signals might be that you need to instrument for and measure, it helps to actually be really aggressive. Like, who are these people?

Going Down the Rabbit Hole: Don't be Afraid!

Now, I mentioned it's not bad to go down the rabbit hole, so I'll show one technique that can be helpful for thinking about your customer journey. There's a standard idea that a customer journey is a funnel. At Amplitude, we actually prefer to think of it as an engagement ladder, a ladder of steps where, if you've ever played the game Chutes and Ladders, you can go up a ladder and then you can drop down the ladder, and there are various things in between.

So I'll walk you through this, just kind of zooming in so you can see. Like we were talking about with that LinkedIn example: before they've even encountered your brand or product, what are the signals that we might observe about them? And you can imagine at this point that there's this kind of initial activation step in your product, which is a little bit of a flywheel. But what are we observing? What's spinning up at that point? What variables are leading to other variables, which are leading to other behaviors? And at a certain point, they hit this kind of early aha in your product where it starts to click.

Now, you'll notice the percentages here. The way I like to think of it is: maybe 10% of the people at this first step, we believe, will become loyal customers. At this next stage, maybe 30% of the people who hit this aha might become loyal customers. And you can see how this kind of continues. There are little off-ramps. So there are people who stall out kind of early; the product wasn't right for them. What were the signals? What did they do? What did they encounter?

Now, for the people who deepen their engagement in the product, the probability becomes a little higher: maybe 30, 50, 70% of these people will become loyal customers. What do we observe when they deepen engagement? So it's like another flywheel here.

And then you have this kind of idealized loyal customer, advocate, super fan; the product strategy has worked. Now, even then you can lose them. Even then, sometimes it's losing your most passionate customers that's the hardest to take. Like, you thought they were shoo-ins to be a customer for a long time, and a year into it, they're kind of churning out. You don't need to necessarily do the activity just like this, but you can see that

getting anything you can out of your head is helpful.
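To make that concrete, here is a minimal sketch of how a ladder like the one above could be written down; the stage names, percentages, and signals are illustrative assumptions, not a prescribed model:

```typescript
// An engagement ladder: each rung carries a belief about how likely someone
// at that rung is to eventually become a loyal customer, plus the signals
// you'd expect to observe there. Every value below is a placeholder to edit.
interface LadderStage {
  name: string;
  pLoyal: number;            // believed probability of becoming a loyal customer
  expectedSignals: string[]; // what you'd expect to observe if this rung is "working"
}

const engagementLadder: LadderStage[] = [
  { name: 'First touch / sign-up', pLoyal: 0.1, expectedSignals: ['created account', 'invited by a peer'] },
  { name: 'Early aha',             pLoyal: 0.3, expectedSignals: ['deployed in own environment', 'ran first real workflow'] },
  { name: 'Deepened engagement',   pLoyal: 0.5, expectedSignals: ['weekly active use', 'second teammate joined'] },
  { name: 'Loyal advocate',        pLoyal: 0.9, expectedSignals: ['recommends to other teams', 'expands usage'] },
];
```

Writing it down this way also makes the off-ramps visible: any rung where the expected signals aren't showing up is a place people may be stalling out.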

We won't go too in-depth into this particular flywheel, but this is actually from Amplitude. And you can see this is not a simple model, but it surfaces our idea: the more usable the data is, the more access to data you get. And then there's a little flywheel here about improving the data literacy of the organization.

You can see that that kind of ends up in insight quality, and that improves decision quality and decision speed, which improves outcomes. And of course, an outcome that no one knows about is actually not all that great. So as insight quality improves, the ability to measure outcomes improves, visibility of outcomes improves, and so on and so forth.

Now, what I'm trying to show you here, almost as a shock-and-awe technique, is that you can't go too deep at this particular point. You're really trying to surface any and all assumptions about how you think growth happens with your product, what makes ideal customers, and what the ideal behavior is. Do not hold back at this point, because this is where you're trying to get all the beliefs out of your head.

So, to summarize this particular step again: don't be afraid to go down the rabbit hole. If you feel like you're making stuff up, that is okay, because later on we're going to prioritize where you need to reduce uncertainty. So if you feel like, "Oh my goodness, this is all over the place and I don't really know what's happening here," again, just see uncertainty as a signal.

And the really important point here is that you'll have advisors, investors, or whatever, and these are a great resource to reach out to. But there are two ways you can reach out to an advisor. You can say, "Well, what's the answer? Is it share with three people or share with five?" That's maybe not a great example, but you get the idea.

More important, if someone has had experience in the space that you're in, ask them to walk you through their thought process, their assumptions, and their hypotheses at your stage of growth, so you can learn from them. Because at this stage, everyone wants the answer, but you really need their help in going down this rabbit hole and wrangling with uncertainty.

Step 3: Converging on a Qualitative Model

Cool. So now you've gone really, really broad. Now we start to converge on what I would call a qualitative model. We're not really at a quantitative model at the moment. One could think of this almost as a mind map, or just some boxes and arrows with words in them that help you start wrangling what you believe. What's extremely important is that often at this stage, people say, "Well, I don't know if we could measure that, so we're not going to worry about it."

Now, I was working with a team in the security space, and there is this very qualitative element of trust and a sense of security and competence. They did this activity and they just kept surfacing that qualitative idea. But then the data person kept coming in and saying, "We can't measure that, so we're not going to worry about it. Our investors will never buy that. We have to think about other things." I say that is not right.

Powerful ideas with imperfect measures will trump perfect measures of less powerful ideas.

So I want to impart that thought to you at the moment: at this point, when you think about converging on a qualitative model, do not censor yourself based on whether you think it's perfectly measurable right now.

Example: The North Star Statement Exercise

So let me give you an example. I kind of adapted this a little bit for a developer-first play, where there's some hypothesis around a growth loop or flywheel around developer adoption. I really like using fill-in-the-blanks; it's very helpful for teams. So at this point it says, "Our path to medium- to long-term sustainable and defensible and differentiated growth is a function of our ability to ___."

And a lot of the Dev GTM ones will be things like, "use something in the technology to wedge ourselves into some kind of workflow, help developers do something, and then make a step change in their productivity." This is glossing over the problem hugely, but there are many variations of something which basically amounts to, "use the technology to become ingrained in a company's workflow, empower developers, make them more productive at something, and then that will create the loop that we want."

Importantly, in the north star statement, we actually clarify some inputs. So maybe one input here is, "Well, one thing we need to do is shift their mental model when it comes to approaching some kind of technical challenge." Maybe the early, early adopters don't have that problem, but certainly later on, you have to maybe shift mental models.

Often they'll have an input like, "Well, they've got to get stuff into production, for real," or something along those lines. And then you basically have this adoption loop. So you have to help these early advocates of the technology or the thing you're doing rope in other developers, and so on. I'm simplifying on purpose.
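As a rough way to capture the output of this exercise, here's a sketch of the statement plus inputs written down as a simple structure; the wording and inputs echo the hypothetical developer-GTM example above and are not a template Amplitude prescribes:

```typescript
// A North Star statement expressed as data: one sentence of strategy plus the
// handful of inputs believed to drive it. All strings are placeholders.
const northStar = {
  statement:
    'Our path to medium- to long-term sustainable, defensible, differentiated growth ' +
    "is a function of our ability to become ingrained in a company's workflow, " +
    'empower developers, and make a step change in their productivity.',
  inputs: [
    'Shift the mental model for approaching the technical challenge',
    'Get the product into production for real (not just a sandbox)',
    'Help early advocates rope in other developers (the adoption loop)',
  ],
};
```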

Again, this is a qualitative model; we express it in words before we start to measure it. One thing that's pretty challenging at this point as well is that often a founder will say, "But we're not totally unique. Are you asking us to make everything unique? Is it bad if our stuff is not unique and we're copying some of what other people are doing?" Absolutely not.

So, if you're forming the qualitative statement of what you're doing and you find that some of the things are like, "Well, that's what our competitor is doing, and that's what the company we're trying to disrupt is doing," that's not a problem. I ask every team, "What are you striving to be best in the world at?" to figure out that kernel. If you're going up against these big companies, you'd better be best in the world at something, hopefully.

But then I remind them that if you're best in the world at one thing, there are a lot of other things you don't necessarily need to be amazing at. I showed you this pretty generic example; maybe one or two of these inputs would be the same across different companies. But ideally, something up here in the north star statement would be unique. That comes up a lot and it's important.

Step 4: Measurement and Tracking

So, great. You've done something to form a qualitative statement. You could use a mind map; there are many ways to do it. But you're able to express all those beliefs you went really broad on, and you're able to get clearer on them. So now we get to measurement and tracking. At this point, every company has questions, and there are two broad paths to figuring out what to instrument.

There's one which I would call the more traditional self-service kind of BI analyst approach. Which is, "we have to list all our questions out. And then we need to define all our success metrics and then we need to create dashboards. And then from there, we will create a tracking plan. And then from there, we will generate insights."

Or, what I recommend: I've found that at this point you will have so many questions, and every question will inspire other questions, that instead I have the team focus on, "What are the key actions and the things happening that you care about?" Now, I do suggest that they come up with some representative questions, but I don't belabor that point. So we're looking for the nouns and verbs and adjectives and adverbs of their particular product. And from there, we form a tracking plan, and then insights.

I'll show you an example so you can see what I mean. This company here, with a logistics product, described in plain language what the customer journey was. And then from there, you can see they started with events like template create, template save, template publish.

It's a myth in the market that you can't figure out what you need to track ahead of time.

You can get 90% of the way there just by thinking in a disciplined way about the journey, and then having a good product like Amplitude or whatever that can derive insights from those core events. So there's measurement and there are metrics; they're not the same thing. And if you get the core events and actions tracked, you'll be in good shape.
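For illustration, an event-first tracking plan of the kind described above can be as simple as a short list of the product's nouns and verbs with a few properties each; the event names echo the logistics example, and the properties are invented:

```typescript
// Event-first tracking plan: the core nouns and verbs of the product, each
// with a few properties so the events can be sliced later. Property names
// here are invented placeholders, not the company's real plan.
const trackingPlan = [
  { event: 'Template Created',   properties: ['templateType', 'source'] },
  { event: 'Template Saved',     properties: ['templateId', 'editDurationSec'] },
  { event: 'Template Published', properties: ['templateId', 'teamSize'] },
];
```

The point is that this list falls out of thinking about the journey, not out of an exhaustive inventory of questions.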

When it comes to models, I won't go too deep into this point, but this is a qualitative model for a big bank. At the end, they focused on minimally viable measurements. So they went back to their map of events and said, "Okay, part of having a bank is funding accounts effectively. For funding accounts effectively, what's the minimally viable measurement? Did we track those events? Yeah. Okay, we did."

So it's a way to pressure test the events that you've tracked and turn a model into something concrete. The main point I want to make with this slide is that measurement comes last: minimally viable measurement, based on your qualitative model, is really the way to go.
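One lightweight way to run that pressure test is to walk each element of the qualitative model and ask whether at least one tracked event sits behind it; a minimal sketch, with the model elements and event names invented for illustration:

```typescript
// Pressure test: does every element of the qualitative model have at least
// one tracked event behind it? All names below are invented examples.
const modelElements = ['Account funded', 'First transfer completed', 'Recurring deposit set up'];
const trackedEvents = ['Account Funded', 'Transfer Completed'];

for (const element of modelElements) {
  const covered = trackedEvents.some(
    (event) => event.toLowerCase() === element.toLowerCase(),
  );
  console.log(`${element}: ${covered ? 'tracked' : 'MISSING a minimally viable measurement'}`);
}
```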

Often at this point, someone says, "Adding the events is difficult. We don't want to bother our engineers with it." The problem with that view is, first of all, these are single lines of code; this is not difficult. Second, there's this impression of a kind of linear relationship, that you have to instrument one event to get one insight. Actually, it's a little bit more like a Fitbit, in the sense that even 20 events about the key nouns and verbs in your product can unlock thousands of insights.
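To give a sense of what "single lines of code" means in practice, here's a minimal sketch assuming Amplitude's browser SDK (`@amplitude/analytics-browser`) and its `init`/`track` calls; the event name and properties are invented, and any analytics tool with a similar API would work:

```typescript
import * as amplitude from '@amplitude/analytics-browser';

// One-time setup somewhere in your app's bootstrap code.
amplitude.init('YOUR_API_KEY');

// The "single line of code" per key action: name the noun/verb and attach a
// couple of properties so the event can be sliced later.
export function onTemplatePublished(templateId: string, source: string): void {
  amplitude.track('Template Published', { templateId, source });
}
```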

If each of those has five or 10 properties connected to it, that is a vast number of things. I think in Amplitude we have 20 insight types. So multiply everything: 20 insight types by 20 event types, each with 10 properties. You can kind of see what I'm getting at: the technology is there to create a lot of great insights if you have the core data. And so, crap in, crap out, you want the stuff to be right. But don't obsess about getting every question right.
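The back-of-the-envelope math behind that "Fitbit" point looks roughly like this; the numbers are the illustrative ones from the talk, not a spec of Amplitude's catalog:

```typescript
// Rough fan-out from a small, well-chosen event set. Values are illustrative.
const insightTypes = 20;       // e.g. funnels, retention, segmentation, ...
const eventTypes = 20;         // the core nouns/verbs you chose to track
const propertiesPerEvent = 10; // ways each event can be sliced

// Each insight type can be pointed at any event and sliced by any property.
const roughInsightSpace = insightTypes * eventTypes * propertiesPerEvent;
console.log(roughInsightSpace); // 4000 starting points from ~20 events
```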

Step 5: Prioritize Areas of Uncertainty

So finally, let's bring this to a close with step five. Now you have to prioritize areas of uncertainty and go. This is where the focus, focus, focus happens. The technique that I use with teams is to ask them, "Why do you need to learn, anyway?" And in this case, you'll find it's either to inform decisions, reduce uncertainty about assumptions, monitor progress, or make a model, which is what we've been discussing.

So I would ask everyone watching this, "What's the key decision you need to make?" And often it's something like, "Do we keep going deeper on functionality in a certain area? Or do we focus on broader adoption in the company?" So that is the kind of decision you need to make.

Again, at this point it's really helpful to unpack and go a little deep. So, for example, this company had to decide between a platform play or being best in class in a bunch of different areas. They unpacked the areas of performance, the decisions, the assumptions, and the models that they were building. You can see there's a little bit of a flywheel effect here that they were trying to measure.

So it helps at this point to cast a wide net to make sure that you're prioritizing the right things. Then you have to just make decisions; you have to focus. So, what are your prioritized areas of uncertainty, and what decisions do you need to make? Is the decision to focus on persona X or persona Y the most important one? Then you can put every other decision aside and bring everything to bear on making that decision. And you're using your analytics and your measurement to do that.

At this point, people get bogged down in the perfect-metric stuff or the perfect insights. The discipline to focus on the one decision you need to make, make it with the available data that you have, and increase confidence is really where it's at for teams.
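One lightweight way to keep that discipline is to write the decision down next to the minimum evidence you'd accept before making it; a sketch, with all of the content invented for illustration:

```typescript
// One prioritized decision, the uncertainty it hinges on, and the minimum
// data you'd accept in order to make it. All strings are placeholders.
const prioritizedDecision = {
  decision: 'Go deeper on functionality for persona X vs. broaden adoption across the company',
  keyUncertainty: 'Do teams beyond the early adopters reach the aha moment on their own?',
  minimallyViableEvidence: [
    'Activation rate for non-early-adopter teams over the last 60 days',
    'Five interviews with teams that stalled before the aha moment',
  ],
};
```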

Summary and Final Thoughts

So to summarize for today, when it comes to instrumenting your product and instrumenting your flywheel, first thing, pump the brakes and step back. Don't get caught up in this swirl of easy answers when everyone's overwhelming you at the moment. Second, get it all out of your head. It might have been implicit before when you founded the company, but you kind of need to revisit these assumptions now that you know more.

Step three: take all that stuff you've gotten out of your head and ask whether you can explain it qualitatively. Could you explain what you're making in words before adding measurement and metrics to it? Step four: think about what's happening in your product that you care about. So instead of getting wound up on the perfect metric, think about the nouns and verbs and adjectives and adverbs in your product, and instrument those to unlock the long tail of insights. And then finally, focus, focus, focus. What is the most important decision you need to make? And what's the minimal amount of data you need to make that decision?

And so with that, hopefully this was helpful. And always reach out with questions, either on Twitter or by email; we'll maybe share those with the show notes for this. Thank you.