Don MacLennan
Customer Success: The Art Of Upselling

Don MacLennan is CEO and co-founder of Bluenose Analytics, a solution for online businesses to drive user adoption and retention. Don has deep expertise in customer analytics and customer experience management and has worked at every stage of company: start-up founder, pre-IPO, post-IPO, and big players like SAP and EMC.


Introduction

Don MacLennan: Hi everybody, how are you? Glad you could come out. Like all things in San Francisco it's like a small version of the United Nations here, so it's great. As Sam said, I'd like to share with you a perspective on why conversion matters and the key assumption is, many of you are utilizing trial to pay, free to pay types of models, where the idea is you make a developer fall in love with your product and if you succeed in doing that they might just recommend the product to be used on a broader scale in their shop.

And all of a sudden, you go from something that is zero revenue to a lot of money very, very quickly on the basis of this successful conversion event. So it felt like, in the grand scheme of customer success, that was going to be maybe an interesting topic to double click on tonight.

I've got an esteemed panel and they're going to be happy to entertain a lot of questions from you, not just about tonight's topic specifically, but all things customer success in their world and they all come from you know, very different companies in terms of size and stage. So I really welcome lots of interaction from you. So the format's going to be, I'll give you a quick talk and we're going to spend most of our time in the panel conversation, where I'm going to ask them questions and then we'll leave a substantial amount of time at the end for Q&A.

So if you would be kind enough to hold your questions until the end, that would be most appreciated, but we have lots of time for Q&A both during the panel and even for that matter afterwards. So I'm pretty certain we'll be able to field almost any question you've got, okay? With that let's get started.

The Art Of Upselling

Conversion and upsell, right, seem to be this really critical moment, and we're making the assumption that lots of Heavybit members are using this business model. In some respects, customer success at this stage in the customer journey is a little bit of a misnomer, because it's a bit more about user success, right?

That one individual is carrying a lot of weight in terms of their decision as to whether they like your product and the recommendation, or the purchase decision they're going to make on behalf of their company thereafter. So, provided you're not otherwise calling on that company, this user really becomes the center of your universe for at least a brief period of time and of course, upon successful conversion, things change again and someone like Ryan's going to have lots to say in terms of what happens post conversion, and how do you manage like a really big implementation?

So we can touch on that tonight as we go as well, okay?

Conversion is kind of a multidisciplinary thing in terms of getting it right. You need some actor in your organization who's going to manage that customer relationship as they go through this process, for however short or long it is.

It could be days or it could be weeks. You also need to think about product usability, because they are going to be in some self-sufficient mode given this model.

So the product's gotta do a lot of the work in terms of making it easy for that user to get to kind of their first moment of value or success. Alright, and then finally, we gotta think about analytics. What are we measuring and how's that driving our ability to convert them, but also on a broader scale, improve how we do it overall? And user engagement, which is to say, how are we going to actually nurture this user along some prescribed journey?

And as importantly, what are we going to do to intervene when they kind of fall off that course? Alright, that's as important as designing the happy path. What's the intervention when they tend to stray into the weeds? Okay so, that's a little bit about the topic tonight. I'm going to introduce the panelists a little later on after my prepared remarks, but we have some esteemed colleagues here, and they'll tell you a little bit about themselves.

So my take is that customers are really hard to earn. As great a product as you are building, it's still hard. It's still hard to successfully acquire customers. Once in a while here in startup land, right, we manage to get lightning in a bottle, wherein our products become viral through word of mouth or we just solve a really important problem that no other product does.

An Explosion of Choices

I would argue that that is the exception, not the rule, and for every other startup that has to chart some course to success, it's just this long, sustained, concerted effort, right? To earn those customers and retain them. So it's hard. In fact, I would argue it's harder than ever. So why is that? I think there's a couple really important trends out there. Customers have an explosion of choices.

This is just the mobile e-commerce space, right? If you go Googling, you find these so-called logo maps all over the place. Somebody at a VC took the time to compile one just for mobile e-commerce. That's not the world of e-commerce altogether, just mobile apps that do e-commerce, of which there are hundreds. I'm going to guess that, you know, there are at least two providers for every category of product and probably many more than that.

And you see this pattern across B2B and B2C, right? So we're all engaged in building B2B products, business products. This is the marketing technology landscape, right? The number of logos there is in the thousands, and it's actually expanding over time, not contracting. That particular marketplace is not consolidating, it's fragmenting, right? I mean, pity the buyer, right? Trying to figure out which of these vendors' products to buy.

And pity the supplier like you all, in terms of getting the attention of the buyer in the first place, given so many choices, alright? And I would argue this is true across every single sector no matter what the technology product is, right?

The customer is facing unprecedented choice. It's a great time to be a buyer in the history of technology. It's a challenging time to be a provider. Right, as a result, you've got more competition than ever. Okay, the other really important trend is that

We are conditioning customers to bring their consumer behavior to the workplace.

And one of those behaviors that we're all doing at home and in our personal lives is we are really impatient and we want immediate gratification. Right, we do not want to be onboarded, right? Such that it takes us hours, or days of elapsed time, before we gain access to a product and that moment of first value.

In fact, we start thinking about this process in seconds and minutes, and that's about all we're willing to invest in order to get some return, okay? So Slack is a highly successful product, and I would assume virtually everybody in this room is using it. That's what you do to get onboarded into that product. Alright, that's all you do. No credit card swipe. Literally, you can get your entire organization up and running on Slack in minutes if you're small.

One of the panelists told me they converted to Slack as a larger enterprise you know, over a weekend. Unheard of, because it's their core messaging platform. And we know Slack to be not just a place where people communicate with each other, but where systems communicate to Slack. So imagine all those integrations. Alright, doing that cut over in just a weekend. This is obviously a product right, that's going to give the customer immediate gratification.

Sales is Getting Harder

Because that's how simple it is to get started. Alright, and not to put too much of a dark cloud over this conversation, but it's not getting easier. These trends are all accelerating, right? Any given sector you're in is probably not consolidating, it's probably fragmenting, which means, to the extent the buyer used to have dozens of choices, they're now faced with thousands. Right?

We as suppliers used to control the buying experience. You'd have to sign up for a demo, we'd match you up with a systems engineer, the sales person would qualify the crap out of you as to whether you had budget and need and so forth. All this gate keeping behavior, right? Now, nobody tolerates that, right? They want to free trial and buy. And they don't want to be hassled by anybody in a sales and marketing function.

They want to be able to do it for themselves, and they do it immediately, right? Obviously subscriptions. We used to buy software perpetually. Now we subscribe to it. Sometimes not even by the year, sometimes just by the month, and of course we can use some of these products for free for as long as we like before we ever even make a first payment, okay? SaaS, integrating these products is easy, right? RESTful APIs have completely transformed how tech products get deployed, because those APIs are so simple that it's now really practical to wire stuff up using standard APIs.

It used to be, you'd send in a professional services team, right, go work with that customer, and build some glue code. It doesn't work that way anymore. Right, you publish a RESTful API and other people utilize it, and push data around, and it's amazing, right? But that simplicity has been transformative. And then finally,

People want to go live in minutes and hours, not weeks and months, irrespective of the product complexity that they're faced with.

Right, so all of these trends unto themselves are accelerating, right? Any one of which would make it hard on us as vendors, right? To convert and retain a given customer, and all of these trends happening in parallel just compound that, right? So it really makes this moment of conversion a very high stakes endeavor, right? Therefore it merits a lot of time and attention on your part to get it right, or so I would argue.

Okay so, if you could only do one thing that would affect conversion and retention, you know, what should you focus on because the world of customer success is super broad, but in the context of tonight's conversation about getting free or trial users successfully converted so that you have the ability to sell them something later or more, right? What should you do in order to have the greatest effect on conversion?

And my assertion would be that it should be about successful user adoption, not account adoption, right? Not company adoption, but user adoption. What is that initial experience that constitutes success? Alright, as measured in minutes and hours and days, to get them to the point where they're comfortable enough to make that purchase commitment thereafter, right?

And I would also suggest this: don't think of a 30-day trial as 29 days of not doing anything, and then an attempt to convert on day 30, right? Instead, I would argue a 30-day trial is very front loaded, wherein the buyer is going to make their purchase decision very soon, within minutes, and hours, and a day, or two, or three, and the remaining 27 or so days, right, are not particularly productive from their point of view in terms of what they need to do to reach a purchase decision.

Even when you have a longer trial period, there are really these formative moments that are much earlier in the life of that trial and much more immediate.

Listen, Learn Engage Framework

So I'm going to share with you a framework in order to try to deconstruct this. So if this problem is so big and so important for you to do well, what's the framework? So listen, learn, engage is what I suggest you think about. Listen is about, well, what kind of data do we really have to have in hand, to understand that particular trial or freemium user, as well as understand of course, the broader pattern of how people are converting, right?

Learn is how much sense can we make of the data? Alright and then engage is how do we take that data and apply it to how we engage an individual user, especially when they fall off the happy path? So what's the trigger by virtue of data, that's going to cause us to go intervene and offer them some support and assistance, get them back on the happy path and towards conversion?

Okay so, listen, learn, engage is the framework. And with that, we're going to double click on those three components of the framework, and I'm going to let the panelists tell you a little bit about how they approach those three competencies. Ryan, would you like to introduce yourself?

Ryan Hoskin: Hi, I'm Ryan Hoskin. I run our customer support, customer success, and renewal functions at PagerDuty.

Chris Stolt: Hi, I'm Chris Stolt. I run customer success, which involves all the post-sales aspects of customer success. I work for Heroku.

Maddie Blumenthal: Hi, I'm Maddie Blumenthal. I also run customer success at Rainforest QA, which is customer success and support.

Listen to User Feedback

Don: Thanks everyone for joining. Okay so let's double click on this framework, called listen, learn, engage, and chat a little bit about the listen part. Firstly, in terms of feedback. So I want to hear from the panelists, how do they collect user feedback in this period of time especially, right? In around the trial, and around the free period, or soon after they convert.

And how they use feedback to kind of answer these key questions, and maybe the other thing I'd ask you is, what have you learned from gathering that feedback that surprised you or drove your business forward? So who wants to go first?

Ryan: I can talk about the post-sale life cycle in terms of how we capture feedback. So, on our customer support team, we use Zendesk and we get our CSAT surveys, so, customer satisfaction. So after every ticket that's closed, we'll engage with the customer, ask them how their experience was, and based on the feedback, if they actually give us a negative satisfaction rating, we'll follow up with that customer and try to see that through until they're actually satisfied.

We also use Net Promoter Score. That's actually extremely helpful in trying to figure out the state of your customer's health. That is sent out as a survey. It touches our customers twice a year.

Every single individual end user actually gets that and the surprising thing about that is you're going to figure out tons of information about your product here. You're going to figure out who are your champions within the company. Who are the strong advocates within the company, who are the people that hate your product, what is wrong with your pricing model, what are the feature requests that people want to see?

So not only is that valuable to my teams, that's actually basically the source of information we pass along to our product team, our product marketing team. It's useful in all aspects of the business. So we take that information, that helps us figure out what are we going to do when we change pricing packaging, how do we find customer advocates that can be your source of testimonials and things like that. So all around it's a great source of information.
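(As a minimal sketch of the mechanics described here, this is how NPS responses might be segmented into promoters, passives, and detractors and rolled up into a score; the standard formula is the percentage of promoters minus the percentage of detractors. The response fields and follow-up lists below are hypothetical illustrations, not PagerDuty's actual tooling.)

```python
# Minimal sketch: segmenting NPS responses and computing the score.
# The response format (respondent, score, comment) is hypothetical.
from dataclasses import dataclass

@dataclass
class NpsResponse:
    respondent: str
    score: int        # 0-10 rating from the survey
    comment: str = ""

def segment(responses):
    """Split responses into promoters (9-10), passives (7-8), detractors (0-6)."""
    promoters = [r for r in responses if r.score >= 9]
    passives = [r for r in responses if 7 <= r.score <= 8]
    detractors = [r for r in responses if r.score <= 6]
    return promoters, passives, detractors

def nps(responses):
    """Standard NPS: % promoters minus % detractors, on a -100..100 scale."""
    promoters, _, detractors = segment(responses)
    total = len(responses)
    return 100.0 * (len(promoters) - len(detractors)) / total if total else 0.0

responses = [
    NpsResponse("alice@example.com", 10, "Love the escalation policies"),
    NpsResponse("bob@example.com", 6, "Pricing model is confusing"),
    NpsResponse("carol@example.com", 8),
]
promoters, _, detractors = segment(responses)
print(f"NPS: {nps(responses):.0f}")
print("Champions to follow up with:", [r.respondent for r in promoters])
print("Detractors to follow up with:", [r.respondent for r in detractors])
```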

Don: Anyone else have feedback gathering programs either during the trial or thereafter?

Chris: So in customer success we don't use Net Promoter. That's something that's done, I believe, by our marketing team company-wide. There are things relevant to customer success in there that are good information, but what we really do, in support specifically, is ask, "did the customer get their question answered or their problem resolved?" We ask that at the end of every ticket. It's just a quick thumbs up, thumbs down with an option to add comments: was this ticket to your satisfaction? And we aim for a high satisfaction rate of 93%.

Ryan: I'll also add we use a tool called Intercom to actually gather customer feedback when we're rolling out new products and features. So we'll have like a beta test, or beta list of users, and our product managers will actually directly engage with customers to figure out, did this solve your needs? What problems do you have that it didn't solve? And try to help shape how the product is actively developed as well as figure out like pricing and things like that.

Don: And are any of you collecting this feedback during the early stages of a customer life cycle such as free trial or early post purchase?

Maddie: Yeah so, at Rainforest we do it a little differently. We don't necessarily have a free trial. We do a paid proof of concept period. So the CSMs are extremely involved in the process from the very beginning. Before the deal even closes, we come in just to scope out a use case and then try to set some goals for the trial period, and the CSM is engaged throughout the entire proof of concept period.

So we're gathering feedback throughout. We do anywhere from one to three months time, and so we will just take that feedback manually straight from the customer's mouth and pass that along to product or sales, or whoever needs to know, once they convert to an annual customer.

Don: Who's got some interesting stories about the feedback they've heard from early stage customers or more generally from these survey programs? Without airing dirty laundry.

Ryan: I have an interesting one, in that one of our biggest customers actually requested, as part of their onboarding and training, that after the completion of that, they have an average NPS of eight. Meaning that everyone's at least... And so basically that's really interesting, because we can validate the quality of our onboarding and training program by conducting that survey, and it gives the customer positive confirmation that the tool's actually useful and validates that they made the correct purchase.

Don: Great, cool. Anything else to add? Alright, well let's talk about the other type of listening, which is measurement. So in this case, what we're talking about is how that person is actually consuming the capabilities of the product, and I'll spend just a couple minutes framing this, because measurement is really an interesting topic. Well, obviously there are ways to measure your product in terms of how somebody might engage with the interface.

How often are they logging in and interacting with your product's interface? What are they doing in terms of clicking around with the main features that they're exercising and the like? So there's a user-centric way of thinking about collecting this data, which is just basically their interaction with your interface. There also however can be an equally important way to measure, which is, if they've set your system up to do something on their behalf and then it functions in some automated way, it's another dimension of measurement.

So I'll give you kind of an analogy. I think about marketing automation. So a marketer comes in and designs a campaign. So they're really active today because they're doing all the creative work of creating their email templates and so forth and they schedule that campaign to run next Tuesday. Well the measure of success of that campaign and therefore the features of the marketing automation tool have a lot to do with how many were successfully delivered versus bounced, unsubscribes, open rate, all these kinds of metrics are coming from this automated process that they designed, but ended up being the role of the system.

So you'd obviously want to measure the success of that marketing campaign according to those metrics, which had nothing to do with that person logging in to the marketing automation software on the day the campaign ran, if you get the difference, right?

System level metrics are as interesting as a way to think about product success and adoption success, as people's interaction with the interface.

It has a lot to do with the nature of the product you're building, but I really encourage you to think about both forms of instrumentation or measurement in order to have kind of the most complete picture of usage. So with that, how do you folks think about measuring usage during trial, during free, during pilot, and thereafter?

Maddie: So what's important at Rainforest is, in case some of you aren't familiar, Rainforest is in the QA space, so everyone deploys differently, QA falls under a different bucket, everyone has different quality goals that they're trying to hit. So setting really clear benchmarks during the proof of concept period is really important to us because if we have 10 different customers going through a proof of concept period, I guarantee it's 10 different deployments that we're dealing with.

So what we do is make sure that before the deal even closes, we set out really clear goals and we track towards those goals.

Don: Great, anyone else want to offer a perspective?

Chris: Yeah sure, so at Heroku, we support a lot of different languages. I think eight in total, officially. A lot of different people doing different things from enterprise customers to people just learning how to program. All the programming camps are usually using Heroku to teach people how to deploy and so, what we don't want to do is spend time with those users because we're not trying to convert them into paying users.

We're just trying to use the platform as exposure, but what we have is something called an HVA system. It's a High Value Action, and it tracks a lot of different things people are doing across the platform. So we're going to look for things like, when did they add a production database? Or did they get a spike in their traffic? Or did they have a significant change in their spend, or any number of different things that indicate that they're moving into, kicking the tires in, paid-user land.

That's when we're going to engage, because I think you hit on this earlier: especially developers, but people that are trying the freemium product, they don't want to be sold to. They don't want to talk to sales and they don't want to talk to marketing.

So we don't want to reach out to them in a sales capacity. What we really want to do is reach out to them in a success capacity, or an advocate capacity and say, hey, we saw you do this thing, are you getting everything you need?

Do you have any questions right now? And we're effectively going to end up selling to them, but it's approaching it from a... If they just say, no we're all good. We're going to back off and let them continue to do their thing but if they say, oh we have all these questions, or we tried this thing and here we go. Now we have a really good high value engagement and the amount of conversions that we're getting, the percentage and the ratio there is definitely up significantly.

Ryan: Yeah, we have a fairly basic behavior validation as they go through the trial. So within PagerDuty, it's things like adding another user, setting a schedule, setting up an escalation policy. That's how we determine that it's a marketing qualified lead, and that's who our sales team will engage with, and that seems to work really well for us.
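(A minimal sketch of what behavior-based qualification like this might look like in code. The event names, threshold, and routing below are hypothetical illustrations, not PagerDuty's or Heroku's actual criteria.)

```python
# Minimal sketch: flagging a trial account as qualified once it has
# performed a set of high-value actions. Event names and thresholds
# are hypothetical illustrations, not any vendor's real criteria.
HIGH_VALUE_ACTIONS = {
    "added_second_user",
    "created_on_call_schedule",
    "configured_escalation_policy",
}

def is_qualified(account_events, required=HIGH_VALUE_ACTIONS, min_actions=2):
    """Return True if the trial account has completed enough high-value actions
    to be worth a human touch (e.g. a success or sales outreach)."""
    completed = required & set(account_events)
    return len(completed) >= min_actions

# Example: a stream of product events recorded for one trial account.
events = ["signed_up", "added_second_user", "created_on_call_schedule"]
if is_qualified(events):
    print("Route this account to the success team for outreach")
```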

Learn From Usage Patterns

Don: Those two stories are great segues into what you want to learn. I think it's really important to know what the happy path is, and in both cases, I think probably because you've been around for quite a while in those companies, right? You've managed to decode that. What are these clear signals that you know have got really strong correlation to that successful outcome? Of conversion and expansion, for that matter?

Chris: I would add that it's equally important to really analyze that, and keep an eye out for the things that seem like they might be something that's high value, but actually aren't. So for example, in the early days of the HVA system, a lot of people were saying, oh, when they add a custom domain name, that's a thing, and we're like, kind of everybody wants to do that, and domain names are cheap, and it might just be a blog.

That's not really an indication of anything as it turns out. So we moved away from that into things that prove themselves to be of value.

Don: Yep. Maddie, any sense of what the happy path looks like for the POC, or to your point, is it a little bit custom every time?

Maddie: It's a little custom every time. We're at a little bit of an earlier stage than these two guys, but it's just important to track everything. So every POC we do, every customer that comes through our door, we're tracking their engagement, what they're doing. It's a lot of trial and error, trying to figure out what that happy path is. To Chris' point, something you think is the happy path might not be, and you'll find something kind of off the wall that actually determines health score. So we do a lot of playing around with every number and seeing what sticks.

Ryan: Yeah I mean, there can be some, like, false flags. We had a customer where, you know, they were triggering lots of incidents, they had all their users deployed, and if you were to just glance at the account, it looked like it was a happy customer. And then our sales rep went on site and was totally blindsided by the customer saying, we don't know what we're doing.

All the notifications were just going to mailing lists and no one was actually doing anything with it, and we actually took that feedback and incorporated it back into how we actually measure a customer's health now.

Maddie:

Just because your customers are using your product a lot doesn't necessarily mean they're happy.

We've also learned that lesson.

Don: This topic can be really daunting, right? I've heard people respond to this need and say, oh my god, how do I even solve for this? Quick, we have to hire a data science team and we've gotta do this logistic regression, and what are the variables and weak signals? And oh my god, you know, kind of throw up their hands and not even try. Because it just appears to be this monumental problem where they're trying to do it through data and machine learning and so forth.

For some companies, that's a valid undertaking, but probably because you're really big and you can afford the resources to even try. So to get practical about it, I think the advice was great. Just trial and error. Start with common sense. Start with a hypothesis. Validate it. Obviously you're working with really scarce data and small customer bases, right? So it's going to be hard to validate it, conclusively at least, to begin with, but it shouldn't stop you from trying, and it certainly shouldn't stop you from iterating.

And I can assure you that if you keep at it over the months and years, and you supplement it with these live customer conversations where you're doing some, you know, real direct observation that supplements the data you're looking at, you will absolutely discover this formula.

So I wouldn't let it be a daunting problem such that you don't start, and I'd also say don't be too hard on yourselves, in the sense that you should expect the trial and error, and expect it to take a little bit of time, measured in months and perhaps years, before you really get it to click. So that would be my advice, having spoken to lots of folks in these roles.

Chris: You know, rather than writing it off: the way to start is not with this really complex HVA system in place. It took years to build, and a lot of trial and error, but when we started it, it was one thing. It was, did they add a production database? And that was it. It was the only thing in there, and we were like, "oh, it could be all these different things." We're like, yeah? But we'll get there. Let's just make sure that this is working and that there's value in it, and that's sort of this really, really old Heroku mantra, which is everything's an experiment.

So when you're starting off, just treat it as an experiment. If it turns out that it's not working and you need to completely scrap it, it's okay.

Engage With Purpose

Don: Cool, so let's talk about engagement. We're measuring our users, we're asking them for feedback, we've got some sense of what the happy path looks like. We obviously might have a sense of what the anti-pattern looks like. What do we do with all that information? So how do you folks drive engagement in these periods? Maddie I think your blueprint's really clear. Maybe Ryan and Chris could add some perspective as well in their models, and then we'll come back to you.

Chris: So I think the HVA system itself is the thing that we're using to look for those things, and that's the engagement.

We're not reaching out to a customer and saying, "oh, you're using Heroku. Is everything going okay?" That's not going to start a good conversation. When we reach out to a customer we say, "hey, I saw you add a production database and you scaled your dynos up to production tier dynos and you started doing significant traffic. It looks like you moved something into production. How's that going for you?"

And if you just spend a little bit of time, for us at least they tend to have a public facing website, clicking through, learning about their business, you're going to ask really important questions and have a really good engagement, and it feels, to the customer, very personable, and it's because it is. And that's our customer advocate team. That's what they're doing. And I think the flip side to that is more of a retention thing, right?

Like, customers do fall off the happy path sometimes, so that's getting them back on the happy path. On the other side, we're working on the very early stages of the opposite of the HVA system: the NVA system, or Negative Value Actions. So we want to know, how and when can we predict that a customer's starting to fall into trouble, or they're sort of flailing, flapping around, and can we step in and help them?

Because it turns out that most of the time that we have spoken to customers and they've told us the problems that they had and the things that they couldn't solve, we're like, those are totally solvable problems on our platform. If you had talked to us, we could have fixed this. But it's not really up to them to come talk to us, right? We need to be recognizing this stuff and reaching out to them. So we're trying to put the right stuff in place.

So we're looking for, like, dips in usage, and a few other things. Like I said, early stages. When we can do that and engage at the right time, before they've already made the decision to move off, we can right that. That's going to really turn the customer back onto the happy path.

Ryan: I totally agree with that. So when we onboard new customers, their health score is basically zero, and we use that as a measurement to see how successful we are with our deployment. And for customers that have been around longer term, we use that as a way to, like, engage back with the customers, and like you said, we're very specific about what may be going wrong with their account.

We see you have a low number of incidents acknowledged. We see that you're not deployed all the way. Things like that. So we call those red flags, and over the years we've identified what those different things are, and it makes it very easy to have a conversation with a customer. It's not just a thing like, "hey, I think you might not be using PagerDuty right". It's more like, "here are the things that I think you could do to improve your account, and here are the ways that we can help you get better".
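(A minimal sketch of how red flags like these might roll up into a simple account health score and concrete talking points. The flag names, weights, and messages are hypothetical illustrations, not PagerDuty's actual model.)

```python
# Minimal sketch: rolling account-level red flags up into a health score.
# Flag names, weights, and messages are hypothetical illustrations.
RED_FLAG_WEIGHTS = {
    "low_incident_ack_rate": 30,          # incidents firing but few acknowledged
    "partial_deployment": 25,             # only a fraction of users set up
    "notifications_to_mailing_list": 20,  # alerts routed to a list, not people
    "usage_drop": 25,                     # significant decline versus baseline
}

def health_score(flags, weights=RED_FLAG_WEIGHTS):
    """Start at 100 and subtract a weight for each red flag present."""
    penalty = sum(weights.get(flag, 0) for flag in flags)
    return max(0, 100 - penalty)

def outreach_talking_points(flags):
    """Turn raw flags into the specific, concrete things to raise with the customer."""
    messages = {
        "low_incident_ack_rate": "We see a low number of incidents being acknowledged.",
        "partial_deployment": "We see you're not deployed all the way across your teams.",
        "notifications_to_mailing_list": "Alerts appear to route to a mailing list rather than on-call users.",
        "usage_drop": "Usage has dropped noticeably compared with your earlier baseline.",
    }
    return [messages[f] for f in flags if f in messages]

flags = ["low_incident_ack_rate", "partial_deployment"]
print(health_score(flags))            # 45 for this hypothetical account
print(outreach_talking_points(flags)) # concrete items for the check-in call
```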

Don: And so for all of you, this is a really proactive undertaking right? You're doing customer outreach driven by this understanding. How about in your case Maddie, post POC. I assume that you're still with them, working on those joint success criteria. Does it change once they're out the back end of that? Do you still engage them?

Maddie: Yeah, good question. No, the engagement stays the same. So once we get through the POC period, we set, you know, quarterly goals moving forward, and we do a quarterly business review to make sure we're on track to meet those goals. And then as a team, you know, we decide, if we are not on track, what can we do to get you back on track? What are the blockers? Is it technical, is it resource driven? Then what can we do from, you know, an internal Rainforest perspective to help alleviate some of that resource burden on your end?

I think, you know, since we're still determining what that happy path is, and what truly makes a successful customer, it's really just staying really tight with your customers. Checking in even just to let them know, you know, you're still there. Like I think everyone here, we have a pretty technical buyer who doesn't necessarily love to chat on the phone with us, but, you know, you have to learn how to communicate with your customer and figure out the best way to get some sort of information from them.

Sometimes no news is good news, and that's great, but generally it's just trying to find that right level of communication so that you are on the same path with them, even if they're not necessarily, the chattiest of chatty, you know that they're successful.

Don: You're just not talking about the right stuff. You know, like why there's a mandate to refactor all the code. That gets them talking.

Chris: I can add on to that a little bit. I think that you're really in the know on that, and it's also about, like, some empathy there, right? Like, the customer's going through something, and what you can do is really try to infer what's going on with the customer, because they're not just going to come tell you, like, "hey, we're having a rough time, or I'm just having a bad day".

But what you can do is, like, see some of those activities happening, and then say, okay, they've had a lot of urgent support tickets opened. So, like, that tells me they have critical problems that they keep running into, but it wasn't just one. It was, like, one and then another, and then another, and then you look, and maybe there's some negative feedback, and then support says, "here's a satisfaction survey", and we're like, "let's look into that", and you read through the ticket and you're like, "well, the support agent knocked it out of the park."

The customer's still really upset about something. So maybe they're just kind of taking it out on support. So we can reach out and then talk to them about like, you had a bad experience. Like, that's something to talk about. And really start to just listen to what their experience was, so that you can learn how to not only improve your product, but also maybe how to help them.

Don: Yep, great. So let's recap and then we're going to open it up to questions and hopefully we'll have lots. So conversion, retention, really hard stuff, right? Lots of competition. High user expectations. Probably getting higher all the time, right? So this listen, learn, engage framework can help you kind of focus your activities. You have to adapt any tool kit, including this one to suit the needs of your specific business, your product, where you are in your evolution, right?

I wouldn't take it literally. I'd simply take it as tools. Figure out ways to apply them. You've heard some great anecdotes, right? Of three very different companies and how they've adopted some of these characteristics you know, to apply to their business.

Q&A

Privacy Implications of Contacting Users Based on Usage

Ryan: So particularly for, like, beta customers, they expect us to be monitoring how they're using it. So I don't think that there's really any privacy issue there. We can also spoof users' accounts, and I think that's pretty much expected today for SaaS tools in particular, but, you know, again, it depends on the type of customers you have.

I mean, if you're working with like a government agency or something, or HIPAA or whatever, there may be different issues, but for PagerDuty specifically, customers haven't really found it any sort of invasion of privacy or anything like that.

Chris: Yeah, to add, well, for us at least, what add-ons customers add, we know that. We have to know it because we're charging for it, right? So this isn't like a thing, but we might reach out to them and talk to them about what add-on they added, but we're not going to look at their data, right? Like, even if I found out that Heroku was spying on customers, I would personally get infuriated.

So that would just be like no way, but what we do, if a customer is having trouble, they open up a support ticket, they can give us permission to say, look at their code, or look at their data. So we can go in and much more proactively help them. So when they tell us that it's okay we will.

If we notice some general, like very observable from the outside patterns that look like they might be having trouble, we might reach out and say, "hey it doesn't look like everything is going so smooth over here, your application keeps failing".

We might reach out and talk to them, and at that point ask for permission. Under no circumstances are we going to violate that privacy or that trust, but there are definitely you know, things that are just like public. Not public, public but internally to our company. Like more common knowledge. What add ons have they added? What features are they actually turning on, turning off?

Maddie: And to Chris' point, you know, for our proof of concepts, they are paid proof of concepts. So our customers have invested in a trial period, and so they want to see it successful and they want us to teach them how to do QA more efficiently, because QA is hard and writing tests is hard. So if we can provide a way to make that easier and more efficient for them, and utilize less of their resources, they love that feedback.

Don: So I think your question, in the end, comes down to the difference between metadata and data.

Customers absolutely expect you to be looking at the metadata, which is frequency of use, volume metrics and the like, and they absolutely don't expect you to look at the data, unless you're asking for permission to do so.

And I think if you just have that difference kind of straight in your head, you'll be on the right side of the practice.

Aligning CSM & Sales During Quarterly Review

Maddie: Great question. So our relationship with Sales is extremely important to what we do. So we as a CSM team have made the decision not to really deal with dollars and cents so we try to keep money out of the conversation as much as possible. We're truly the ally for the customer, but with that being said, everybody's a salesperson right? Everybody who touches a customer at some point is trying to find some sort of up-sell opportunity, but what we do is, as we engage with customers, we will uncover that opportunity.

We qualify it through until the point where they say, how much does that cost? Then we say, great question. I got someone who can give you that answer. And from there we bring in an AE to actually come in and have the dollars and cents conversation. We stay on the phone. We stay throughout the entire engagement. So we will always be there to be the ally of the customer. We're always on their side, but we'll bring in sales to come in and be the ally of the company and have those tough kind of conversations and negotiations.

Chris: So I totally agree with everything you're saying. I think one of the things that's really important is having the ally for the customer, the advocate, and then the salesperson. The salesperson isn't going to help with that side of it, but when you have the person that's the ally for the customer, that's really looking out for them and their best interest and sometimes even right-sizing, that's really important and helps build trust.

So, you know, we're not trying to sell something immediately all the time. It's like, sometimes just keep them on the platform a long time, and they're going to grow and they're going to spend more. So when you can build that trust, they're going to be around a lot longer, and they're more apt to reach out to you if something isn't going well, if they have a good positive experience and they trust you.

Maddie:

There are profiles of people who are built to serve, versus built to sell, and a good CSM team is built to serve.

And if you expect them to sell, they're going to always leave dollars on the table. So bringing in a salesperson to come in and have that conversation is just in the best interest of the company in general.

Don: So I'll weigh in for a moment if only because I have had the benefit of talking to about 500 customer success organizations in the last four years. It is a religious debate and actually, you frame the debate, which is the mission of customer success can either be a service delivery function, right? Including incentives to facilitate opportunities to sell more, or can be a revenue generating function. In which case, at a minimum, closing the renewal transaction, and at a maximum, closing the expansion transactions, are actually within their responsibility.

So that choice is the heated debate in the profession, and most organizations have to eventually decide which of those two things is kind of their north star so to speak. There are many, many organizations that are revenue driven, which means they still provide a service, but in the end, their true measures internally are their ability to sell, post purchase. I'm not taking a side on the debate only to say that it's not 90/10 in terms of a skew. It's probably 60/40, 70/30. Where the 60 to 70 is the service provider mission, and the 30 to 40 is the revenue generating mission of the sample size that I've seen.

So there's no right answer, and by the way, if it's right for a business today, it's always possible that in two to three years time, it's no longer right, in which case, you change it. So how's that for not giving a definitive answer. There is no best.

Chris: So at Heroku, very broadly, we think of customer success as ourselves plus the sales team. They're going out and doing enterprise sales, and we rope ourselves in and we work closely with them on the post-sales aspect, because that's really more of the customer success work that we're doing, the post-sales aspects.

But we want to be so close with those sales people, making sure that they're selling the right things, because we're looking at retention, because we have a recurring revenue model, right? It's really important to retain customers. So if the sales team is selling the right thing, not the most things, the right thing, then that's a win later on that makes the job easier in the post-sales world.

So you can have, like, a little bit of a combination. There's sales people over there, and then there's, like, you know, support and solutions architects and CSMs that are doing really the customer success aspects and growing once the customer's landed. But if you bring them in and you sold them the world and they only needed a small chunk, they're going to be pissed off. They're going to know that they spent too much money, and it doesn't matter what you do.

You're just never going to be able to kind of right that relationship.

Maddie: I can touch on that. From when we first started our CSM team it was just the two of us. So there was me and another girl, my teammate, and we divided our customer base into an onboarding phase and a renewal phase, and I took on like the onboarding, and that was just like my role, is I would go in and onboard all our new customers, and I kind of got into that zone and she handled all the renewals.

We found as our customer base grew, and as the team began to scale, that we really didn't have any type of like, real relationship with our customers because there wasn't anyone like, concrete for them to reach out to. It was kind of just the two of us and we were just like one brain if you put us together. We were like the perfect power CSM. But separate, you know, we were only as strong as the other one.

So we have found that owning the customer throughout the entire journey is really key, and at Rainforest we start that relationship before the customer even becomes a customer, when they're still in the prospect phase. We go as far as trying to assign that relationship early: if an AE needs a CSM on what we call the pre-close call, that CSM is their CSM once they become a customer and moving forward, and it just establishes the fact that, you know, we care about them.

We want to understand their pains. We don't want to have them repeat themselves to a million different people, you know, about why they're becoming a Rainforest customer. We know that from the very beginning and then we understand it come renewal time, and it just gives you really like, in depth knowledge of your customers, and then they really appreciate that.

Ryan: In terms of, like, pre-sale things that we've learned, in terms of, like, applying resources or communicating with customers ineffectively: there was a period of time where we were giving a decent amount of attention to VSCs, or Very Small Customers, and we realized that that really wasn't increasing conversion rates. So at some point we just basically cut that off and they're completely self-serve. So you just gotta figure out what that threshold is, or if that even makes sense for your business.

Chris: Yeah, I don't have a specific anecdote where, like, this thing happened and we were like, whoa, we can't do that anymore, but it's one of those lessons that we learned along the way: developers don't want to be sold to, so we just talk to them, show them features, and offer them features. Show them what's available and let them adopt it. And then it's the people writing the check, the management, for enterprises all the way up through, like, the C-suite, that are going to talk to a salesperson.

They want to talk to a salesperson. So once we know what the developers are doing and then we engage at that level, we're sort of coming in from both angles, right? You have the developers using the platform, internally advocating for your product. I think most companies in this audience are going to have a developer base. Let them do that thing. Just make sure that they're happy, so that way when your sales team starts talking to the C-suite, they're like, "oh, let me bring in my technical team", and the technical team comes in and they're like, "wait, you mean we get to port everything onto Heroku?" Like, the deal is done right then. Like, it's sold.

The developers come in to vet it, and it's like, we don't need to vet this. We already use it. We just weren't telling you.

Ryan: Yeah, 100%. I mean, we have a developer customer base. Land and expand is the key. You'll see one team start with PagerDuty and then that just spreads throughout the org, and eventually it gets to be a big enough line item where then you're going to be talking to a director, you know, C-suite level person, and it makes the decision a lot easier when you have hundreds or thousands of advocates within the company.

Chris: And it's also impressive to the management types when they're like, oh yeah, we're thinking about this project and we're like hold on, let us tell you how all the ways you're using the product right now, that you just didn't know about.

Ryan: Yeah. It varies by customer for us, but I mean, like, we consistently see by cohort that the number of users in an account will consistently rise. It depends on the customer though. Some customers are more sensitive to when they want to have, like, procurement involved and things like that. Sometimes it's associated with how much money they can put on their credit card and get reimbursed for. Then you see procurement going, why are you spending 10K on your personal credit card for this tool? What is it?

Chris: Yeah, it's a little bit complex for us. Being part of SF.com, sometimes this requires us to go in and work with the SF.com AE's and look for customers that are Salesforce customers and are interested in, or maybe adopting, some Heroku, and we want to expand there. I think the critical thing that we look for is, like, what's the project? What's the task, the activity that they want to do to try something out?

So if it's a really large enterprise, we're not going to port all of their applications onto Heroku on day one. Very rarely. Usually what it is, is like, they have a new application, and they're like, let's put that on Heroku and make it successful. Wait until your team starts raving about it, and then the sales team will go in at renewal time and say, hey, that's going well.

That app was successful, it's growing, it's still getting traffic. What else do you have? That team's outpacing all the other teams because of the value that Heroku has with the speed of deployment and what not. Let's talk about the next project. There's not a specific number but once you get a couple of critical projects onto the platform, then you talk to the right person in the company and they're like, okay, we're going to go all in on Heroku.

And this can take anywhere from months to years. Like, we landed this past year a really big deal that was really long in the making. It just started off with a single, kind of trivial app that the customer was using, and then years later, they just kept seeing more and more value in putting everything there. They finally decided this past year to go all in. They bought a really big package and are in the process of porting all of their applications over.

Ryan: Yeah, I mean there's nothing preventing you. If you know that there's a company that has a very small deployment, or a few small deployments, but you know that their user base is much larger, that's what the account manager should be, you know, researching to figure out: what are the good opportunities? Who are the next, like, large companies that they can find within their customer base? And go out and proactively reach out, or start to talk to more teams and help build that base of users that really can support them in a much larger sale.

Moving POC Goalposts to Satisfy Customer

Maddie: It's a tough question, and something we face every single minute of every day. So what we do, and it's really important early on for us, is establish who the project team is. So, you know, sales obviously sells into an executive sponsor who's actually writing checks, but QA isn't particularly sexy or fun, and someone is generally tasked with doing QA, or there is a QA person who generally kind of falls at the bottom of a totem pole somewhere, but what we do is empower that person.

So we get it. We understand that QA isn't going to be anyone's favorite thing to do in the world, but if we can make it easier and more efficient, we make the person we call our project manager look like a rock star. So he can go to his VP of Engineering and say, look, I did all this. You know, I have total coverage in 30 minutes, and you can deploy your code and you know it's quality.

So for us, it's all about empowering that one person so that he can go in and report back and then in terms of going to your scoping question, it's tough right? Like you kind of count on them to give you the information that you need to scope out a project, and the minute they get their hands into it, they're like, "oh no. We actually want to focus on this part of the application, or you know, we actually care about this new feature that we just released."

But the job of the CSM is to direct them back to their goal. Great, we are going to get to that, you know, once you become a full time customer, we'll be able to cover everything, but before we close this deal we set out this particular goal that you needed to see in order to like realize the success of Rainforest.

So let's really focus in there and then once you have you know, established this relationship, we will make sure that, that becomes priority number two.

Don: I'd say there's a big difference between you choosing how to spend and optimize your time with your assigned customers, versus the things you might have to do to win a deal, which is a combination of your time and some product commitment. Those are very different engagement models, right? Because now you're talking about a road map agreement, and that's when I would say, you know, you want to be much more thoughtful. Because obviously there's a huge opportunity cost to building those features if they're only for that customer.

Ryan: Yeah so, at PagerDuty it's really interesting. Someone could be using our product a lot, and that's a really good thing, because they're using it on a lot of problems, but it could also mean that, you know, they're getting annoyed because they don't have all their thresholds set properly, and they're just getting paged and woken up all the time. That was one of the first things that we figured out when we tried to figure out what successful looks like.

Thankfully churn is fairly low for us, but the things that we do to engage to try to prevent it are: one, look at how they're using the product. Talk to them about the business problems that they're trying to solve by using PagerDuty, and try to figure out how PagerDuty can help solve those problems for them. That's really the whole goal and the point of using our product, to make their business run more efficiently.

So we tag team with the account manager, and we'll just proactively engage and try to do what we can to save the account. You know, you obviously want to try to get in there as early as possible. Our big thing that we harp on for customer success is to get them fully utilizing the product as quickly as possible across the entire organization, so that we're handling that at the beginning of the customer life cycle, versus when it's a lagging indicator and they're already likely to churn and they're already looking at competitors and things like that.

Maddie: Similar to that, it's change in usage. So it's really important for us to understand what usage looks like at the beginning, and then, you know, just because they're using it a lot doesn't mean they're happy. And then if all of a sudden they drop to zero, or they're not using it, or maybe they're on a regular schedule, using it a couple of times a week, and all of a sudden usage spikes, you know, maybe there's been a change in where quality falls in terms of business goals. So those are our big red flags where we should jump on and figure out what's going on with the customer.
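(A minimal sketch of the kind of change-in-usage check being described here, comparing recent activity against a customer's own baseline to flag spikes and drops. The window sizes, thresholds, and example data are hypothetical, not Rainforest's actual model.)

```python
# Minimal sketch: flagging spikes or drops in usage relative to a customer's
# own baseline. Window sizes and thresholds are hypothetical illustrations.
from statistics import mean

def usage_change_flag(weekly_usage, baseline_weeks=8, recent_weeks=2,
                      spike_ratio=2.0, drop_ratio=0.5):
    """Compare recent usage to the customer's baseline and return a flag."""
    if len(weekly_usage) < baseline_weeks + recent_weeks:
        return "insufficient_history"
    baseline = mean(weekly_usage[-(baseline_weeks + recent_weeks):-recent_weeks])
    recent = mean(weekly_usage[-recent_weeks:])
    if baseline == 0:
        return "no_baseline_usage"
    ratio = recent / baseline
    if ratio >= spike_ratio:
        return "usage_spike"   # worth a check-in: did something change on their side?
    if ratio <= drop_ratio:
        return "usage_drop"    # worth a check-in: are they disengaging?
    return "steady"

# Example: ten weeks of test runs per week for one hypothetical account.
weekly_test_runs = [40, 38, 42, 41, 39, 40, 43, 41, 5, 3]
print(usage_change_flag(weekly_test_runs))  # -> "usage_drop"
```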

Chris: Yeah, I think early warning systems are really hard to put together properly. So we've been trying for a while to really flesh that out. Things that we look for are volume of tickets, which in a way is a change of usage, but if there's, like, a spike in tickets, maybe they're having a bad time. So we're going to investigate that, and what they're saying in the tickets.

So the support team, if they notice that a customer is just really angry about something, or upset about something, or just in general has been having a rough time with whatever it is, they keep hitting bugs in a certain feature or something, they're going to kick that up to a CSM or to a customer advocate, to do an outreach and just kind of do a check-in, even if it's to give the customer some warm fuzzies like, "hey, I know you've had a hard time. We're here, we care about you. We are fixing these things." It's not just the support agent; you have another outreach from, like, somebody else. And then, yeah, all the usage changes on, like, databases and stuff like that.

And for us, if a customer repoints their domain name away, that's game over, right? Like, they repoint the DNS; that app's not on our platform anymore. Even if they haven't spun everything down. So at that point, there's not much that we're going to be able to do to save it. They've already made up their mind. So we're looking at things that come before that, but it's really good to identify, I think, some of those end game things.

Ryan: Yeah, in terms of like proactively reaching out as well, like when you're capturing feedback with CSAT and NPS, if someone talks about, "oh, I have a product issue, or I want to see this feature", it's really good to get your product managers directly engaged with those customers, and say, "hey this is actually on our roadmap. I'd like to hear what you think about this feature" and make them feel empowered to actually like shape what your product is actually going to look like, even if it doesn't happen, it just gives them warm fuzzies that you listen to them and you know, you're trying to do what's right.

Chris: Yeah, and for us there's, like, a second, kind of weird kind of attrition. Because we host a broad bunch of applications, most customers aren't just going to have a single application and that's it. They're going to have multiple applications. They bring one, two, three over, and then all of a sudden they just stop, but we know that they're growing and expanding. They're putting them someplace else. So maybe it's too much work to port those existing apps off, but they're not bringing us the new business.

So that's something we want to look out for. And we do that through just like, constant engagement. Like what's going on? Is the business growing? We should know these things just by curating a relationship. That doesn't always have to be just strictly a sales end point. It's just know who your customers are.

Don: So I would assert that the vast majority of churn happens because the customer never achieved some level of full, sustained adoption, and it's much less common that they reach that level and then somehow regress and churn as a result. It's almost always because they just never got up to the top of that value curve. So one of the ways you could think about solving for it is to pay a lot of attention to the internal change the customer has to go through to be successful with your product.

What is it? Do they have to redesign a business process? Integrate your software? Learn a new way of doing things? Give people skills they never had before?

Whatever the nature of the change is, the customer's ability to change is probably the biggest inhibitor of adoption of your product, versus you not having the right features, or a shitty UI.

Right, so I would really work on decoding that because once they go through that change management to the point where they've kind of reached that plateau of full utilization, it's really hard to mess it up after that, in relative terms to them never getting to that point.

Chris: Yeah and what I'd add to that, especially with bigger companies, almost definitely when you get into the like enterprise, this is going to be really important because Heroku's like great right? It's super easy to deploy. You can deploy it really fast. You can deploy it multiple times a day. If you go up to an enterprise that hasn't adopted a rapid type of deployment, start talking to them about deploying multiple times a day when they deploy once a quarter, that scares the crap out of them.

So like, you have to manage this change. Even internally, it seems to you like, "it's positive, why wouldn't you want this?" It seems foreign to them and it's scary, so walking them through it is going to be you know, to your benefit.

Ryan: Sure, I mean for PagerDuty, it's all about resolving problems as quickly as possible, right? Downtime costs some of our customers tens or hundreds or thousands of dollars a minute. So the faster that they're able to resolve those issues, there's a direct ROI on that. So you look at our analytics, and we can show that when you started with PagerDuty, you know, it took you this long to resolve. That amounted to this dollar figure, and we can show that trending down over time, and that's extremely valuable.

Maddie: So at Rainforest we deal with a lot of different sized companies. Some of them have QA some of them don't, and so for our customers where QA lives in the hands of the developers, we found that developers didn't want to have to break their workflow and go into our UI and write Rainforest tests. So we developed what we call the developer experience, where the developers could actually write tests in their own text editor. So they never have to break workflow.

And then we found that they automatically adopted that. So for us, it's all about fitting into a workflow and building efficiency. So integrations are also super key to Rainforest's success.

Chris: Yeah, so this is quite literally, for Heroku, the job of our CSA team, or Customer Solutions Architects. They also play the role of CSM, account manager, and a few other technical roles of that nature, but they're very much going to go in and kind of white glove, hand hold the customer through what they're doing and what their process is. A specific example that comes to mind is, some number of years ago, Macy's started to port some of their front end onto Heroku.

It was a small project, a small piece of it at first and they did that. Then they ran it through Black Friday, Cyber Monday, all the way through Christmas, and this is a critical time for anybody doing e-commerce stuff, right? Like they do a majority of their revenue during this time frame. Up time is critical. Stability, speed, user experience, and so the CSA just worked with the customer through all of this.

We sat down with them and identified, what does success look like for you on this project? And then we followed up with them afterwards, and it was successful, and as a result, they started bringing us even more business and more of their infrastructure over the next few years. And each year was, like, more of the same. Like, okay, here are just more critical things. Like, we're going to sit down and measure this. Make sure it's successful, by the customer's terms.

Don: Really appreciate you turning out and wonderful questions. Thank you very much. I know the panelists are glad that they got to respond to all sorts of different questions about how they do their work, and I'm glad you could all come. I hope you found it of value.
