June 21, 2017
Meteor Night: Meteor 1.5, Meteor + TextExpander, LuminPDF At Scale
Meteor Night is Meteor's flagship event. This event featured great talks from Meteor Engineers, NitroLabs, and Smile Software (TextExpander)...
I love questions. And yeah, I have a few slides, and my hope is that they provoke a conversation. But I absolutely want to share with you how we see organizations investing in customer success, from the getting-started mode to some of the key decisions that you'll face down the road, as that function matures and grows.
Mostly I'm here to talk about best practices. So at the end I'll make one shameless pitch for my own company's product, which might enable your customer success organization, but otherwise we're really not talking about technology per se, we're talking about best practices, and an approach, and a methodology, for how this function can be birthed in your respective organizations to address the business problem that Dana's talking about, right?
Which is hang on to those hard-won customers and presumably, grow them over time.
I'll speak in general terms, making a key assumption that you all sell to businesses. Beyond that, there are probably some interesting subtleties, from companies that sell only online in a completely self-service model, in which case you monetize purely by people serving themselves, to other businesses where maybe a lot of your monetization is driven by very high-value customers, even where you might have a long tail of free users or low-tier users and the like.
Within that context I'll probably make some key assumptions that there is a customer success team that exists in your organization or you're contemplating putting in place. Okay? I see a couple right here. So as context, hopefully that'll set the stage, but first and foremost is please interact. I'd love to hear lots of questions and comments.
What we're going to cover tonight is a few topics. The first is, "What is even the purpose of having such a function as customer success? What does it exist to do?" We'll talk about Customer Success 101. "How does this team typically get birthed and why? What do they do first?" 201, which is, "What's this team going to evolve to do in time as it matures and grows up?"
And there's some interesting decision points that you'll probably face in your business, if you haven't already, with respect to how to do that. We'll talk about resources. Things you can turn to, both as a community of peers, and blogs, and frameworks, and all the different tools that help you figure out how to do customer success as a way that's specific for your business by borrowing from the best ideas from others.
I'll expose you to some of the resources that are out there. This is such a long topic, we could spend hours debating "What does customer success exist to do?" But I would assert that the first-and-foremost definition of the role is somebody who exists to ensure successful adoption. And the implicit assumption is successful adoption equals all the good outcomes, from retention to renewal, to expansion, to satisfaction, to advocacy.
It's very hard to imagine that somebody who doesn't successfully adopt your product is otherwise going to be happy and of high value to you. That might be the essence of the role or the function, the one function that has this accountability and this responsibility, specifically.
Everybody else probably cares, but this is the job.
If we think about how customer success gets birthed, usually the genesis of the role is that nobody else wants to do the job anymore. The CEO is sick of dealing with escalations. The sales team is sick of trying to hold the customer's hand post-acquisition, because it has nothing to do with their incentives. I'm looking at you.
The support organization is like triaging cases, but there's something symptomatic of a bigger problem, that it's not within their means to solve. The product team doesn't want to talk to the customer because they're busy just keeping the lights on, driving the release process, and so on and so forth.
At some point in the journey everybody throws up their hands and says, "You know what? We can't crowdsource managing a customer escalation, and thus we need to hire our first professional firefighter. We need to be able to give this to somebody where this is the thing they do every single day."
That's the most common genesis of a customer success function. Now I have companies in my customer base that decided to invest in customer success before they had their first customer, which is a very enlightened way of thinking about it. But it's also atypical.
Usually the things we do to part with our money, as startups, are in response to acute pain. And so when this pain gets big enough, people tend to decide to staff the role. One of the first things that people, who are now professional firefighters, are trying to figure out is, "How do I even find the red flag customers in my portfolio? What are the things I can turn to to even know that somebody needs my love and my attention right now?"
Even in the context of problem solving or triaging issues, I would assert, generally you start with the effects because they're the easiest thing to measure.
Be it usage or survey responses, or support tickets, or even just retention data itself. And getting your arms around that is really important because it starts to inform where and how the problem is manifesting in your organization that led you to want to have this professional firefighter in the first place.
But those are the effects, not the causes. Usage isn't a cause. It's an effect. It's an effect of something else. And so, as organizations get better at this, they begin to turn their attention to the causes. These things are going to vary tremendously from your organization to the next.
For example, at BlueNose we run our own product internally, and after a year or more of doing that, we've completely transformed our understanding of what constitutes an at-risk account, to the point where we're probably now scoring our customers' health on the basis of causal risk factors versus the effects that were easier for us to measure initially.
We started by measuring adoption of our own products, and we haven't stopped doing that, but now our customer health score is very much driven by risk factors that are very specific to our business, which, for the next business down the block, may have no meaning at all. But, of course, there are product issues that are causal. So understanding the relationship between product capability, product usability, and the effect they have on these outcomes is important to establish.
Start thinking about who you're selling to and who you're targeting. What is the basis of mismatch between what your product is best at doing versus what you've actually acquired through the door?
Interestingly, just today, one of my favorite bloggers, Joel York, published a post on his blog, Chaotic Flow (chaotic-flow.com), and he talked about this very issue. His assertion is that the hallmark of a successful SaaS company is the ability to maintain discipline as to targeting.
In other words, ensure that the very best customers come through the front door to mitigate the types of issues that you're going to experience post-acquisition. Because when a customer mismatches what you're best at doing, you pay for that later, rather than at the moment of acquisition. And then there's onboarding, because the first day, or week, or month, or perhaps 90 days, depending on the nature of the product, is really a seminal period with respect to setting the trajectory of what's going to happen next.
Most organizations that get really good at root-cause analysis, which is to say, "We had all these escalations, and when we unpacked them, what went wrong?", will often attribute the onboarding experience as the number one driver of customer dissatisfaction. And they'll start to inspect what really happened there and take a much more aggressive role with respect to orchestrating it, understanding what that pattern of goodness is, so as to be able to measure how and when a customer falls off that path.
Another attribute of just doing Customer Success 101, at a basic level, is understanding it's a team sport. It's very, very unusual for the source of a customer issue to originate in the customer success team itself. Most often, if you pardon the sports metaphor, they're quarterbacking issues that originated somewhere else, for which they're engaging their colleagues to help solve those problems together.
If it's a mismatched sale, someone has to go back and figure out, "What are we going to do to fix how we target and acquire customers, and what's the governance process there?" Or, if it was an unsuccessful onboarding, then to the extent that's a separate function, "How do we engage our colleagues to get that better?" To the extent it's an unresolved support issue, and it's been aging for a while, and it's of high severity, then, "How do we get our colleagues in customer support to deal with that?" and so on and so forth.
If it's a missing product feature or usability issue, how do we engage the product team?
Most of the issues that a customer success person is going to triage are going to have some interdependency with some other colleague.
It's inherently a team sport, and it's very often not the case that you can fix that issue by yourself when you play this role. So therefore it's really important to codify what do we do when we encounter these situations.
Because everybody else around the customer success team needs to understand what they're expected to do when this occurs, as opposed to saying, "Well, gosh, we're going to deal with this ad hoc every time it occurs." That's a lot of friction internally in the organization. You tend to drop balls as a result.
It's incumbent on a customer success team to say, "Look, our playbook for managing escalation is that we're going to triage the issue, and we're going to communicate what we think the issue is so as to involve others." And they need to know what their role is, even to the point of what are they expected to do in response to when that red flag gets raised. I can't underscore this enough.
This is about efficient management of customer escalations, keeping in mind that this is the firefighting mode for customer success: you're not ahead of these red flags. You're triaging them as the first firefighter or the first generation of firefighters. It's really important to communicate and codify how your colleagues are going to support you when these situations arise.
Define that playbook, get your CEO to buy off on it and hold people accountable to their participation when it occurs. Really important discipline to establish early.
Alright, so let's move on to 201. You have successfully hired a firefighting team, from one heroic person to, perhaps, an entire team of individuals. And the complaint that they're going to start voicing is, "We just can never get ahead of this reactive mode. We need to get to proactive mode." So you start thinking about, "Well, what is it they can do to better sense these moments of customer opportunity, to either eliminate these issues or thwart them? Or, conversely, just provide a fundamentally happier path for the customer to be onboarded and adopt the product?"
Can't reinforce enough the single biggest opportunity to prevent customer escalations down the road is to master that onboarding and first-use experience. And it's going to vary tremendously based on the products that you sell, which could be literally from minutes to days and weeks, owing to your specific product and how a customer gets to first-use and full adoption. Huge payback to do that well.
I built a big data team for a company that had 120 million users, of which 90% were free and 10% were paid. At the time it was one of the largest freemium businesses in the world. And it was software that you would install on a laptop. We looked at the data with respect to how and when people abandoned the installation process from the moment they click the download button on a website, through to moving the bits around the world over a CDN, to beginning to install it on a laptop.
Ultimately that was done when that piece of software phoned home and said, "Hey, I'm installed. Are there any updates I can get now?" Which was potentially many minutes subsequent to hitting the green button, and we actually found meaningful percentages of users that would abandon that process along the way, and we removed one of the installation steps of seven and the success rate went up measurably, like by 10% or more.
Then we skinnied the software package down by about 20%, meaning fewer bits to deliver to that user, so they spent less time waiting. And we got another 10% or 15% success rate. So think about that.
At the top of our funnel, we got 25% more users to complete an installation. And there was no monetization until that occurred. That onboarding experience just lasted a few short minutes, but paying close attention to how to optimize that, for us, yielded millions of dollars in revenue given the scale at which that business was operating.
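The compounding in those numbers can be sketched as a simple funnel calculation. This is a minimal illustration, not the actual data from that business: the step rates are hypothetical, and the two lifts roughly mirror the percentages mentioned above.

```python
# Hedged sketch: how two modest funnel improvements compound.
# All numbers here are illustrative assumptions, not the real data.

def completion_rate(step_rates):
    """Overall funnel completion is the product of per-step rates."""
    rate = 1.0
    for r in step_rates:
        rate *= r
    return rate

# Hypothetical baseline: download click -> bits delivered -> install finished
baseline = completion_rate([0.95, 0.90, 0.85])

# ~10% lift from removing an installation step, then ~13% more from a
# smaller package (less time waiting on the CDN)
after_fixes = baseline * 1.10 * 1.13

print(f"baseline completion:  {baseline:.1%}")
print(f"after both fixes:     {after_fixes:.1%}")
print(f"relative improvement: {after_fixes / baseline - 1:.0%}")  # ~24%, i.e. the ~25% quoted
```

The point of the sketch is that the lifts multiply: two small optimizations at the top of the funnel compound into a meaningfully larger pool of monetizable users.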
So next thing to think about is, "What are the signals that you can listen for?"
The definition of a signal is: it's not an event.
In other words, it's something that you can listen for that you're probably getting continuously from each and every user or customer account. Which probably means, first and foremost, usage and very few other types of data will form a signal in the sense that you're getting this continuous thing.
In contrast to a signal, think about a survey. You might run surveys from time to time, and the response rate's very low, and the result is, yes, you get a survey, and it's actionable, but it's not a signal. Or take support tickets: some people are very happy and never engage your support function, in which case when you turn to support tickets and you see no tickets, there's no signal.
So there's lots of false positives and false negatives in that data. Yes, you absolutely want to pay attention to a support ticket with respect to the effect it could have on a relationship. But it's an event, not a signal.
Probably usage is this primary signal for which you want to constantly be listening and understanding. How is it varying over time related to the onboarding experience?
If the onboarding experience is the best opportunity to get to Success 201, then the next best opportunity is probably to monitor very closely what's meant to happen next. What is this usage pattern expected to be in that first week, month, 90 days? Because if it's growing, and that customer reaches some sustained level of adoption, it's going to be really, really hard for them to abandon your product thereafter.
And so, in many respects, in that early period of adoption, you're looking for pretty strong usage growth. But you're going to need to understand, "What is that pattern for you?" in order to measure any customer against that blueprint.
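To make the blueprint idea concrete, here's a minimal sketch of measuring a customer against an expected early-usage pattern. Everything here, the weekly targets, the tolerance, and the function name, is a hypothetical illustration rather than any specific product's model.

```python
# Hedged sketch: flag accounts whose early usage falls off an expected
# onboarding "blueprint". Thresholds and shapes are illustrative.

EXPECTED_WEEKLY_ACTIONS = [5, 12, 25, 40]  # assumed blueprint for weeks 1-4

def off_blueprint(actual_weekly_actions, tolerance=0.5):
    """Return the first week where usage drops below tolerance * expected,
    or None if the account is tracking the blueprint so far."""
    for week, (actual, expected) in enumerate(
            zip(actual_weekly_actions, EXPECTED_WEEKLY_ACTIONS), start=1):
        if actual < tolerance * expected:
            return week
    return None

print(off_blueprint([6, 14, 30, 45]))  # healthy adoption: no flag
print(off_blueprint([6, 4, 2, 0]))     # stalls in week 2: flag it
```

A customer success person would treat the returned week as a red flag: usage growth stalled during the seminal early period, so reach out now rather than at renewal time.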
Then finally there are, absolutely, events pertinent to any given customer that are actionable. A bad survey is a reason to engage. By the way, a great survey is a reason to engage because that's potentially someone who's going to be a reference or a case study, or an advocate. But it's just an event.
Similarly, you want to think about support tickets as an event and, potentially, M&A, changes in people's job title, as you begin to monetize your accounts and you get from a single small paying developer to an entire organization. Somebody in a leadership role in that customer organization might leave, in which case, you may be faced with the challenge of reselling the deal, and so that's an event, not a signal. But that event can be very meaningful for you potentially.
As you work through this framework in terms of how you look at data and the role it plays to help you understand your customers, I would constantly encourage you to think about things that are signals versus things that are events, both of which are highly actionable, but there is a difference.
One of the most common challenges that we see is getting beyond the one, two, or three professional firefighters to defining the long-term purpose of the customer success function, particularly as it evolves and grows.
The fork in the road typically is defining its purpose as either being a revenue-generating function or a service-delivering function.
The way to distinguish between the two, if you're wondering: is the organization primarily incented to deliver revenue as an outcome, including to the point of consummating a transaction? If so, they're probably revenue-driven, in which case they sit alongside the revenue-generating actors in acquisition, but they're very commercially focused.
Whereas, a service-driven organization might still be measured by key metrics, such as retention and renewal, and maybe even have incentive compensation tied to the ability to reach those metrics. But the primary mindset is not to conduct a transaction so much as to deliver a service, and the purpose of the service is to remove the barriers and inhibitors to successful adoption.
Look closely at the incentives that the customer success organization is being given, and to the extent that it's not clear which of these two roads they're going down, it's time to have a fresh look at those incentives, because they create two very different types of organizations. Neither is good or bad, by the way. They're just dependent on the needs of your business.
But some organizations, by not clarifying the purpose, conflate lots of things. In which case the people in customer success aren't really clear as to their mission.
In many respects this decision isn't for the customer success team to make for itself, so much as it is for somebody in a leadership role to decide with and for them.
I know this one's going to be near and dear to your hearts, because I'm pretty sure all of you operate businesses where somebody can start with something free or at very, very low cost and scale, and, with success, grow into a very, very large customer operating at very high scale.
And so one of the challenges for a customer success organization is to start thinking about tiering customers and defining what is the service experience that accompanies each tier.
Not from a contractual obligation point of view so much as, "What is it we're going to invest in that customer segment that's different?"
The extreme example would be you may have a million-dollar-a-month customer operating at tremendous scale and another customer who's operating with a developer license. It's free, and they're just using the product in some test mode. And so would you give them the same level of service? Well, of course not. So tiering out your customer base is an incredibly important milestone that typically confronts a success organization some time in the one-to-two-year, post-launch timescale.
Thanks to growth, you can no longer service everybody with that same white glove service. It just doesn't scale in terms of resources and staff capacity, so you have to start taking the hard decisions as to what accompanies each.
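As a rough illustration of what tiering can look like in practice, here's a sketch of assigning a service model by revenue. The cutoffs and tier descriptions are invented for the example; every business would pick its own.

```python
# Hedged sketch: mapping customers to service tiers by monthly revenue.
# Cutoffs and tier labels are illustrative assumptions.

def service_tier(monthly_revenue_usd):
    if monthly_revenue_usd >= 10_000:
        return "tier-1: named CSM, white-glove"     # high human touch
    if monthly_revenue_usd >= 500:
        return "tier-2: pooled CSM plus campaigns"  # mix of people and programs
    if monthly_revenue_usd > 0:
        return "tier-3: programmatic touches"       # email/in-app nurture only
    return "free: self-serve resources"             # docs, community, onboarding emails

for revenue in (1_000_000, 2_000, 0):
    print(revenue, "->", service_tier(revenue))
```

The code is trivial; the hard part is the organizational decision behind each cutoff, which is exactly the debate described above.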
There's a lot of talk in the customer success community about this right now because it's a natural evolution. Again, if you think about the desire to hire a firefighter, most often the justification for doing so is to protect the revenue that comes from tier one customers. So we can't lose them because they're so valuable to us. Therefore, that's where the firefighters get to live.
Then you start realizing, "Well, gosh, we still have these lower-tier customers, and they're very valuable to us. In fact, most high-value customers came up through that journey, so let's do something to make sure they continue on that journey."
Somebody at Zendesk told me that their largest customer, which comprises 6,500 Zendesk seats, started with five. I'm like, "Whoa, that makes it probably really important to think of this as a customer journey that takes into account very pleasing service experiences at each tier."
But in all likelihood, the degree of human touch is going to be very high in the highest tier, and very low in the lowest, and some mix between campaigns and programmatic touches in the middle tiers. Plus people in certain circumstances. That's going to cause profound internal debate to the extent that you're sensing the need to have that conversation in your organizations.
I would argue that it's not like wine. This decision does not age gracefully. It's really important to get ahead of this and drive alignment. There is no doubt that every C-Suite member of your organization is going to weigh in with an opinion. And that's really hard to get to consensus, to take a decision as to "what this means," but the alternative of not deciding is far worse.
There's no perfect structure, but to simply have chaos prevail, where your very high-value customers get the same service as low, is a really inefficient way to operate a business, and, unfortunately, there will be high-value customers who get neglected to the extent that they don't receive that prioritized treatment.
You'll know that your customer-success function is growing up when you feel the tension to address this question because it's a mirror of the customer base that you've succeeded in growing and creating.
Let's imagine that you got good at Customer Success 101. You graduated to 201, and you ticked all those boxes. So what's the case to invest? What are the types of benefit drivers that you can start to lay claim to by having this function staffed? There's obviously the clear monetary benefits of renewals and expansion and upsell.
Interestingly, the thing that I've seen CEOs get most excited about is to look at the customer base as an asset that can be used to affect customer acquisition. To the extent your CEO is very sales-minded, they'll probably find that to hold very high appeal, which is, "How do we make our customers so happy that they become advocates, which turbocharges our acquisition program?" Because advocacy, I believe, based on studies, yields four-to-five times the sales revenue of any other alternative channel to acquire a customer.
In some organizations that really excites executives to invest far beyond the revenue, because perhaps you don't even have a churn issue, or it's not a burning platform. In which case, there's still a case to invest in customer success to unlock that advocacy driver.
Think through the types of benefits that really motivate your organization that you, in customer success, can lay claim to. Because it doesn't have to be strictly measured as, "We're the folks that prevent churn."
In fact, that's probably not rewarding for anybody, because in the end, that feels like a very calculated financial decision almost like writing an insurance policy. And there's so much more opportunity in this function to deliver beyond satisfying that as a minimum. There's lots of resources out there. One thing that we know for certain is anybody in customer success is very eager to network with their peers because this discipline, this profession, is so new. And there's so much variety for how you approach it that people are very hungry to connect to others.
My company, BlueNose, has done a lot to try to play a role in facilitating that. We publish a lot of blogs, none of which are about our product per se. They're more about best practices. We have guest bloggers. We have an online community, e-books, frameworks, white papers and so forth.
I very much encourage you to go to our website and take full advantage of those resources. It's a very robust collection of stuff. There's lots of bloggers that write about customer retention and customer success, either very directly, or indirectly, sometimes through a strict financial lens, and in other cases, much more so around the function of customer success and the role it should play in the organization.
Here's some examples, and they all have a very different take, so I think they're each well worth reading for their own reasons. And there is a local customer success meetup, so if you go to Meetup.com and you look for customer success as the title, there is one that happens monthly here in San Francisco, and it's primarily comprised of young-to-midsize SaaS and infrastructure startups.
We spent a lot of time trying to put this into a strategic framework, so again, this is nothing about the product we sell, but much more a way for somebody, in their leadership role of customer success to start to help their organization, think through the capabilities that they're going to build in time.
Like Customer Success 101, it implies a journey, inasmuch as you're going to start on the left, typically building the foundation. But ultimately, these 11 pillars are the things that a customer success function can and should grow into being capable of, even though that might be a multi-year journey.
We'll often work with our customers and they'll assess themselves on the basis of this framework. They might say, "Well, we're kind of good at cross-sell and upsell, but we'd like to get better." OK, well, that's going to inform the next year's strategy in terms of whatever gets invested, which could be people, systems, initiatives, whatever.
So think of it as a way to distill a road map for your organization, but to always be thinking about building these as capabilities somewhere in the organization either inside that customer-success department, or in other codependent teams, because if you're going to kind of maximize the potential of your customer base, you'll ultimately want to get good at all 11 of these, but in time.
We've written extensively about this framework. It's a really good e-book. Again, nothing about product, all about best practice and strategy, and this gets a lot of attention in the C-Suite in terms of helping executives understand, "What is the opportunity to invest?" and how to think about sequencing it.
I do have one shameless product plug, which is conceptually what my company's product does. Three very simple things, we pull in customer-related data from all the silos in which it lives, be it usage, support tickets, billing information, survey responses, CRM, anything you could imagine. We populate it into a customer data warehouse or customer 360 as the case may be. We provide a lot of analytics to try to make sense of the data from very basic stuff like metrics, which still have great business value, to a health scoring engine to even predictive algorithms.
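As an illustration of what a health scoring engine might combine, here's a minimal sketch that blends a continuous usage signal with discrete events. The weights, inputs, and scale are assumptions made for the example, not BlueNose's actual scoring model.

```python
# Hedged sketch: a weighted customer health score combining a continuous
# usage signal with event data. All weights are illustrative assumptions.

def health_score(usage_trend, open_severe_tickets, last_nps):
    """Return 0-100; higher is healthier.

    usage_trend:         week-over-week usage change, e.g. 0.1 = +10%
    open_severe_tickets: count of aging high-severity support tickets
    last_nps:            most recent NPS-style response, -100..100, or None
    """
    score = 50.0
    # Usage dominates the score, clamped so one spike can't swamp everything.
    score += max(-25.0, min(25.0, usage_trend * 100))
    # Events subtract: each aging severe ticket is a risk factor.
    score -= 10.0 * open_severe_tickets
    # Surveys are sparse; weight them lightly when present.
    if last_nps is not None:
        score += last_nps * 0.15
    return max(0.0, min(100.0, score))

print(health_score(usage_trend=0.2, open_severe_tickets=0, last_nps=60))
print(health_score(usage_trend=-0.3, open_severe_tickets=2, last_nps=None))
```

The design choice worth noting is the one from the talk: the continuous signal (usage) carries the most weight, while sparse events (tickets, surveys) adjust it, rather than the other way around.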
In the end, all of this data and all of these insights are intended to cause customer engagement to happen in that more proactive manner. Both for the early warning signs that would cause somebody in customer success to know they need to reach out to that customer now, versus waiting weeks or months later when that renewal conversation is meant to happen, or they just churn away because they're on some monthly plan, as well as for the more proactive interactions around capturing upsell opportunities and the like.
In our world, we're really trying to affect customer engagement in two domains. One is mobilizing the people that do customer success so their daily work is much more purposeful and prioritized alongside thinking about campaigns. So what can we use this data for that would cause us to engage an individual programmatically? A developer's got a new account, but hasn't logged in and done anything in seven days, great.
Well, there's a signal available that could cause them to get a specific type of nurture email, which might be very different from the developer who has ripped through that whole provisioning and onboarding process and is now using your service to its full extent. In which case, there's a monetization opportunity, or a referral or advocacy opportunity, in which case, they could become part of such a campaign.
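The campaign-routing logic described above might be sketched like this. The field names, the seven-day threshold, and the campaign labels are all hypothetical, simply echoing the scenario in the talk.

```python
# Hedged sketch: routing accounts into programmatic campaigns from usage
# signals. Thresholds and campaign labels are illustrative assumptions.

def pick_campaign(days_since_signup, days_since_last_login, fully_onboarded):
    """Return a campaign label for this account, or None for no touch."""
    if days_since_signup >= 7 and days_since_last_login >= 7:
        # Signed up but went quiet: the nurture case from the talk.
        return "nurture: getting-started email"
    if fully_onboarded:
        # Ripped through onboarding and is using the service fully:
        # a monetization, referral, or advocacy opportunity.
        return "expand: upsell / advocacy outreach"
    return None

print(pick_campaign(10, 10, False))
print(pick_campaign(30, 1, True))
```

In a real system these rules would be driven by the warehoused usage data, but the shape is the same: a continuous signal triggers a different programmatic touch per situation.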
We integrate with email delivery platforms to make campaign tools much smarter by bringing this data to bear for the sake of targeting or some combination of the two. This is really, for us, the ingredients for scale that helps to solve that multi-tier customer problem where it's a people-led engagement model for high-value customers and a very programmatic-led engagement model for some lower-value customers and/or some mix in between.
Conceptually, that's what our product does. Those are some representative logos of customers of ours, so we actually know how to work with SaaS companies. Alright, that's it, thanks for the opportunity. I'd love to take more questions or comments if you have them.
The question was these types of causal measures might actually pre-date usage altogether, which is exactly right. The onboarding experience, depending on the nature of your software, might be training, integration, setup, lots of things.
There's a lot of variety based on the product in question, but, yeah, that's a really, really critical window of opportunity to get it right that has a profound effect on what happens next, including adoption or not as the case may be.
Yes and no. So let me distinguish between what I consider to be a customer-specific signal versus a signal about your portfolio. If you looked at the sum of all tweets that pertain to your product, to the extent there's any volume, you're probably going to get a directional read on how happy your customers are on the whole.
You could say the same about surveys, by the way, or support tickets or any of these other things. If you're trying to double click and then assess the health and satisfaction of any individual customer at a moment in time, none of those types of data form a signal because they're stochastic in terms of when it happens.
I'm talking about signals that give you a means to assess the health of each and every customer, which probably usage can serve that in the way that other data cannot.
If we take it up a level to think about the customer portfolio, usage plus lots of other things can form a portfolio-level signal. But in the context of customer success and the role that they play, it's most often about owning a set of customer relationships and trying to control the outcome.
In which case, they may or may not be thinking so much about the portfolio as they are, "How happy are my assigned accounts, and can I deliver on the retention and growth numbers that pertain to those that are assigned to me?" be it a segment or, even in many cases, a discrete list of assigned accounts that might number from 10 to 50 to 100.
Like anything as an organization gets larger, specialization begins to set in, and so you could take, literally, hire number one, who is the professional firefighter, and you fast forward seven years later. And that is now seven different jobs, from the renewal sales specialist, to the expansion sales specialist, to the customer-success person who's focused solely on adoption, to the onboarding team, to the voice of customer and advocacy program manager to...
So as your organization grows, you'll probably start cleaving it into distinct roles. But for startup companies of varying degrees of size, this tends to be the first decision point they confront. And probably it's many years later before they begin to add even further specialization beyond that.
So, yes, it's pretty typical, to the extent that you staff a revenue-generating role, to still have a service-oriented actor, although this choice tends to dominate how and where the staffing and organizational attention goes. So I do frame it as a choice.
So the question was Appcelerator and Kissmetrics, specifically, and in both cases it is service delivery. Now, customer success is a great source of leads and revenue opportunities, because they're interacting with customers, and they're inspecting data, and in those interactions, in that data, are all the clues to expansion revenue. But in both cases, somebody else is introduced to the customer to pursue those opportunities. And I'd say that's probably the dominant model; greater than 60% of all customer success teams function in that way.
They might have some variable comp related to the retention rate, but their ownership of the transaction and/or expansion revenue is sitting in a sister function, probably back in sales.
The question is, to the extent that you have two actors, one who is service driven and the other more sales oriented, isn't that an awkward customer interaction? Arguably so. I mean, every hand-off between one relationship owner and the next is an opportunity to get it right or not. So, yes, it is a risk. Conversely, somebody in a service role who has that mindset is probably a pretty poor negotiator, not a commercially minded individual, and is unlikely to be effective at driving a deal to closure, including expansion revenue.
It comes down to who that person is innately and what's a reasonable boundary for what they're expected to do. And our observation is people who are really driven by delivering great service to a customer are often unwilling or incapable of engaging in commercial types of conversations with the customers.
So that's not a good customer interaction either. I guess my first recommendation is get the distinction between the two roles clear, or to the extent you're purposefully commingling them today, know under what circumstances would they be cleaved apart.
Typically customer support is responsible for triaging inbound issues and solving them one at a time, and that's the expectation of the customer to the extent they're in there filing a Zendesk ticket. They want it to be resolved quickly and satisfactorily. It's almost a transaction.
Customer success, to the extent it's a separate role, is generally trying to solve issues that are beyond the realm of an individual ticket. Tickets might be symptomatic of a problem, but they're trying to pull it up to a higher level and understand, "What is it going to take for this customer to succeed with my product? What do we need to do to enable that?" which could include resolving support tickets, but typically is something beyond that.
So the first step is, if it's a combined role, know when and how it would be separated. To the extent it is separate, it's probably the support organization feeding the success organization with clues as to whose engagement they should prioritize, where it seems like it's beyond our ability to make that customer happy by just closing this ticket out. Beyond that, there's a lot of negotiation to be had. But I think that's probably a good way to start.
When the organization is just not able to scale and do it the old way, I mean, literally, it's a lot of changes. It's almost like you get to this point of it's all pent up, and all of a sudden somebody says, "Well, maybe we should hire that person." "Yes, God!" and, you know, "I'm so glad you said that! I'm sick of dealing with this stuff!" right? It's like, "I want to get to my other job."
The tension kind of mounts, and somebody's got to name the issue. But I suspect you'll feel it when it's right, because probably the best symptom is people are not capable of delivering the other things they're accountable for, that form the nucleus of their job, and when you start finding yourself detracting from the ability to do those other things, that's probably a good clue that it's time to bring a person in to own it as a role. It's really subjective. It's typically when the organization reaches that breaking point.
Yeah, I think that onboarding experience, however it is you script it out to be the ideal journey, offers tremendous opportunities to automate: the welcome email, the touch in three days' time.
What we'll often see organizations do is be tempted to wire all those touches into the system itself. In other words, just harden the logic of who's going to get an email and why. Oh, great, we've provisioned a new account. Then they're automatically going to get an email welcoming them to the service, and they're automatically going to get a three-day touch.
The challenge becomes, really beyond the welcome email, that you're going to start to see customers' behaviors vary. In which case, it's really hard to codify, inside your own product, all the logic that says, "Oh, well, in three days' time, if they're active, then they're going to get on this path and get this message, and if they're inactive, they're going to go down that path." And the permutations begin to grow over time.
A lot of organizations start by wiring in those early touches related to the onboarding sequence, and then they'll realize the variability in the customer base post-seven-days or post-two-weeks starts to really go up. In which case, it outstrips the product's ability to keep pace. So I would absolutely think about programmatic touches as they pertain to the onboarding and first-use experience: defining the journeys, what the touches are when everything goes perfectly, and what happens when we're managing to an exception, in the sense of, "Alright, someone's fallen off that path."
What's the intervention? What is that email that's going to try to engage them? Potentially the email may be to offer them a person-based resource to help them get back on that path, like, "Hey, we've noticed you haven't... Dana's standing by ready to help you, and here is her email address, or book an appointment with her."
In some respects, the thing you have to worry least about is the people who stay on the desired journey. The thing you have to worry about most is the people who begin to fall off of that journey and what are those interventions going to look like.
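To make the point above concrete, here's a minimal sketch of that kind of touch logic, with entirely hypothetical field names and a simple activity signal as the only branching input. Even with just three behaviors checked, you can see how the permutations multiply, which is why hard-coding this inside the product itself tends to get brittle:

```python
from dataclasses import dataclass

@dataclass
class Customer:
    # Hypothetical signals; a real system would pull these from product analytics.
    email: str
    days_since_signup: int
    active_last_3_days: bool
    completed_onboarding: bool

def next_touch(c: Customer) -> str:
    """Pick the next automated touch for a customer.

    Each new behavior you branch on multiplies the paths to maintain,
    which is the scaling problem described above.
    """
    if c.days_since_signup == 0:
        return "welcome_email"
    if c.days_since_signup >= 3 and not c.active_last_3_days:
        # Fallen off the path: the intervention, e.g. offering a
        # person-based resource and a link to book time with them.
        return "intervention_offer_csm"
    if c.days_since_signup >= 3 and not c.completed_onboarding:
        return "onboarding_nudge"
    # On the desired journey: nothing to worry about, no touch needed.
    return "no_touch"
```

Again, a sketch, not a prescription: the takeaway is that once the branches start depending on observed behavior rather than elapsed time, this logic is easier to evolve outside the product than wired into it.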
How do you give people goals related to baselining retention numbers that pertain to their job? My philosophy is that goals need to drive individual accountability, but there may be a team baseline that you establish against which people are measured.
Now, if you have tiers you may have different churn rates that are acceptable in lower tiers versus high. Typically the higher churn rates occur in your lower tiers, and when people have invested a lot in your product, they tend to stick around. In which case, that's a subtlety.
I think your best partner in this regard is somebody who is playing a role of sales ops or, potentially, financial planning and analysis if your team has gotten that big to have somebody in a finance role. The reason I bring that up is, let's work back from the board pack, "What's going to get reported out as key board metrics? Who actually understands what those metrics are and understands what the appropriate thresholds are?"
It's very likely that key metrics, like churn and retention and conversion rates, are already being reported to the extent you've got a board now. Maybe you don't, because you're younger, but everybody seems to have a board these days. And whoever is going to do that reporting is probably the person you want to work with to set the reasonable threshold, because they're probably the place where policy is getting set in terms of communicating to the board what our objective is for an acceptable, healthy retention rate versus not.
So have I ever seen a customer success team without a variable component? No. No. Inasmuch as they're always measured by a metric, and the variability we see is how much variable compensation is associated with achievement of that metric. And, in some cases, it could be really low. In some cases it could be quite high. But without that, it's really hard to measure success of the role, because that is the thing that they exist to affect.
How do we get a sales team to care about customer retention in the absence of a customer success team? It's probably the cynic in me, and my answer is probably not. I used to be in sales. It's just antithetical to what they're going to optimize for.
There's one truism I believe in, related to salespeople, which is they will optimize for the incentives they're being provided. So, yes, conceivably, if you have incentives meaningful enough financially to induce them to care for existing customers, then they will start following that behavior. It's really hard for salespeople to exist in those two worlds, though: being driven by acquisition, and compensated in that way, while simultaneously servicing existing customers so as to affect retention.
My opinion is, that's a special form of schizophrenia that you're introducing to them.
So I would avoid doing that, yeah. I would find some other way to provide the service than via the sales organization when their incentives are otherwise to optimize for customer acquisition. Because that's what they want to follow.
Yeah, I mean, clawbacks are absolutely a management tool. They can fulfill a purpose. I would recommend thinking about clawbacks when your customer acquisition costs are extremely high. Because if that CAC is super high, and you find yourself with a mismatched deal, and now you're triaging the crap out of it in the service delivery function, or even post-implementation, the solution for a customer who's been sold at that price point is often really, really expensive, almost at the level of road-map implications.
In which case, clawbacks can become a useful tool. So I would say they're probably proportionate to ACV or first-purchase value, in which case you might consider using them when you've got a really expensive CAC. When your CAC is low, and first-purchase dollars are low, it's not such a meaningful tool, because the implications are so small. It amounts to a wrist slap and not much more. Like, "I'm taking 27 dollars of your pay back." Nobody cares.
It's been great. Thank you for all the great questions. I appreciate the time and the opportunity to brief you.