Mar 20, 2020

[Podcast] Setting up a customer health score that means something

By Brook Perry
Director of Marketing at 'nuffsaid

For executives who are inherently short on time, it makes sense to try and drive toward business metrics that point to where action needs to be taken.

Customer health scores are an example of that. Executives want to quickly know where there’s risk across their entire customer portfolio, and customer health scores are how they do that. 

 

The problem is, customer health scores are often made up of inputs like product usage, which aren’t necessarily indicative of risk. (If your customer isn’t using the product, that is a sign of risk, but if they are using the product that doesn’t mean they’re “green.”) These variables aren’t enough to truly detect risk or prescribe targeted solutions. 

 

Nuffsaid’s CEO, Chris Hicken, recently joined the Gain, Grow, Retain podcast to talk about customer health scores. He explains why customer health scores often aren’t sufficient, and how Success leaders can “feed” those scores data that makes them much more focused and actionable.

 

Listen to the full episode here, or read the lightly edited transcript below.

 


 

 

Jeff: Hello and welcome back to another episode of Gain, Grow, Retain. We have Chris Hicken with us today who is the co-founder and CEO of Nuffsaid, which helps to centralize the world's work apps and focuses people on the work that matters. And we got into a deep discussion with Chris around customer success and how people prioritize their time in that field, how we think about the technology that they have in terms of some AI and machine learning that they're building into Nuffsaid. And then the bulk of the conversation centered around the idea of health indicators, and how we're thinking about trying to make those as proactive as we can and what goes into that. So I hope you all enjoy.

 

Chris: So Nuffsaid started as an idea when I was working at UserTesting, an enterprise software company that helps people build great customer experiences. I joined the company really early, I was employee number five. And what I saw over the course of my almost eight years at the company was that people were increasingly overloaded with information and distracted, and these distractions were coming from all of our communication tools: email, chat, SMS, LinkedIn, weekly reports. Every SaaS platform wanted our attention. And what I found was that, increasingly, people were spending more time at work. Some people were spending 12, 13, 14 hours a day at work, but especially towards the end of my time there, I felt like people were getting less done in their jobs than they ever had.

 

And this problem is going to continue to get worse as more investment dollars are going into software, more different communication channels are coming into our worldview. It's harder to spend time focusing. And so the idea for Nuffsaid and our vision for the future is that we create a brain that sits alongside you at work, and that's an AI-powered brain. And that brain, one, it filters out all of the noise and distractions of your day, but more importantly, it helps you focus on work that matters for your job and your position. So, it's not just about communication that matters, but it's tasks that you can do today to move the needle for your job or for your department.

 

So that's what started the whole concept of Nuffsaid. That's a big vision for a company. And so what we're doing is, we are building the AI, these brains, department by department. So we're building a brain, initially, for the customer success/revenue retention group within a company. But then we're going to develop one for sales, product management, engineering, marketing, et cetera.

 

Jeff: That's really awesome. So the idea then, too, is that you are sitting as kind of the sidecar to myself, whoever the customer success person is. As you're sitting alongside of that, you essentially take all the inputs from all these areas, and you're essentially helping me prioritize my day, think about the things that I have to respond to, and really just helping me be most effective with my time at the office.

 

Chris: That's exactly right. Every day you're going to leave the office having accomplished the most important things you could have done each day.

 

Jeff: I like the vision. Well, I know as we were going back and forth, something I think naturally goes alongside of this, in our world, and a lot of times in the customer success realm, is thinking about health scores and prioritization of accounts, and are we actually driving value for our customer? And knowing that a customer success individual contributor could have anywhere from 100 customers, up to 500, up to 1,000.

 

The idea of a health score has become really prominent in the industry, and I think what we continue to see from our side of things is that they're trying to automate health scores. They're trying to get a lot of things crammed into it, so "I need relationships, I need product, I need feedback surveys," I need all these different things getting into that. So what are your initial thoughts? I don't know if you really feel strongly about an account health score or what you've seen, but what your initial thoughts just around trying to build some sort of score or metric that really helps somebody prioritize their accounts from the customer success side of things?

 

Chris: Well, at a high level I agree, especially if you're an executive. I understand the drive to simplify the overall portfolio health and get to a number or a metric that helps you understand where there's risk. So, I understand why people are driving towards that. The biggest problem with health scores, and I would actually go beyond health scores and say the biggest problem with how customer success is managed today, is that almost all decisions are driven off of lagging indicators or vanity metrics.

“The biggest problem with how customer success is managed today, is that almost all decisions are driven off of lagging indicators or vanity metrics.”

Typically, the four metrics that most companies are driving decisions off of are usage; retention rates, and by retention rates this could be net or gross retention; customer advocacy, which often comes in the form of NPS or referrals; and goals: "Have I added value to my customer, have I helped my customer achieve the goals that they set out to achieve?" All of those metrics can take 6 months, 9 months, 12 months to come to fruition. And in the meantime there's tons of data that the company can be collecting about the relationship with the customer to understand where there's risk in the portfolio.

 

So my take on health scores overall is that, in general, health scores are driven by usage data. Usage data is not a helpful metric in discovering whether or not your account has any risk. And we can go into specific examples of that in a moment if you'd like. And the result is that health scores end up... The VP of customer success or the chief customer officer is not able to leverage the health score today to drive improvement in the portfolio and reduce risk overall.

 

Jay: Let's go back to the idea of even just this whole notion of calling it a health score. That makes it sound like it's a single number. I really like to think about it as a set of indicators, health indicators, even like key health indicators, you could think of it as. Because the reality is we've seen so many of our clients and people in the marketplace just trying to build a score that tells you is the account healthy or not, and then nobody understands what it means.

 

There are two things I think are important. One is being able to tell if you've got a problem, basically predict whether you have a renewal risk. And then two is, what are you going to do about it? If the score is just some black-magic number, then what are you going to do about that? You don't even know where to start looking. So we tend to think about it as a set of health indicators, I guess, so I'm wondering if you agree with that.

 

Chris: I wholeheartedly agree with that. And the data that I'd want to gather during the course of the relationship, the indicators, fall into four buckets of risk: the customer's maturity, the product, the people, and the pricing. That's how I've bucketed them; I've seen them bucketed in different ways. And I think there are questions that you want to answer under each of those four buckets to discover where the true risk is in the portfolio.

 

And to your point, Jay, it's not enough to detect risk. You also have to be able to trigger actions, meaningful actions, that a CSM can take to reduce risk. So these could be company-approved actions, these could be industry best practice actions, but in order to reduce risk there has to be a combination of detect risk, trigger an action. And I think that's kind of our philosophy on this kind of AI that we're building for CS. It has to be able to do both things.
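Chris's "detect risk, trigger an action" loop could be sketched in code as follows. The four buckets come from the conversation, but every indicator name, threshold, and playbook action below is a hypothetical placeholder, not Nuffsaid's actual logic:

```python
# A sketch of the "detect risk, trigger an action" loop described above.
# Indicator names, thresholds, and actions are illustrative assumptions.

RISK_PLAYBOOKS = {
    ("maturity", "champion_left"): "Rebuild the sponsor map; schedule an exec alignment call",
    ("people", "onboarding_stalled"): "Escalate to the onboarding team; reset the success plan",
    ("pricing", "discount_below_floor"): "Flag for renewal review with sales leadership",
    ("product", "no_recent_usage"): "Trigger a re-engagement sequence",
}

def detect_risks(account):
    """Return (bucket, indicator) pairs that fired for this account dict."""
    flags = []
    if account.get("champion_active") is False:
        flags.append(("maturity", "champion_left"))
    if account.get("onboarding_days", 0) > 45:
        flags.append(("people", "onboarding_stalled"))
    if account.get("discount_pct", 0) > 40:
        flags.append(("pricing", "discount_below_floor"))
    if account.get("days_since_last_login", 0) > 30:
        flags.append(("product", "no_recent_usage"))
    return flags

def triggered_actions(account):
    """Map each detected risk to a concrete, company-approved action."""
    return [RISK_PLAYBOOKS[flag] for flag in detect_risks(account)]
```

The point of the structure is that detection and action live in one place: a flagged indicator is never surfaced without a next step attached.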

 

Jeff: Yeah, absolutely. And that's something that we've seen as well: some customers have over-engineered the score so much that there's literally confusion among the CSM team, like, "I don't even know what this score means. I don't know where it comes from, and therefore I have no idea what I should even be doing with the customer," which ends up being the opposite of what you wanted in the first place. So it's just funny how that lack of use manifests itself.

 

Chris: The reality is, it might be overreaching to say this, but most customer success teams are not using their health scores effectively and regularly as a part of managing their business anymore. And, by the way, it's not because the health score tools are bad, it's just that we're not feeding the health score tools the right data, a complete set of data, that's needed to generate an accurate view of the overall customer risk.

 

Jay: Totally agree. I want to dig into your categories for a second because the one that really sticks out to me is pricing. I'm intrigued. Tell us a little bit more about how you think about that.

 

Chris: There are lots of different factors that go into pricing, and I think the first question that you have to ask when going after pricing is, "Does the product solve a severe and ongoing problem for the customer?" Usually you can connect your product to some kind of problem that the customer's experiencing. But you need to understand how severe the customer thinks that problem is, because you can't really come up with a price until you can answer that question effectively. So pricing starts with answering that question. By the way, that question is difficult because a lot of times customers aren't willing to give you an indication of how severe the problem is, because they want the leverage to be able to negotiate price in the future.

 

But it definitely starts with understanding that question. And then the second question is, "Is the problem that's being solved by the product severe enough to justify the price?" So you need to get an understanding of a price banding. If someone rates the problem that they're experiencing a 5 out of 10, what does that mean for your ability to price the product for them in the future, and your ability to grow your ARR with that customer going forward? And then there are other kinds of factors in pricing too. For example, discounting is a big one. And here's what I mean by that.

 

I'm going to go back to an example at UserTesting. We found that customers who were unwilling to invest a certain amount every year to install UserTesting probably weren't thinking about UserTesting as an ongoing product that they wanted to use. They were probably thinking about it more as a single project. Like, "Hey, we're releasing a website, we want to do some user testing. So help me solve some short-term pain, and I don't want to pay a lot to solve that short-term pain." What would happen in the sales cycle is salespeople wanted to get deals done, so they would offer discounts. And when we got below a certain threshold, we found we had dipped below a clear indicator from the market that that customer was unwilling to invest in UserTesting as a long-term solution for their business. So with discounting, there are actually three or four different vectors that you have to look at to determine risk in the portfolio. But that's one of them.

 

And then the last one on pricing is around alternatives. So it's important for you to understand, one, where's competitive pressure coming from? Not only which companies are you feeling pressure from, but what types of pressure are you feeling? Are you feeling pricing pressure? Are you feeling feature-set or total-product pressure? You need to understand what aspects of alternatives are creating price pressure for the customer success team. So those are kind of the four main items. And I think you can collect the data to answer these questions through analyzing communication that's going back and forth with the customer.

 

That's actually a big way to collect that data. I think you can do some clever surveying of customers throughout the course of the relationship, especially earlier when they're not thinking about renewal yet. You can kind of tease out some of how they're thinking about the problem set. Some of it will happen during quarterly business reviews (QBRs). Some of it you'll be able to get during onboarding. But there are four, maybe five, different ways that you'll be able to answer these questions over the course of your relationship with them.

 

Jeff: So two things really come to mind for me as you talked through that. One is, I like the way that you think about trying to gather those points of feedback at varying different areas. Because as we think about the customer journey, not only do we want to think about it from the customer lens (how is the customer actually going through this journey and the flow, how are they going to perceive handoffs, how are they going to perceive these communications?), but an underlying element that's really missed, at least from what we've seen in a lot of our work, is those points in time where we want to gather feedback from the customer and we feel like it's important. And I think a ton of companies have missed that, from our perspective. After onboarding, it'd be great to understand: how did our onboarding and implementation go? How was the configuration process? Was our team attentive, and do we feel like it's the right process for them to go through?

 

And same thing at varying points. Like you said, even talking about pricing. The second thing that comes to mind is, we did a really cool project a couple years ago for a company around pricing and their willingness-to-pay survey, and really tried to align it. They actually had a very similar challenge to what you described, where they had a product that could be used on an ongoing basis by some large enterprise-type clients. They also had a product that could be used by a small mom-and-pop, one time, for one project, and they allowed people to move in and out of their contracts on a monthly basis. So you'd see revenue spike and decline a lot in monthly cycles.

 

Literally, if I was their CFO, I would probably have a heart attack, because you'd see a ton go up and go down, and you had no consistency. And so part of our project was to try and figure out what those thresholds are that people are willing to pay for certain features and for certain outcomes they're looking to achieve, actually talking to customers themselves about how they were using the product to deliver an outcome and what they were trying to achieve. So that was the second thing that comes to mind.

 

For us, I thought it was a really cool project because you actually got to see these varying points of feedback come together, and really how the pricing mattered at the end of the day in order to drive the right packaging and deliver that to the customer appropriately. And then getting the sales team to buy into that: you can't sell this customer this type of package or product. So there are other things too. Jay, I'm curious what you thought of that.

 

Jay: It is the most valuable project we've ever done, but we're not a pricing firm. It had such a huge impact on customer success that we felt compelled to do it in that case. We'll probably never do another one, although the ROI was so huge for that company, looking back 18 months, that we probably should do more of that work. But it's not just the price point, it's the structure of the agreements and how they factor in with what the rest of the marketplace is doing around that type of solution. So I like your categorization. I hadn't ever thought of putting all those things in a pricing category.

 

Chris: Yeah. It's a product-market-fit question, in a lot of ways: the pain that's being solved, how severe the pain is, what you're willing to spend to solve that pain. And the way that you ask the question, and I think you guys both brought this up, the way that you ask the question and the timing of asking it is very important. Because if you ask at the wrong time, if you ask that question too close to renewal, the customer will not give you honest feedback, because they want to have pricing leverage come negotiation time. So how you ask the question, and when you ask the question, that's really important to get accurate data that you can take action on in the future. And the fact that you, as a customer journey, customer success focused firm, can do pricing as well is absolutely huge.

 

I think it's one of the few things that customer success teams can do to dramatically improve their results. And it's one of the last things that companies actually pay attention to, because pricing is almost always created by the marketing team, driven largely by very vocal salespeople. And the salespeople are not thinking about what happens post-sale, once the product is being adopted, and the levers that you can use to drive increased adoption and revenue growth within the team. So I guess what I'm saying is, I wholeheartedly agree with you that it's a huge opportunity, and it's awesome that you provide that service.

 

Jeff: You mentioned earlier that usage can largely be a lagging indicator. And really, when you're measuring usage, you're looking more at logins and where they're using the tool, and not necessarily what they're achieving or the value they're getting in terms of solving the problem. So right now, what are some of the categories you think of in terms of usage, and how do you try to make usage a leading category, if it can be?

 

Chris: Well, I don't think it can be leading, but the standard usage metrics, I think, are good, which are breadth, depth and frequency of usage. I think all companies have some flavor of those three categories when they're measuring usage. Here's the problem with usage with respect to detecting health risk, and I've found this to be true for most companies I've spoken to: if the customer is not using the product, obviously you've got a problem. So red alert, figure out what went wrong with that customer. But if the customer is using the product, who knows whether or not they're going to renew. Nick Mehta at Gainsight wrote an article in October last year with a similar sentiment. And here's an example of what I'm talking about. This happened to us at UserTesting, especially in the early days, less so more recently.

 

We would have customers that were massive evangelists of the product. They were using it every day, multiple times per day. They would share the insights across the company, into the product and marketing teams and the UX and design teams. And then one day we'd come to work and that person had left the company. And guess what? That account, that several-hundred-thousand-dollar account, is gone all of a sudden, and my health score said it was green. So how did we go from a green account to suddenly gone? Those are the types of problems that happen when you are looking at usage. The only time you're identifying risk is when the product is not being used. If the product is being used, you have to look at all of the other factors, what I believe are the true indicators of risk, to detect whether or not you actually have a problem in that account.

 

Jay: Yeah. And everybody has different indicators of risk or success, is what it sort of comes down to. Similar story: I used to work for a company that sold HR technology to companies that employed hourly workers. If you didn't use the technology, you couldn't hire people to staff your business. So, literally, it was compulsory. You had to use it. So we would never know; usage always looked good. But then the company would get bought by somebody, the business changed dramatically, we'd lose the stakeholder, to your point, and all of a sudden there's a new competitive threat in there at renewal. So yeah, usage alone doesn't tell you anything.
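The breadth, depth, and frequency metrics Chris names could be computed from raw product events roughly like this. The event schema is a hypothetical illustration, not any particular product's analytics model:

```python
def usage_metrics(events, period_days=30):
    """Compute breadth, depth, and frequency of usage from raw product events.

    Each event is a dict like {"user": ..., "feature": ..., "day": "2020-03-01"}.
    Field names are illustrative assumptions.
    """
    users, features, active_days = set(), set(), set()
    for event in events:
        users.add(event["user"])        # breadth: how many people use it
        features.add(event["feature"])  # depth: how much of the product they touch
        active_days.add(event["day"])   # frequency: how often they come back
    return {
        "breadth": len(users),
        "depth": len(features),
        "frequency": len(active_days) / period_days,
    }
```

Note that all three numbers can look perfectly green while the account still churns (a lost champion, an acquisition), which is exactly the point of the stories above: usage only reliably signals risk when it is absent.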

 

Jeff: So I guess the question then is, how do we layer in relationship indicators? There are some account management things: we know there's a competitor in there, and we know that this account falls into a segment that is highly competitive with somebody in our space. But how do you incorporate the relationship factor in there? Are we too dependent on a relationship and not dependent enough on the core value that the product provides? Do we not have enough relationships globally? That type of thing.

 

Chris: One of the four buckets that I mentioned at the beginning was customer maturity. The maturity of the customer is the customer's ability to adopt your product and get value out of it over the long haul. That's how I think about maturity. So one of the questions that you need to answer under maturity is, "Who advocates for the product, what's their authority, and has it changed?" And by the way, change could be new boss, company was acquired, any change in the authority of your main points of contact.

 

And so I think it's up to every company to decide what the playbooks are, depending on whether you have a low-influence stakeholder in the company, or you don't have enough stakeholders. Maybe two isn't enough; maybe you need seven. Maybe you need at least a director in order for your product to renew, because if you have a manager or below you don't renew. I've seen companies... I like the simple one, the one-two-three, which is one executive spons-

 

Jay: Yeah.

 

Chris: You've seen that one?

 

Jay: I just saw that the other day. Say it though. It's really good.

 

Chris: One executive sponsor, two champions, and three power users. And if you have that structure in place you can suffer the loss of any one of those and still recover the account.

 

Jay: Yeah. It's like a nice solid teepee. Triangle or whatever.

 

Chris: And by the way, that's one example that companies are going after. I think every chief customer officer has to come up with their own kind of standards of what they expect upon renewal. And that would be one of the four or five questions that you'd want to answer under the maturity bucket to detect if you have maturity risk in the account.
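Chris's one-two-three rule (one executive sponsor, two champions, three power users) could be expressed as a simple coverage check over the account's contact list. The role labels and record shapes are illustrative assumptions, not a prescribed schema:

```python
# Minimums come straight from the one-two-three rule of thumb above.
REQUIRED = {"executive_sponsor": 1, "champion": 2, "power_user": 3}

def stakeholder_gaps(contacts):
    """Return the roles where an account falls short of the 1-2-3 structure.

    `contacts` is a list of dicts like {"name": ..., "role": ...};
    an empty result means the account can lose any one person and recover.
    """
    counts = {role: 0 for role in REQUIRED}
    for contact in contacts:
        if contact["role"] in counts:
            counts[contact["role"]] += 1
    return {role: need - counts[role]
            for role, need in REQUIRED.items()
            if counts[role] < need}
```

A check like this makes the maturity-bucket question ("who advocates for the product, and has it changed?") something a CSM can answer at a glance instead of rediscovering at renewal.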

 

Jay: I know we want to move on to a different topic, but how do you differentiate that from the people category then?

 

Chris: You actually brought this up recently, Jay. Actually, you both brought it up. The topics under people are the quality and speed-to-value of the onboarding; the quality of ongoing training; how the customer feels about their experience with the CSM, the services team and support. And then finally a big one is the trust factor, which I think largely goes missed by most companies: "Does the product and service match or exceed the customer's expectations?" This happens a lot when the salesperson has oversold the deal, and then the customer gets to the customer success team and realizes, "Oh my gosh, this is not what I signed up for."

 

Talk about destroying trust in the relationship right off the bat. The same thing can happen at renewal time. The CSM wants to get the renewal, so they overpromise that a feature can be delivered on time or that uptime will improve. And there you go, your trust is out the window and, again, the account is at risk. So those are the topics that I put under people. You might track those under a different bucket, but I put them under the people bucket.

 

Jay: I like it.

 

Jeff: And I think one of the things that we see as undervalued, which I like that you specifically call out, is the time to value, almost the time to first realized value, and getting somebody through that process successfully. And then the training piece. We worked with a client recently, and they recognized that they had a lot of churn happening at year-one renewals. And what they did is they actually traced that back, through a series of surveys, to the onboarding piece. They had done surveys right around onboarding, after these companies got implemented, and some of the questions were fairly simple and straightforward.

 

It was like, "Do you feel like you're successfully trained in our product to achieve your desired outcome? Have we successfully delivered training? How quick was our time to value?" And pretty quickly what they realized is that they weren't delivering any of those things adequately enough, or in a timely enough manner, in onboarding. So it was a really cool way to dissect, which I think you're getting at as well: here's churn at year-one renewals, let's actually trace it back to an onboarding activity, which seems kind of backwards. But it was so impactful that they dedicated a session to it. We had 20 people going through a journey design specifically around that onboarding piece: let's map the first 30, 60, 90 days of a customer, and really look at what we're doing, how to coordinate all of these things and make sure it's working appropriately.

 

Chris: That is a perfect example of using a survey. Normally you have to wait 12 months to see whether or not the onboarding process worked. But here's an example where you guys helped a company deploy a survey to get feedback about their onboarding process immediately, and that led to action to improve the overall customer experience and customer journey. So I love that. That's what Nuffsaid is all about: detect problems early and trigger actions. And that's what you did.

 

Jeff: So an interesting topic that we haven't really covered, and that I feel very uneducated on, is the AI side, the artificial intelligence side of the world. I know you called that out specifically as something that you guys are leveraging at Nuffsaid, and in how you're thinking about rolling out the product. So maybe give me a short education on how you've come to think about using AI, and using that type of technology to help drive some of the decision-making and actioning you can get from the data you're pulling in.

 

Chris: AI is probably overused as a term by most companies. A lot of times what's actually being deployed is machine learning. But in order for the computer to help you make decisions and help you focus, it needs a massive dataset to review, to see what types of communication and information lead to activities and outcomes for the role and the department. So, for example, for our product, what we're doing in the early days is feeding the AI communication data, Salesforce data, Gainsight data, Zendesk, Mixpanel, all these places where data is being collected. We're feeding all that to the brain, and we're looking at what types of behaviors or activities each CSM is doing. And then we're comparing that CSM's long-term success to other people that look like them: other software companies, large software companies, or small SMBs.

 

So what we're doing is putting some parameters into our machine learning algorithm to sort and compare different types of users with each other, and with users from other industries. And then over time, and again, sometimes it takes months, even a year, to get it right, but over time the computer can do a good job of identifying trends, and then surfacing actions and activities that should happen based on best practices it has detected elsewhere. And that's really all AI is doing: it's allowing the computer to learn from trends that it's seen across a large dataset.
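The peer-comparison idea Chris describes could be sketched as follows: represent each CSM as a feature vector, find the most similar peers with better retention outcomes, and surface activities those peers do that this CSM doesn't. The data shapes and the similarity measure here are toy assumptions, not Nuffsaid's actual pipeline:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def suggest_actions(csm, peers, top_k=2):
    """Surface activities done by the most similar, higher-performing peers.

    Each record is a dict like
    {"features": [...], "retention": float, "activities": set}.
    """
    better = [p for p in peers if p["retention"] > csm["retention"]]
    better.sort(key=lambda p: cosine(csm["features"], p["features"]), reverse=True)
    suggestions = set()
    for peer in better[:top_k]:
        suggestions |= peer["activities"] - csm["activities"]
    return suggestions
```

A real system would learn which activities actually drive the outcome rather than just copying them, but the compare-to-similar-peers-and-surface-the-gap shape is the same.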

 

Jeff: One of the things that Jay and I have been talking about recently, too, is interesting industries, or companies that are solving interesting problems in their industries. So, outside of what Nuffsaid is doing, have you thought about other applications where AI, or machine learning, whichever verbiage, is maybe underleveraged right now? Where there are massive data sets and it's kind of greenfield?

 

Chris: Yeah, that's a great question. I would say, just in general, this concept of AI is so new. There is no winning use case for AI yet. So I'd say everything is greenfield, and everyone's trying to figure out what the killer use cases for AI will be. I think we have one of them. But I think AI, probably in the early days, will show up more in consumer products, because the datasets are much larger for consumer companies than B2B companies. So things like self-driving cars, or Amazon with shopping algorithms, I think those companies are more likely to come up with early versions of AI.

 

By the way, Amazon right now doesn't run on AI at all. Their shopping algorithm is all rules-based, but they have an opportunity to introduce some AI into their platform in the future. So, long story short, I don't think there is a killer use case today. I think if I were an entrepreneur thinking about AI only, and not other problems, I'd probably do a B2C company first because of the access to massive data sets. So I'm excited by the technology, and we'll see where the road goes.

 

Jay: And Chris, maybe you don't know this yet, maybe you're still figuring out, but who is your ideal client at Nuffsaid?

 

Chris: So in the early days it's probably similar to how you think about your target customer. It's going to be a mid-size company, let's say 10 to 50 CSMs, and probably doing somewhere between $20 and $120 million in revenue. They're more likely to be a software company, but they could also be in financial services. Any company that manages a portfolio of customers would benefit from having this kind of brain helping focus their time on different parts of the portfolio.

 

Jay: Got it. One of the things that we've started to see, hear about, talk about and explore more is this idea that even companies that have not traditionally been software companies are becoming software companies. Maybe one of the most prominent examples, if you look on Zuora's website or some of the others, is Briggs & Stratton. They're becoming a "lawn care company," not just a lawnmower company. They're thinking about the customer and not just the product.

 

And so, to me, and maybe we have a biased view of this, customer success gets way bigger from here, not smaller, even in the B2C realm, because of the way that industries are thinking about the customer experience, the long tail of their customers, and the lifetime value of their customers as being really valuable, and selling services, not one-time products. It's going to necessitate it. So I was just curious; that's why I asked you the ICP question, because I'm curious if you guys have looked much outside of tech yet.

 

Chris: We haven't, but only because we're early on. But I like your perspective. We call customer success what it is today because it's born out of an account management function; it was traditionally a one-to-one relationship with the company. For SMB companies or B2C companies, we called that function customer support. But, functionally, they were responsible for creating the same types of delightful experiences as customer success, just in a lower-touch way.

 

So that's a really interesting perspective: that long-term, customer success could be expanded to even low-touch experiences as well. I haven't spent much time, honestly, thinking about what that could look like, but it's a really interesting perspective. And even for a company like Briggs & Stratton, it's true. They don't want to sell snowblowers, they want to sell a clear driveway. That's the solution to the problem. And so, if you think about the problem that way, it's not just about selling a product with some support, it's "Let's provide a solution to the problem for customers."

 

Jay: And even companies like GE Digital... I think GE Digital is not quite a thing anymore. But airplane engines are basically on lease now, and there's all kinds of telemetry data that comes with them, and a whole solution that comes with the hardware. So even in B2B. If you just go search LinkedIn, and I know there are some studies out there too, I think Gainsight did one of them, on the growth of customer success as a profession, it's massive. And I agree with you, I think some of it is just a conversion: people are converting account management into customer success. But they're slowly starting to change the mindset around it as well, this whole customer-centric mindset. And so the good news is, for what you're doing, it feels like your TAM is growing naturally, not staying stagnant, and for us too, I guess, for that matter.

 

Chris: Yeah. And remember, we're building a brain for each department in the company. So obviously we're starting with customer success, but the way that we grow our team long-term is by providing that same brain to other functions. But going back to this customer success concept, I'm curious, this is a very interesting idea: that customer support goes away completely, and the only function you have in your company is customer success. You have a low-touch customer success team and a high-touch customer success team. When you're thinking about it in your head, is that how you were positioning it?

 

Jay: Not quite. I don't think support goes away. We use the phrase "dial tone" all the time to talk about support, because half the time when we walk into a company that's trying to get more proactive with their customers and more engaged and strategic in how they're driving that relationship forward, if support's busted or if engineering isn't able to fix things fast enough, the whole thing implodes and everybody becomes support.

 

We were working with a company the other day, and their CSMs have their own tracking spreadsheets for support tickets because they're having some challenges with support right now. They're all being drawn down, knocked down a rung on the ladder, so they can't be strategic. So I actually think support is critical to having a great customer success strategy, where customer success strategy means we're engaging with our largest customers one-on-one, and we have strategic, proactive, one-to-many engagement in our mid tier and SMB tier, so that we can drive retention, growth, and advocacy in those accounts. And that's just a wholly different thing to me than support. So that's my strong opinion on it. I don't know if you'd agree or not.

 

Chris: Yeah, sorry, I kind of took us off topic, but I was inspired by the idea that maybe, at some point, we think of all of our customers as deserving of customer success, and that what we're all ultimately aspiring toward is solving a customer's problem. And maybe in the future we don't have a differentiation between support, which we oftentimes think of as a cost center in the company, versus customer success, which we think of as a revenue-generating function. So anyway, I took us way off topic here, but it's kind of an interesting thought exercise.

 

Jeff: It's great. And I think if you're going to take the time to sell a customer, then, yes, they all deserve customer success. When we translate that, it doesn't mean you get a dedicated person with that title. It means you get the outcome, however that gets delivered. And that may be through self-service, where we're feeding you programmatic tools to onboard appropriately; it's going to come in a lot of different shapes and sizes. CSMs are just one small facet of a whole-company focus on customer success. I know I'm preaching to the choir here.

 

Chris: And by the way, we might end up cutting this section out and that's fine. Where I'm going with this is, even at UserTesting, for example, in our SMB group, we couldn't justify the cost of having a full-time dedicated CSM for all of our SMB clients. Financially it didn't make sense. So one CSM was responsible for 250, 300, sometimes 500 customers when we were stretched too far. So at that point, what is that job? Is it customer success? Is it customer support?

 

I don't know. The lines start to get blurred when you go from the high-touch, traditional enterprise CSM, to an SMB CSM with 500 accounts, to support, where one person might manage 1,000 or 2,000 customers over the course of a year. Interesting topic. Probably not one for today, but maybe there's some interesting content there in the future.

 


To listen to more episodes with Jeff and Jay on the Gain, Grow, Retain podcast, subscribe here.

 

And if you liked this episode, you may also enjoy reading Chris’s post on Why Customer Success Leaders Aren’t Getting a Seat at the Table.