Learning why your customers cancel is painful, but it’s unquestionably valuable. Here’s how we do it.
This is part fifteen in our ongoing series, Journey to $100K a Month.
“Truthfully, it sucked. We couldn’t deal with the constant bugs. The app isn’t ready for primetime.”
As a founder, one of the most painful things in the world to hear is criticism of your baby.
Especially sharp, stinging criticism from a customer that you’ve now let down.
In our very early stages — when just about every element of Groove deserved criticism — I was terrible at handling anything negative being said about us. It cut deep, and I almost always let it get to me. Even worse, I did nothing to systematically collect and measure the feedback I was getting.
There’s no way around it: it still sucks when people point out where you’ve failed them.
But actively collecting and leveraging that feedback has become one of the most important drivers for continuous improvement at Groove.
And by testing, measuring, and iterating on the way we collect (and act on) negative feedback from customers who cancel, we’ve been able to improve customer satisfaction and retention, keep Groove growing, and even bring back some of the customers who left.
The Customer Exit Survey
At first, we had no system in place for collecting feedback from customers who closed their accounts.
But after seeing exit surveys from countless apps I had signed up for and canceled, I decided to give them a try.
We studied dozens of surveys and put together one of our own.
It was a single question (why did you cancel your account?) with a drop-down menu of options we had already been hearing from customers: reasons like “too expensive,” “didn’t get value out of Groove,” “chose another solution,” and a few others.
We sent this survey to every customer who canceled. We even tested four different emails to get people to respond.
The result?
A whopping 1.3% survey completion rate. Pretty awful, and on top of that, the data was practically useless: responses were split almost evenly among the top three choices.
After this first test, we had little data and nothing to act on, but we weren’t done testing.
Takeaway: While we didn’t get tremendous results from our closed-ended survey, we did clearly see the potential for gathering exit data. And don’t discount the approach entirely: closed-ended surveys didn’t work for us, but plenty of very successful companies use them, so they may work for you.
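If you do want to try a closed-ended survey, there’s not much to build. Here’s a minimal sketch of the structure ours boiled down to; the question and the first three reasons are the ones described above, while the last two options are placeholders you’d swap for whatever you actually hear from your own customers.

```python
# A closed-ended exit survey is just one question plus a fixed list of answers.
# The first three reasons match the ones described above; the last two are
# placeholder examples, not our actual option list.

EXIT_SURVEY = {
    "question": "Why did you cancel your account?",
    "options": [
        "Too expensive",
        "Didn't get value out of Groove",
        "Chose another solution",
        "Missing a feature I need",   # placeholder
        "Other",                      # placeholder
    ],
}

def render_exit_survey(survey: dict) -> str:
    """Render the survey as plain text for an email or a simple form."""
    lines = [survey["question"], ""]
    lines += [f"  [ ] {option}" for option in survey["options"]]
    return "\n".join(lines)

if __name__ == "__main__":
    print(render_exit_survey(EXIT_SURVEY))
```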
An Open-Ended Breakthrough
After thinking about how we could get better insight from our exit surveys, we decided to sacrifice our goal of getting a neatly quantifiable data set, and instead see what ex-customers had to say when we didn’t pre-fill their answers.
We sent out a simple email asking a single open-ended question: why did you cancel?
Not only was the response rate nearly eight times greater at 10.2%, but we were finally starting to get real, actionable data.
Specific bugs that our active customers weren’t telling us about.
Hang-ups in our user experience that we didn’t catch.
Workflow inefficiencies for use cases that we had never considered.
Now we were getting somewhere.
Takeaway: By removing the pre-filled answers in our exit survey, we were able to unlock loads of valuable — and actionable — data.
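Open-ended replies are messier to work with than a drop-down, but a little tagging makes the patterns jump out. Here’s a rough sketch of one way to bucket free-text answers; the categories and keywords are illustrative examples, not the taxonomy we actually use, and nothing replaces reading every reply yourself.

```python
from collections import Counter

# Rough sketch: bucket open-ended exit-survey replies by keyword so the biggest
# problem areas float to the top. Categories and keywords are examples only.
CATEGORIES = {
    "bugs": ["bug", "broken", "crash", "error", "glitch"],
    "user experience": ["confusing", "hard to use", "couldn't figure", "clunky"],
    "pricing": ["expensive", "price", "cost"],
    "switched products": ["switched", "another tool", "competitor"],
}

def categorize(reply: str) -> str:
    """Return the first category whose keywords appear in the reply."""
    text = reply.lower()
    for category, keywords in CATEGORIES.items():
        if any(keyword in text for keyword in keywords):
            return category
    return "other / needs a human read"

def summarize(replies: list[str]) -> Counter:
    """Count replies per category so the biggest issues surface first."""
    return Counter(categorize(reply) for reply in replies)

if __name__ == "__main__":
    sample = [
        "Kept hitting a bug when assigning tickets.",
        "Too expensive for a team our size.",
        "We switched to another tool that fit our workflow.",
    ]
    print(summarize(sample).most_common())
```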
A Simple A/B Test That Nearly Doubled Conversions
We set out to optimize the survey even further, and tested nearly half a dozen variants.
The winner looks exactly like the email above, except for one small difference.
Instead of “why did you cancel?”, we ask “what made you cancel?”
A tiny difference in framing nearly doubled conversions, and the latter email got a roughly 19% response rate.
I don’t have a deep understanding of the why, but I suspect that “why did you cancel?” simply sounds more standoffish and puts the reader on the defensive, whereas “what made you cancel?” doesn’t have the same accusatory tone.
Takeaway: It’s not just doing a survey that’s important; you need to optimize your question(s) to ensure that you’re asking them the right way. A simple wording change can make a big impact on responses.
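If you want to run the same kind of test, the mechanics are simple: randomly assign each canceled customer one wording, track sends and replies per variant, and compare response rates once you have a decent sample. Here’s an illustrative sketch (not our actual tracking code):

```python
import random
from collections import defaultdict

# Minimal sketch of the A/B test described above: two question wordings,
# random assignment, and a response rate per variant. Illustrative only.
VARIANTS = {
    "A": "Why did you cancel?",
    "B": "What made you cancel?",
}

sends = defaultdict(int)      # emails sent per variant
responses = defaultdict(int)  # replies received per variant

def assign_variant() -> str:
    """Pick a variant at random; a stable hash on customer ID also works."""
    return random.choice(list(VARIANTS))

def record_send(variant: str) -> None:
    sends[variant] += 1

def record_response(variant: str) -> None:
    responses[variant] += 1

def response_rates() -> dict[str, float]:
    """Replies divided by sends for each variant."""
    return {v: (responses[v] / sends[v] if sends[v] else 0.0) for v in VARIANTS}
```

A gap as wide as the one we saw is hard to miss, but with smaller samples you’ll want to let the test run long enough (or reach for a significance test) before declaring a winner.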
The Next Level: Customer Exit Interviews
Story time.
A while back, I heard a great anecdote about self-help guru (and brilliant entrepreneur) Tony Robbins. I don’t know if it’s true or not, but I think there’s an important lesson here.
When Tony was starting out as a speaker, he would approach one or two audience members after his talks and ask them two questions:
“What did I do well, and what could I do better next time?”
Of course, he got lots of glowing praise from folks who were too polite to criticize him to his face, but getting negative feedback proved more difficult. People would often say things like “that was great, your style is invigorating and inspirational, and I wouldn’t change a thing.”
But that wasn’t good enough for Tony.
He’d push: “I appreciate that, but this conversation isn’t over until you tell me one thing I can do better the next time I give this talk.”
Tony didn’t ask for feedback. He demanded it. And it paid off: he used that negative feedback to improve every single time, and he’s now one of the most successful (and highest-paid) speakers in the world.
Of course, it’s really hard to be demanding in an email survey. But I wanted to try and put Tony’s experience to work.
We began emailing customers who canceled, asking to set up a five-minute exit interview. We promised that we wouldn’t pitch them, and that we just wanted to learn how we could make Groove better.
While the response rate wasn’t great (around five percent), we did have some really valuable conversations and got great, honest feedback from former customers who didn’t always give us straight answers right away. Tony’s technique definitely works.
With that said, we don’t do these anymore.
The marginal value over the open-ended email survey simply isn’t there, and the resource commitment is far higher.
I’m still glad we tested it, and I’m certain it would work well for businesses with smaller customer bases (like consultancies or agencies).
Takeaway: Demanding feedback in a live conversation definitely works. Whether it works at scale depends on your business; we get enough value out of email surveys that it doesn’t make sense for us, but exit interviews could still be the best fit for you.
How To Apply This To Your Business
Since we started doing open-ended exit surveys eight months ago, we’ve been able to make a lot of positive changes and fixes to Groove. Retention, along with many of our usage metrics, has improved as a result of some of these changes.
We’ve even started testing recovery campaigns for former customers whose issues we’ve fixed; I’ll write about that in a future post, but the early results are very promising.
Taking criticism is hard, but it can be one of the most effective ways to improve your product. And with automated open-ended exit surveys, it’s really easy to do.
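To give you an idea of how little is involved, here’s a minimal sketch of an automated version. The cancellation hook and email sender are hypothetical stubs, and the copy is a paraphrase rather than our actual email; swap in whatever billing system and email provider you already use.

```python
# Minimal sketch of an automated open-ended exit survey. The cancellation hook
# and send_email stub are hypothetical -- replace them with your billing
# system's webhook and your actual email provider.

def send_email(to: str, subject: str, body: str) -> None:
    """Stub. Swap in your email provider's API call."""
    print(f"To: {to}\nSubject: {subject}\n\n{body}")

def build_exit_email(first_name: str) -> str:
    """One open-ended question, using the wording that won our A/B test."""
    return (
        f"Hi {first_name},\n\n"
        "Sorry to see you go. If you have a second, I'd love to know: "
        "what made you cancel?\n\n"
        "Any detail at all helps us get better. Just hit reply.\n"
    )

def on_account_canceled(customer: dict) -> None:
    """Hypothetical hook that your billing system calls on cancellation."""
    send_email(
        to=customer["email"],
        subject="Quick question",
        body=build_exit_email(customer["first_name"]),
    )

if __name__ == "__main__":
    on_account_canceled({"email": "jane@example.com", "first_name": "Jane"})
```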
If you’re not already doing it, I encourage you to give it a shot. If your experience is anything like Groove’s, you’ll get a ton of new insight that will help you hold on to the customers you still have.
Finally, An Announcement
Groove is a customer support company, though we don’t talk about that too much on this blog.
And we don’t plan to; this blog is, and always will be, about the growth of our business.
But with more than 1,000 customers, hundreds of support-focused tests and millions of data points, we do have a lot of valuable support insight into what works and what doesn’t. And now, we’re chronicling that on the Groove Customer Support Blog.
Just like this blog, we’ll be posting weekly. And just like this blog, we’re not going to pitch you on our product. Just like this blog is focused on sharing our experiences to make you better at business, our support blog will share our experiences to make you better at support.
Check it out and subscribe if you’d like the weekly posts emailed to you. The support blog email list is separate from this one, so you can choose which content you want.
I hope you enjoy it, and I hope you’ll let me know what you think.