As a Subscription Video On Demand (SVoD) service, we consider subscribers the lifeblood of our business. And as we are still, very much, the new kid on the block, we need to work hard for every single user we onboard.
Therefore, any sharp spike in churn (cancelled subscriptions) can be catastrophic to us.
Reflecting on October, things were going well: we were growing 4% month-over-month, but we knew that we could do better. Black Friday was right around the corner and we had a bulletproof marketing plan.
The biggest weekend of the year comes and goes, we break all of our records, the numbers keep on climbing and then we hit January…
January provided us with that catastrophic spike!
There were a lot of questions, many of which we didn’t have immediate answers to.
So throughout the rest of this article, I will go through how we found the answers to these questions, what tactics we employed, and ultimately how we went about decreasing churn when things were starting to slip away from us.
It’s around this time that we started using Baremetrics. While using Recurly, we knew exactly how many subscribers were coming and going. However, the depth of knowledge just wasn’t there.
The ease of integration with Baremetrics made it a no-brainer for us to give it a shot, further our analytical capabilities and start to understand the trends in our subscription data.
I am glad to say the insights provided by Baremetrics changed the way that we went about our acquisition and retention, and gave us a better understanding of why our subscribers were leaving.
The graphs and tables that Baremetrics can produce make visualising information far easier than simply looking at numbers and spreadsheets.
Meanwhile, the ability to annotate graphs has been integral in helping us start to determine trends within our users’ behaviour.
Returning to Black Friday, we acquired an extraordinary number of new subscribers during our sale, with these numbers continuing to grow well into December.
We were steadily acquiring subscribers well under our target Cost-Per-Acquisition (CPA) of £30, so we decided to increase our budget on numerous occasions.
One example was when a major storm hit: we increased our budget substantially to target those stuck at home. This tactic had a great ROI, with our CPA dropping to £15 over that weekend.
We found that repeating this tactic in similar circumstances worked well initially. However, we quickly noticed that we couldn’t rely on it, and that it had a knock-on effect on the number of subscribers cancelling their subscription.
The cancellation rate started to rise rapidly – up from 6% to 9% in a matter of weeks.
Whilst our cancellation rate was increasing, we also noticed a large number of trialists churning out. It was evident that we could acquire customers with ease, but we needed to seriously work on our retention techniques, starting with our onboarding process.
Using Recurly, we were limited to simply tracking the number of subscribers, our Monthly Recurring Revenue, and churn rate. Armed with these statistics alone, we were severely underprepared to properly understand our customers’ behaviour and start fighting churn when this happened.
This resulted in us undertaking a thorough analysis of our data.
Within this, we were able to pinpoint the three biggest metrics that we needed to monitor when dealing with customer churn.
Customer Lifetime Value – Our monthly LTV peaked at £68 in November but quickly dropped to £43 within a month.
Cancellations – Our total cancellations rose from ~350 to ~530 during December.
Trial Conversion Rate – Our conversion rate dropped from 80% to 40% in the space of a fortnight.
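These three metrics are straightforward to compute from raw subscription data. As a rough sketch (the formulas and figures below are illustrative, not Baremetrics’ actual calculations – in particular, the simple LTV = ARPU / churn model ignores plan mix and discounting):

```python
def churn_rate(cancellations: int, paying_at_start: int) -> float:
    """Monthly churn: cancellations as a share of the paying base."""
    return cancellations / paying_at_start

def trial_conversion_rate(converted: int, trials_ended: int) -> float:
    """Share of finished trials that became paying subscribers."""
    return converted / trials_ended

def lifetime_value(arpu: float, monthly_churn: float) -> float:
    """Simple LTV model: average revenue per user / monthly churn."""
    return arpu / monthly_churn

# Illustrative numbers loosely echoing the figures in this article
print(f"{churn_rate(530, 5900):.0%}")             # 9%
print(f"{trial_conversion_rate(400, 1000):.0%}")  # 40%
print(f"£{lifetime_value(3.87, 0.09):.0f}")       # £43
```

The subscriber counts and ARPU here are hypothetical; the point is that each metric needs only a handful of fields from your billing data.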
Once we had identified the key metrics that we needed to monitor to stay on top of churn, we were then tasked with identifying what was causing these negative effects.
It’s important to group certain metrics to begin understanding how you can start fighting churn with data.
For example, it wasn’t until we were able to visualise our data on graphs that we started to notice the following trends.
A lucrative sale hurt the retention of our existing customers
The end of a sale had a catastrophic effect on the number of trialists cancelling
An extended trial period simply meant that customers had more opportunity to leave us without ever becoming a paying subscriber
When identifying trends, it’s important not just to look at the data, but to identify what external causes impacted user behaviour. Some fundamental factors that must be considered are:
What marketing activities were undertaken?
What promotions were available?
What product & service amendments were made?
You must acknowledge all internal and external factors when viewing your data as these will provide context and reasoning to any abstract behaviour that you uncover.
History Hit uses Baremetrics to create graphs and dashboards to track churn. Want to do the same with your business?
As mentioned above, our key churn metrics are CLV, subscription cancellations, and trial conversion rate. So when we noticed negative trends in all of these areas, we kickstarted our battle against churn.
Our rapidly decreasing CLV was perhaps the most worrying statistic of all. In the space of under two weeks, it decreased by 33%.
We pored over the numbers and were able to identify the key contributing factor: the end of customers’ free trial periods. Our cost per acquisition soared above £30, so we began to make very slim profits from acquiring new customers.
The number of cancelled subscriptions increased substantially over a few months. This was a direct result of our previous marketing strategy, where, due to our limited marketing budget, we would entice customers by offering significant discounts.
We saw a sharp increase of ~66% in cancellations in the space of a month and unfortunately, it settled at this higher plateau.
Thankfully, our churn rate remained constant due to increased acquisition; however, the absolute number of cancellations was steadily rising to an almost untenable level.
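The gap between a flat churn rate and rising absolute churn is worth making concrete. With hypothetical numbers (not our actual subscriber counts), a base growing through acquisition at roughly the pace cancellations grow keeps the rate flat while the count climbs:

```python
# Hypothetical figures: the paying base grows through acquisition
# at roughly the pace cancellations grow, so the churn *rate*
# stays flat even as the absolute count climbs from 350 to 530.
months = [
    ("Nov", 5800, 350),  # (month, paying base, cancellations)
    ("Dec", 7300, 440),
    ("Jan", 8800, 530),
]

for month, base, cancels in months:
    print(f"{month}: {cancels} cancellations, rate {cancels / base:.1%}")
    # every month prints a rate of ~6.0%
```

This is why monitoring the rate alone can hide a problem: the moment acquisition slows, the same absolute churn shows up as a sharp spike in the rate.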
Inspecting our trial conversion rate data revealed two interesting patterns. We saw that during a sale, the cancellation rate of existing customers was far higher. And when a sale finished, our conversion rate rapidly dropped from 80% to 40% – this happened on numerous occasions last year.
We were also able to identify that similar offers affording users a longer trial period (six weeks) led to a much lower conversion rate than those with our regular trial period of 30 days.
We had now identified the causes of our churn, so we needed to plan, create, and implement a solution, and fast.
The solution for any start-up is to test quickly and learn quicker. Thankfully, we were able to do this rapidly and can attribute our success in reducing churn to how quickly we were able to learn and adapt our processes.
To do this, we went back to the basics.
This alone, however, would never have been enough, although it did allow us to reduce some of the impacts of the trends mentioned above.
The key to understanding a lot of our issues was platform saturation. When we saw our CPA balloon, we were able to conclude that we had reached the upper limit for the number of times an ad could be shown to our audience. The frequency with which our ads were being shown to individual users on Facebook and Twitter was detrimental to the point that we were receiving complaints from some of our followers.
Over time, we were able to return our acquisition and CPA to the levels they were previously. However, we had failed at breaking through to the next level.
We contacted many recently churned customers to discover the key factors leading to their churn.
We decided to action this feedback and test some new platforms.
Reddit, for example, was one such test. As non-users, we were going in blind. Knowing the platform was unique, we read as much about it as we could, including what tone, message, creative, and copy to use in ads. However, we were still shocked by the response.
No one had warned us that users hated ads and would troll the comments. It wasn’t until we had a conversation with our account manager that we learnt this lesson.
Our success metric had been defined by a target number of users signing up using the code ‘reddit’. However, even with a groundbreaking archaeological find – the largest cache of World War Two artefacts found since the war – and the right audience, r/history, we still weren’t able to see the results we sought.
The experience taught us that the key to success when testing new platforms is preparation.
To avoid failure, you must ensure that you not only know how a platform is used, but also how businesses advertise on it, and how its users react to such advertisements.
Instagram is another platform that we found to work well for us. However, our campaign was only successful due to the work that we had put in on other channels.
Having an influencer as a founder affords a certain level of clout, with Dan (Snow) being one of the biggest names in history and amassing over 280K followers on Twitter. It was on this platform that we were able to hone our social media messaging.
We were able to apply our knowledge from Twitter to Facebook, with small tweaks to ensure the creative matched the audience.
However, it was Instagram that proved the biggest challenge for us. It is a lot harder to generate advertising that looks native on Instagram than on other channels, and as a result, our initial CPA was 3-4x higher than we had expected.
This was a large setback initially, with us questioning the feasibility of advertising on the platform for our business. However, when we applied our learnings from other platforms, we saw them pay dividends on Instagram.
Facebook taught us which creative resonated best with our audience, and Twitter provided us with a basis for successful copy, which we were able to adapt and improve for the relevant platform.
The key to our success came from the diversification of audiences across our social media platforms.
We were able to constantly test specific history interests on Facebook efficiently, whilst finding which ads converted best for particular audiences.
We were then able to match this data with conversion rates across all ad sets and duplicate the ads we were confident would succeed into new ad groups.
The success of scaling our advertising on both Twitter and Instagram came from creating several Lookalike Audiences based on audiences with proven conversions.
For us to be successful, we had to ensure that we had clear goals and metrics when testing a new channel.
In conjunction with a defined success metric and a set timeframe, we were able to gain a much clearer insight into a channel’s scalability.
We also found that the best way for our company to adopt this was to ensure that it was in writing. That way, not only did we have a record of it, but we were also able to hold ourselves accountable to it when reviewing our results.
By diversifying our marketing output and tapping into different advertising channels and audiences, we were able to greatly decrease the frequency with which our audience saw our ads, and thus decrease churn.
Decreasing churn is a constant battle for all subscription businesses and we are no different.
By following the steps outlined in this article, we were able to set our business back on the right track. We identified what our key metrics were, identified trends in our data, and actively worked on implementing tactics to counteract these.
One of the key findings for us was that it is much easier to retain subscribers who are on an annual subscription than those on a monthly plan.
To compare: our annual retention rate is ~80% after 12 months, dropping only 10 percentage points to ~70% after 24 months, whereas our monthly retention rate drops to 50% after only six months.
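One way to see how big that gap is: back out an implied constant per-month churn from each retention figure. This is a rough geometric model, not how we actually report churn – it assumes cancellations are spread evenly, which understates annual plans, since annual subscribers can only cancel at renewal – but it makes the contrast vivid:

```python
def implied_monthly_churn(retained: float, months: int) -> float:
    """Constant per-month churn consistent with `retained` after `months`."""
    return 1 - retained ** (1 / months)

monthly_plan = implied_monthly_churn(0.50, 6)   # 50% retained after 6 months
annual_plan = implied_monthly_churn(0.70, 24)   # 70% retained after 24 months

print(f"monthly plan: {monthly_plan:.1%} implied churn per month")
print(f"annual plan:  {annual_plan:.1%} implied churn per month")
# Under this model, expected lifetime is 1 / churn, so annual
# subscribers stick around several times longer than monthly ones.
print(f"lifetime ratio: {monthly_plan / annual_plan:.1f}x")
```

Even with the caveats, an implied churn gap of several multiples makes the case for upgrading monthly users on its own.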
Therefore, we realised that it was essential to upgrade our monthly users onto an annual plan. The only question left was how.
To answer this question we revisited our existing customer feedback and set about obtaining some deeper insights.
It is crucial to gather feedback from customers at all stages of their lifecycle to understand any issues or pain points they may have. For us, this included obtaining feedback on sign-up, during onboarding, whilst active, after upgrading, and on exit.
Similar to data analysis, it’s essential to notice trends in customer feedback. We found that people were unsubscribing because they were failing to meet the same threshold that ultimately led our monthly subscribers to upgrade: watching at least 2 hours of content per month.
The biggest insight that we uncovered when tackling churn came from the data we were able to gather from users who had recently upgraded from a monthly to an annual subscription.
We found that, at a minimum, a user needed to watch at least 2 hours of content to convert their trial, whilst upgrades to an annual subscription came from constant engagement with our users and a properly nurtured relationship.
To address the first issue, we created new playlists that promoted our best content across a variety of historical eras – which we found to increase viewership by 30%.
We also found that 50% of our users viewed our programmes on a PC, so we began promoting our mobile & TV apps across our owned channels and saw incremental increases in total playviews as a result.
To build deeper relationships with our customers, we reconfigured our onboarding email chain. This included further personalisation and asking for initial feedback after the first fortnight.
We also began calling cohorts of users who signed up at the same time – both those who did and those who did not complete their trial.
To action this feedback, we began by reducing the trial period, which instantly led to an increase in our trial conversion rate and was linked to a decreased churn rate within the first three months.
Other issues that were highlighted related to the long-winded onboarding process and the lack of personalisation in our product.
We were able to take the lack of personalisation feedback and develop new strategies for how we bundled and presented the content on our website. We built more cohesive playlists, and niche collections for our audience.
Furthermore, we created a host of new playlists targeting school teachers and students alike, by matching our playlists with their curriculum. These changes were well received, as we began to see less negative feedback about a lack of variety and personalised subject matter.
We were able to decrease churn by attending to the major obstacles at each stage of the customer journey, and applying the findings from our subscription data.
Taking a holistic approach to fighting churn required us to interpret raw figures, acknowledge customer feedback, and make alterations to our marketing activities for the project to work.
Had we focused on a single aspect of this, or omitted any of these three areas, we would not have been able to succeed in the way that we have.
Some key lessons we learned from this: identify and monitor your key metrics, put internal and external context behind your data, gather customer feedback at every stage of the lifecycle, and test new channels with clear goals and defined success metrics.
History Hit uses Baremetrics to measure churn, LTV and other critical business metrics that help them retain more customers. Want to try it for yourself?