Don't trust your gut

I was in a meeting when I heard someone say "Trust your gut" regarding changes to the user interface of a large e-commerce website. I was stunned. In an age in which we have access to a seemingly endless amount of knowledge, insights, and data, why would anyone accept such uncertainty in making a decision about their product? That's when I realized – it's easy to trust your gut when you don't look at the data closely enough to know the outcome of past decisions based on such guesswork. I know, because I used to operate the same way.

Earlier in my career as an Internet entrepreneur, I made too many decisions based on intuition. I relied too heavily on qualitative feedback from users, past experience, or the opinions of various stakeholders to make product decisions. Put another way, I made a lot of educated guesses. Sometimes they were right, but more often than not my educated guesses resulted in bad product or business decisions that could have been avoided had I let data be my guide. I'm not too proud to admit that I made this mistake more than once as a first-time startup CEO, but one in particular always comes to mind.

While building CompleteSet in late 2015, I knew we had a mounting retention problem. For context, CompleteSet started as a knowledge base of collectibles that enabled fans to keep track of their collections without tedious data entry. Despite acquiring thousands of users in any given week, only a small percentage of them would stick around after 30 days. If you've ever worked on a consumer app, then you know that retention is crucial to its success. Most experts agree that an elite eight-week retention rate for a media product is around 25%, and CompleteSet consistently fell below that benchmark. In talking to users, we heard from many of them that there wasn't enough to do in the app. They could search and filter through our database of collectibles, keep track of their collection on their profile, and follow other users to see their activity. But to the few dozen users we spoke to, there was no reason to come back unless they bought something new and needed to update their collection to showcase the new addition.

How did we fix this? We proceeded to spend multiple sprints building a user commenting feature for our website and iOS app (we did not have an Android app at the time). The hypothesis, if you can call it one, was that allowing registered users to comment on each other's activity on the website and the app would improve retention (measured as monthly active users). I would compare our Activity feature, where comments lived, to a minified version of the Facebook News Feed but without the ability to upload media such as photos or videos. As described in "Hooked: How to Build Habit-Forming Products" by Nir Eyal (I highly recommend this book), the users' comments would create an external trigger in the form of an email or push notification to pull users back into the app. It seemed logical and felt like the answer to our retention problem. Like most features our talented engineering team built, it worked great. The problem was that few users interacted with comments after the feature was released.

As our CEO and product leader, I trusted my gut and assumed that increased social interaction between CompleteSet users would result in increased engagement and thus improved user retention. I was wrong. I was so focused on increasing retention that I couldn't see the forest for the trees. What I missed was something that users repeatedly told us in interviews – they returned to the app when they bought something for their collection. And buying is something every collector has in common, whether they collect Atari games or Zorro toys. At the time, we did not offer a marketplace. The ability for customers to easily buy and sell collectibles on CompleteSet in a manner differentiated from eBay could have not only made a meaningful impact on retention but also solidified a business model for our pre-revenue startup. The irony is that we had a functioning prototype of a marketplace in 2014, but we decided to kill the feature as we struggled to overcome the infamous 'chicken or the egg' dilemma that every marketplace grapples with in its early days. Looking back, I realize that was a grave mistake influenced by a lack of data, impatience, and frankly, inexperience.

I still reflect on these critical errors in judgment when I'm making product decisions today. Instead of trusting my instincts, experience has taught me that there are several things I should have done to improve user retention for CompleteSet – and none of them are based on gut feelings, opinions, or the like. In the sections that follow, I explain how I would solve the user retention problem I described previously with the outcome-focused approach I use today.

Measure What Matters

When we introduced commenting in 2015, CompleteSet measured its user retention as monthly active users (MAUs). We defined MAUs as registered users that logged in during the last 30 days. It was wise to track monthly active users closely, but our definition of the metric failed to consider what actually retained our users – and it wasn't logging in. In talking to users, we discovered that many returned when they had purchased something and wanted to update their collection. This was an internal trigger that motivated them to return.

Instead of simply focusing on getting users to log in once every 30 days, we should have taken our definition of monthly active users a step further: registered users that updated their collection during the last 30 days. This likely would have resulted in a smaller number of MAUs in the short term, but it would have been a more meaningful metric to focus on. As the Measure What Matters heading above implies, as the company's leader I should have properly implemented Objectives and Key Results (OKRs) to make the redefined MAU metric our North Star.

It may have looked something like this:

Objective: Provide a website and app that fans want to use to track and showcase their collectibles.

Key Result: Improve user retention from X% in Q2 2015 to Y% in Q3 2015 as measured by monthly active users (MAUs).

But I didn't know any better at the time because I hadn't read John Doerr's book "Measure What Matters" yet.
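To make the redefined metric concrete, here's a minimal sketch of how it could be computed. The event names, data shape, and numbers are hypothetical for illustration, not CompleteSet's actual schema:

```python
from datetime import datetime, timedelta

def monthly_active_users(events, as_of, event_type="collection_updated"):
    """Count distinct users who performed the given event in the
    30 days ending at `as_of`."""
    window_start = as_of - timedelta(days=30)
    return len({
        user_id
        for user_id, etype, ts in events
        if etype == event_type and window_start <= ts <= as_of
    })

# Compare the login-based definition with the collection-based one.
events = [
    ("alice", "login",              datetime(2015, 9, 20)),
    ("alice", "collection_updated", datetime(2015, 9, 21)),
    ("bob",   "login",              datetime(2015, 9, 25)),  # logged in, never updated
]
as_of = datetime(2015, 9, 30)
print(monthly_active_users(events, as_of, event_type="login"))                # -> 2
print(monthly_active_users(events, as_of, event_type="collection_updated"))  # -> 1
```

Counting "bob" as active because he logged in is exactly the trap we fell into; the stricter definition surfaces the smaller, truer number.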

Run A/B tests to validate retention improvement ideas

If there is one regret that I have about the outcome of CompleteSet, it's that we never ran a single A/B test. I assumed we didn't have a large enough user base to do A/B testing. I also assumed that A/B testing was more complicated than it actually is, and that it required expensive enterprise software like Optimizely that we couldn't afford. It's difficult for me to share that because of how wrong I was.

If I were facing a user retention problem today, I would rely on A/B testing to confirm which new features or changes to existing functionality would improve the metric. Before deciding to release something like user comments across multiple platforms, I would run A/B tests to quantifiably test the hypothesis that giving users the ability to comment would in fact improve retention.

For A/B testing to work at any organization, though, teams must embrace an experimental mindset. You must also accept that many of your hypotheses will be disproven. That can be a tough pill to swallow. For the first of likely multiple controlled experiments focused on improving user retention, my hypothesis would follow the common format 'If [cause], then [effect], because [rationale]':

If we enable users to create comments and mention each other, then we'll increase monthly active users (MAUs) by X%, because commenting will drive increased engagement through notifications.

With a tool like Google Optimize, complexity and cost would no longer be an excuse for me to not run A/B tests before making product decisions. Google offers a free version, it integrates with Google Analytics (of course), and it's fairly straightforward to configure. If you're not A/B testing, you're probably just guessing, whether you like to admit it or not.
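Even without a dedicated tool, the math behind evaluating such an experiment is simple. Below is a sketch of how I'd judge the commenting hypothesis today: a standard two-proportion z-test comparing 30-day retention between a control group and a treatment group with comments enabled. The counts are illustrative, not real CompleteSet data:

```python
from math import sqrt, erf

def two_proportion_z_test(retained_a, total_a, retained_b, total_b):
    """Return (uplift, two-sided p-value) comparing retention in B vs. A."""
    p_a, p_b = retained_a / total_a, retained_b / total_b
    p_pool = (retained_a + retained_b) / (total_a + total_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / total_a + 1 / total_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard normal CDF: Phi(z) = 0.5 * (1 + erf(z / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_b - p_a, p_value

# Illustrative counts: 30-day retention with and without commenting enabled.
uplift, p = two_proportion_z_test(retained_a=180, total_a=1000,   # control
                                  retained_b=215, total_b=1000)   # treatment
print(f"uplift: {uplift:+.1%}, p-value: {p:.3f}")  # uplift: +3.5%, p-value: ~0.049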

Listen to data, not opinions

The introduction of a commenting feature as a solution to our user retention problem was not the result of misunderstanding qualitative feedback alone. It was also driven by the opinions of stakeholders around me. Investors, a potential acquirer, and even some of our employees wanted to introduce social features to the website and apps. They all believed that CompleteSet was missing a sense of community and that this was the root of our retention problem.

The thing is, none of them had any data to support these claims. Likewise, I had no data to prove otherwise. Rather than allowing the opinions of others or my own bias to dictate our product vision, I should have dug deeper to find quantifiable evidence that would justify commenting as the best method to improve user retention. We had Google Analytics configured and a database that stored users' behaviors with timestamps. We had the data; we just didn't look hard enough.
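As one example of looking harder, a simple cohort analysis over that behavioral data could have shown which first-week actions actually predicted long-term retention. The sketch below assumes a hypothetical table of timestamped behaviors and signup dates; the action names and numbers are made up for illustration:

```python
from collections import defaultdict
from datetime import datetime, timedelta

def retention_by_first_week_action(signups, behaviors, horizon_days=56):
    """For each action a user took in their first week, report the share of
    those users who were still active `horizon_days` after signing up."""
    did_action = defaultdict(set)  # action -> users who did it in week one
    retained = set()               # users with any activity at the horizon
    for user_id, action, ts in behaviors:
        signup = signups[user_id]
        if ts <= signup + timedelta(days=7):
            did_action[action].add(user_id)
        if ts >= signup + timedelta(days=horizon_days):
            retained.add(user_id)
    return {action: len(users & retained) / len(users)
            for action, users in did_action.items()}

signups = {"alice": datetime(2015, 6, 1), "bob": datetime(2015, 6, 1)}
behaviors = [
    ("alice", "collection_updated", datetime(2015, 6, 3)),
    ("alice", "login",              datetime(2015, 8, 1)),  # still active at 8 weeks
    ("bob",   "comment_created",    datetime(2015, 6, 2)),
]
print(retention_by_first_week_action(signups, behaviors))
# {'collection_updated': 1.0, 'comment_created': 0.0}
```

Run over real data, a table like this would have made the case for the marketplace far louder than any stakeholder opinion.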

To be clear, I fault no one but myself for these missteps. As the company's CEO and product leader, I should have demanded data be the deciding factor.


You might be thinking 'This all sounds like a lot of work.' And it is. But it's more painful to watch your product or company fail than it is to invest the time and effort required to track well-defined metrics, test as often as possible, and listen to data instead of opinions. My hope is that the next time you're tempted to "trust your gut" when making a product decision – even for seemingly inconsequential changes to your user interface – you'll consider the risks you're taking by trusting instincts over facts. Don't build a feature no one asked for like I did. Instead, I encourage you to take a more outcome-focused approach.

If you found this essay helpful, please consider sharing and subscribe to receive my next post via e-mail. Thanks for reading!