Find out what’s better than an aha moment
The aha moment is a mythic point in a product’s life when users stop being wary and start becoming addicted.
In Facebook’s salad days, the company “figured out” that if they could compel users to make seven friends in 10 days, those users were likely to be retained in the long term. It’s not clear whether the story is apocryphal, but enough product people believe it to give one pause.
As a company and as a blog, we’ve long had a complicated relationship with aha moments. On the one hand, the idea gets product managers excited about finding the inflection point for user experience. On the other, it’s a weird preoccupation for product managers, who, by definition, should be about the constant grind, not one bountiful discovery.
Ultimately, the aha moment is an attempt to improve one’s product, or to discover how to align it more closely with users’ interests. But it’s not a very rigorous way of getting there. And yet, irony of ironies, we built an aha moment generator…sort of. With a little more rigor baked in.
This is the story of Signal, a product that can crunch numbers faster than a human can, but which only succeeds because its development revolved around a very human question. And that question is the one our users persisted in asking us.
Something unpredictable
A year and a half ago, our machine learning team built Predict, a product that predicts how likely users are to complete certain actions. Our notification features were popular, so we figured that giving customers a machine learning-based way to identify their best users and re-engage them would be a success.
Basically, Predict assigns each user an A, B, C, or D grade based on the actions they’ve taken before and how strongly those actions correlate with goal behaviors. If you want someone to make an in-app purchase, you might be inclined to send them a notification. And with Predict, you know who’s an A and who’s a D.
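To make the idea concrete, here’s a minimal sketch of how a score-to-grade mapping like this could work. The thresholds, names, and scores are hypothetical and purely illustrative; this is not Mixpanel’s actual model or API.

```python
# Hypothetical sketch of the grading idea, not Mixpanel's actual model or API.
# Assumes some classifier has already produced a probability that each user
# will complete the goal action; the thresholds below are made up.

def grade_user(conversion_probability: float) -> str:
    """Map a predicted probability of completing the goal action to a grade."""
    if conversion_probability >= 0.75:
        return "A"
    if conversion_probability >= 0.50:
        return "B"
    if conversion_probability >= 0.25:
        return "C"
    return "D"

# Example: the probabilities could come from any model trained on past actions.
predicted = {"Bob": 0.82, "Jane": 0.31}
print({user: grade_user(p) for user, p in predicted.items()})
# {'Bob': 'A', 'Jane': 'C'}
```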
As such, it’s a powerful tool, basing its predictions on insights from trillions of data points. And by creating a machine learning product, we were doubling down on Mixpanel’s commitment to helping the world learn from its data in more innovative ways than ever.
But some users were frustrated with Predict. They didn’t know what to tie the letter grades to. Why was Bob an “A,” more likely to perform a certain action than Jane, a “C”? Like a persistent child, the question always came back to: But why?
Marshall Louis Reaves, a machine learning engineer at Mixpanel, summed it up: “The biggest piece of feedback was that people wanted to know not who was likely to convert, but why. And the question behind that question: How can I make my product better?”
Predict helped Mixpanel customers find their most valuable users and get them to take action, but ultimately, it didn’t address the question at the heart of their craft. However misguided the aha moment might be, that question is its underlying sentiment.
So, when embarking on Mixpanel’s newest automated insights product, the machine learning team did what they do best. They leveraged all kinds of cool math to do the heavy lifting for product managers and save data scientists from yet another ad hoc request. But they also repeatedly returned to the product manager’s central concern, the one Marshall brought up: How do I make my product better?
In Signal, Marshall believes Mixpanel has answered that question with both power and purpose.
Getting your back
But first, the cool math.
Predict wasn’t the product some users wanted, but it was a start. It’s engineered to understand which actions make a product successful, even if it doesn’t tell you what those actions are. Because of that, the machine learning team thought they might have already done some of the homework for what was to become Signal.
“We initially thought what we might do is pull back the curtain on the weights that we engineer to actually build the Predict model,” Marshall said. “Among the biggest problems with that is the model is complicated and ugly and hard to interpret. It often presents data in ways which are difficult to act upon.”
As a map, Predict’s inner workings were detailed but hard to read. Here, the machine learning team was forced to return to the question that inspired Signal: How can I make my product better?
“We went back to the drawing board to think,” Marshall said. “What are the types of hypothesis tests or causal inference we can do to try to reveal results that are highly actionable and easily interpreted? We didn’t want you to have to be a data scientist to understand what was going on.”
The less analysis product people had to do, the more building and optimizing they could accomplish. Or, as Marshall put it, “The actual best product we can build for you is the Mixpanel machine learning team looking over your shoulder and telling you what the results mean.”
The team needed a formula that was easy to supply inputs for, but which also gave powerful outputs. The more they dug into R&D, the more they found themselves returning to a certain querying format again and again: How does performing X action, at least Y times, within Z days of doing action A correlate with goal B?
This formula felt right. It was easy to understand. It was designed to specifically measure engagement.
“This format is popular for people to ask these kind of questions because it has an engagement component,” Marshall said. “At least Y times within Z days. It also has this event focus, which is action X. It’s been around the product analytics space for quite a long time.”
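As a rough illustration of what that format asks computationally, here’s a minimal sketch in Python. The data shapes (a list of event-name and timestamp pairs per user) and helper names are assumptions made for the example; Signal’s actual implementation does considerably more statistical work than a single correlation.

```python
# Hypothetical sketch of the query format, not Signal's implementation:
# "How does performing action X at least Y times within Z days of doing
#  action A correlate with goal B?"
from datetime import timedelta

def count_within_window(events, action_a, action_x, window_days):
    """Count occurrences of action_x within window_days of the user's first
    action_a. events is a list of (event_name, datetime) pairs."""
    starts = [t for name, t in events if name == action_a]
    if not starts:
        return 0
    start = min(starts)
    end = start + timedelta(days=window_days)
    return sum(1 for name, t in events if name == action_x and start <= t <= end)

def correlation(xs, ys):
    """Pearson correlation between two equal-length lists of 0/1 flags."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = (sum((x - mx) ** 2 for x in xs) / n) ** 0.5
    sy = (sum((y - my) ** 2 for y in ys) / n) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

def score_hypothesis(users, action_a, action_x, min_count, window_days):
    """Correlation between 'did X at least min_count times within window_days
    of A' and reaching goal B. users maps user_id -> (events, reached_goal_b)."""
    flags = [1 if count_within_window(ev, action_a, action_x, window_days) >= min_count else 0
             for ev, _goal in users.values()]
    goals = [1 if goal else 0 for _ev, goal in users.values()]
    return correlation(flags, goals)
```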
There was only one problem. Were they not encouraging aha moment hunting?
More than momentary
The problem with creating an aha moment generator is that it optimizes the wrong behaviors. It doesn’t ask, How can I make my product better? It asks, How can I magically acquire or keep a bunch of users?
“There’s a lot of controversy in that world,” Marshall said. “Do aha moments exist, do they not exist? It turns out there might not be a single critical action that’s essential for your company’s success and which you don’t yet know about.”
There might be something better, though. In building Signal, the machine learning team realized that their product wasn’t about using math to identify a single magical moment. It was about discovering several moments that influence a user’s lifecycle.
While prototyping Signal, the team had been digging into Mixpanel’s own data. They wanted to know how certain actions affected a Mixpanel user’s retention on the mobile app, such as adding a mobile dashboard or receiving an anomaly notification.
“We discovered something very useful in running an early version of Signal against our mobile app data,” Marshall said. “We realized there might be nuanced detail to the frequency and timing components of actions. And maybe it isn’t an aha moment, but it’s still important to the user lifecycle.”
For instance, the team found that viewing the login page and then logging off was indicative of a Mixpanel mobile user’s long-term retention. That’s a basic insight Signal can tell you: If the mobile app’s goal is long-term retention, Signal says viewing the login page and logging off are highly correlated with success.
But that’s also kind of obvious.
“If you only look at one specific value, that’s not very interesting,” Marshall said. “Say, what’s the maximum correlation between being retained and this particular action? You’ll find it’s at greater than one time within 14 days.”
Logging in a couple times is a good indicator of interest, sure. But what about logging in too many times? Turns out, that’s a sign of pain.
While it was essential to view the mobile app’s login page, it was just as crucial not to view it too many times. So, instead of fixing the formula at “at least Y times,” Signal allows more flexibility: no more than Y times, exactly Y times, and so on.
“We started to notice when we went beyond performing the typical calculations that frequency was really important in some cases,” Marshall said. “There are these unique examples, where if you do something too many times too soon, it’s bad or correlated with low retention, but that action itself is important to perform in the course of the user lifecycle.”
Put another way, when it comes to driving user actions, there actually can be too much of a good thing. This is why people building product experiences need to tease apart the why from a lifecycle perspective, not a momentary one.
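Here’s a hedged sketch of that idea: instead of fixing the condition at “at least Y times,” sweep both the comparator and the frequency Y and rank each condition by its correlation with the goal. It reuses the correlation() helper from the earlier sketch, and the inputs (per-user counts and retention flags) are assumed to be precomputed; it illustrates the shape of the analysis, not Signal’s actual code.

```python
# Hypothetical sketch: sweep frequency conditions instead of fixing
# "at least Y times," so that "too much of a good thing" shows up.
# Reuses correlation() from the earlier sketch. counts[u] is how many times
# user u performed the action within the window; retained[u] is 0 or 1.

COMPARATORS = {
    "at least":     lambda hits, y: hits >= y,
    "exactly":      lambda hits, y: hits == y,
    "no more than": lambda hits, y: hits <= y,
}

def sweep_frequency(counts, retained, max_y=10):
    """Rank (comparator, Y) conditions by their correlation with retention,
    making a non-monotonic frequency effect visible."""
    users = list(counts)
    goals = [retained[u] for u in users]
    results = []
    for label, cmp in COMPARATORS.items():
        for y in range(1, max_y + 1):
            flags = [1 if cmp(counts[u], y) else 0 for u in users]
            results.append((label, y, correlation(flags, goals)))
    return sorted(results, key=lambda r: r[2], reverse=True)
```

If logging in once or twice correlates with retention but logging in ten times correlates with churn, a sweep like this surfaces both facts instead of collapsing them into a single “aha” threshold.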
One right tool for the job
Any tech company with money can go out there and build a machine learning team with mathematical rigor. We like to think we built ours with purpose, too. With Predict, Mixpanel built a powerful product, but when it fell short, we learned something about purpose.
In developing Signal, the machine learning team returned time and again to every Mixpanel customer’s most fundamental question: How do I make my product better? If Signal wasn’t answering that, it needed finesse.
Over time, the team was able to create something that was extremely usable and actionable. They were able to avoid the fallacies of an aha moment and inspire product managers to work with rigor, but also faster than ever.
“I have a favorite quote from No Country For Old Men,” Marshall said. “Anton Chigurh says, ‘You pick the one right tool for the job.’ In both the case of Signal and Predict, I think each is the one right tool for the job.”
Building any product is a crapshoot. Advanced math and machine learning can 10x a product’s power, but they can also distract from its real purpose with a lot of flexed muscle and very little lift. Having learned this the hard way, our machine learning team approached Signal by continually making sure it was the one right tool for the job.
Here’s the beauty of Signal: Our journey to build the right product for you will significantly shorten your journey to build the right product for your users. Using Signal, teams can quickly see where the best opportunities for improvement are in their product and act.
“Usually product managers have strong or closely held assumptions about the way their products work, about the way that users engage with their products,” Marshall said. “We created Signal so they could quickly evaluate those beliefs.”