Are you applying biases to your product work? Here’s how to self-evaluate
Data is complicated, and interpreting it correctly requires skill, an open mind, and—most importantly—a large amount of self-awareness.
Why self-awareness? Because the only way to combat bias, a big enemy of data work, is to be consciously aware of the attachments and attitudes that lead to erroneous conclusions.
Here are five questions you can ask yourself that will materially increase your ability to mitigate the effects of your own biases in data-informed product development.
Am I attached to particular timelines or results?
When you’re attached to something, you hold the belief that you need that thing in order to be OK. In product development, attachment often takes the form of fixating on specific results within specific timelines in order to keep your product or business afloat. The story we tell ourselves is that if we don’t hit these targets or timelines, our very right to exist as a product or company is under threat, so we must hit them at all costs.
Sometimes this story is true. There are, indeed, consequences to not meeting specific deadlines or reaching certain goals, and sometimes those consequences are dire.
The problem is when we allow impending consequences to warp the way we collect and/or interpret our data, something that happens, in my experience, with alarming frequency. Netflix, for example, has changed its definition of a video view multiple times in order to reframe its numbers. Watching 70% of a video used to count as a “view”; in 2019, the cut-off dropped to just two minutes. This made the numbers look far more appealing to investors, but it was also glaringly nonsensical as a measure of meaningful growth and success for the platform. Fortunately, Netflix course-corrected to something more useful in 2021, but that kind of outcome doesn’t always come for companies and teams under less public scrutiny.
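To see how much the definition alone can move a headline metric, here’s a minimal sketch using hypothetical watch-session data (the numbers are invented for illustration, not Netflix’s), comparing a 70%-completion rule to a two-minute rule:

```python
# Hypothetical watch sessions: (seconds_watched, video_length_seconds)
sessions = [(90, 3000), (150, 3000), (2400, 3000), (600, 1200), (130, 5400)]

# Definition 1: a "view" means watching at least 70% of the video
views_70pct = sum(1 for watched, total in sessions if watched >= 0.7 * total)

# Definition 2: a "view" means watching at least 2 minutes
views_2min = sum(1 for watched, total in sessions if watched >= 120)

print(f"70%-completion views: {views_70pct}")  # 1
print(f"Two-minute views:     {views_2min}")   # 4
```

The same underlying behavior yields a 4x difference in the reported number, which is exactly why a redefinition like this deserves scrutiny before it reaches investors.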
How to check yourself
If you’re under a ton of pressure to perform, misrepresenting reality to yourself will only hurt you and your business. The better option is to lean more deeply into the truth. Surrender to what is, and see if there are other levers you can pull in order to keep your business, product, or team operational. Sure, maybe your engagement metric as currently defined isn’t what you hoped, but perhaps the data reveals a truthful story that justifies continued investment just as powerfully as a rewritten history would.
For example, perhaps there are high-engagement trends in a highly active subset of your users that would justify a pivot to prioritizing their needs. Perhaps you’ve done an A/B test and discovered that investing in more of a specific content type would definitively achieve your engagement goals if the whole team focused its remaining energy on that effort. Perhaps the engagement metric you initially chose to prioritize is actually less meaningful than an entirely different metric that’s performing much better. There are any number of creative solutions and meaningful narratives to choose from that don’t involve intellectual dishonesty.
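One way to keep yourself honest about a result like that A/B test is to run a standard significance check before declaring victory. Here’s a minimal sketch using a two-proportion z-test with hypothetical engagement counts (the function name and numbers are illustrative, not from any specific tool):

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test: is variant B's rate meaningfully different from A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via the error function)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical result: 1,200 of 10,000 users engaged with the control,
# 1,340 of 10,000 engaged with the new content type.
z, p = two_proportion_z_test(1200, 10_000, 1340, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value tells you the lift is unlikely to be noise; it does not tell you the lift is worth the investment. That second judgment is still yours to make.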
Am I clinging to a worldview or existing beliefs?
The next type of attachment is just as common as that to timelines and results: it’s the inability to manage the influences of your own worldview or existing beliefs on the framing of your data (otherwise known as confirmation bias).
For example, you may have a strength training app and believe your users are all fundamentally motivated to gain muscle mass. So you not only define your features toward that end, but base all the questions you ask on that assumption. You look for evidence of muscle gain in upward trends in how much weight users can lift, in body fat percentage reported by a smart scale, and in other metrics that may indicate increases in muscle mass. Meanwhile, you ignore metrics that may point to other opportunities or successes, simply because it hasn’t occurred to you to pay attention to them.
And unfortunately, the negative impact can be dramatic. I’ve worked with apps that discovered as much as 40% of their users had a more important reason for using the product than the one assumed when the app was first created, and I’ve seen product owners willfully deny that despite compelling data to the contrary.
This denial, as far as I can tell, comes from a few places: a belief that they “know their users,” a fixation on not changing the current vision, a fear of venturing into unknown territory, and a whole host of other ideas that make the product development experience feel riskier.
How to check yourself
The key to mitigating the impact of this attachment is, first and foremost, surrounding yourself with contributors from diverse backgrounds who offer diverse perspectives—even better if some of those people are ones you trust to actively challenge your blind spots. It’s very hard to notice an attachment like this on our own, and we often need a third party who’s not afraid to point it out repeatedly before we are willing to let it sink in.
On top of that, occasionally ask yourself whether a challenge to your existing beliefs makes you angry. If you feel upset or outraged or disgusted by the mere suggestion of an alternative perspective, go deeper and ask yourself why. There’s likely an attachment lurking that’s waiting to be found out.
Am I fortifying myself from blame?
While data can be your greatest ally for discovering the truth, it can also be used in a most pernicious way: to avoid taking responsibility for your product decisions.
This happens when you misunderstand what data really is. In this article, I explain that data never provides black-and-white answers: any amount of data requires interpretation, and that’s where the real magic happens. The point is that data never says anything on its own or anything definitive. A thoughtful human is always required to evaluate what to do in response to the data.
But I’ve seen, time and time again, teams treat data like it’s some kind of reliable objective measure in itself and then hide behind the data when something goes wrong. The pattern involves collecting data, deciding it argues for a certain set of actions, taking those actions, watching them backfire, and then avoiding responsibility by declaring “I just followed the data.”
The problem, of course, is not that they tried to follow the data. It’s that they treat their initial interpretation as though it were as objective as the data itself, when in reality it’s an interpretation that needs updating, since it led to failure. Insisting the interpretation is something it’s not shields the interpreter from perceived blame while making re-evaluation seem unnecessary, and that typically leads to further failure.
How to check yourself
Unfortunately, this pattern of using data as a scapegoat manifests most in teams where experimentation is discouraged by a culture of shame and harsh consequences for mistakes. The best way to avoid this attachment is to frequently reaffirm to your team that experimentation is ok, that measured risk-taking is ok, and that there is no need to make data into a layer of protection against blame.
Am I inappropriately evaluating cost?
Of all the attachments in product development, attachments around cost are probably the most damaging to product success because they’re often framed as business-savvy and, therefore, typically go unchallenged for entire product lifecycles.
Before we get into this one, though, I want to be clear: Strategically managing cost is a critical aspect of business success. In no way am I suggesting that teams operate in a fantasy land where they pretend like funds are endless.
What I’m arguing against is very specific: an unhealthy preoccupation with cost that leads to maladaptive decisions and cultures of scarcity that make creative exploration (and, therefore, stepwise breakthroughs) impossible.
This can manifest as a profound orientation around scarcity and a habit of looking for threats to the resources currently available, even when those threats don’t exist. For instance, I’ve worked on teams where every micro-cost needed to be justified, contractors were hired at the lowest possible rates, and abrupt product shifts were made away from efforts whenever the data didn’t “prove” their value.
The problem was that the data never meaningfully pointed one way or the other, and “proving value” is not as logical a standard as it’s often presented to be. Rather, an intellectually dishonest story about the data was used to support an already ingrained belief that if we weren’t functioning with our heads barely above water, we were doing something dangerous. And this, ironically, suffocates creativity and leads to mediocrity, and more often to outright failure.
That said, hyperbolic austerity is just an extreme example of the way costs can warp our use of data. Sometimes the data suggests a path that requires much more effort or time but delivers even better results, and product owners may underplay those data in favor of a less arduous path.
How to check yourself
The point is that cost is a very real constraint that must be managed carefully—but also artfully and with a clear mind. The way to do this is to evaluate the degree to which decisions are made as a reaction to a feeling of threat as opposed to a well-justified evaluation of risk. This starts by acknowledging that risk is unavoidable and that failure is always on the table. Sitting with that reality, and emotionally allowing it to be, is counterintuitively the best way to free yourself from the fetters of attachment that cause you to deny reality on the basis of emotional discomfort. You won’t lose your drive or thirst for success. You’ll simply see the path to success more clearly and be more resilient if things don’t go as planned.
About Joseph Pacheco
Joseph is the founder of App Boss, a knowledge source for idea people (with little or no tech background) to turn their apps into viable businesses. He’s been developing apps for almost as long as the App Store has existed—wearing every hat from full-time engineer to product manager, UX designer, founder, content creator, and technical co-founder. He’s also given technical interviews to 1,400 software engineers who have gone on to accept roles at Apple, Dropbox, Yelp, and other major Bay Area firms.
Gain insights into how best to convert, engage, and retain your users with Mixpanel’s powerful product analytics. Try it free.