Max Levchin on Trading Privacy for Value and the Quantified Self

Last edited: Feb 25, 2022
Amelia Salyers

Max Levchin has a longstanding reputation in the Valley for being data-obsessed, from his entrepreneurial ventures to his intense focus on athletic hobbies like cycling. At one point during our interviews, I asked Max where his mania for data comes from.

“I’ve just always been that way,” he said. “I guess I’m just naturally very competitive. I tend to actively look for things where I get to find out just how good am I: am I better? Can I do better?”

Indeed, he’s been obsessed with self-improvement for most of his life, and that self-improvement comes from careful attention to the data: “Improvement can only be done by measurement and refining. Basically, that is how I tend to improve my cycling time or whatever it is that I am trying to do. As you get there, you naturally start tracking everything.”

“Tracking everything” about yourself might have sounded crazy or obsessive a few years ago, but now it feels as normal as the Fitbit on your dad’s wrist. From personal fitness to finance, health, and more, the so-called “quantified self” is moving from a few Silicon Valley nerds’ DIY project to a booming industry that has giants like Apple, Google, and others involved. Instead of spreadsheets or a bunch of hacks, simple apps and wearables are making it easy for anyone to consume large quantities of data about themselves.

And yet, the larger the quantities of data we collect, the larger the threat to personal privacy seems to grow. How do we as consumers access a wealth of potentially life-changing information about ourselves without giving away secrets about who we are? What really matters when it comes to privacy?

Given Max’s immersion in all things data for most of his life, he’s given a lot of thought to this particular issue. While there is plenty of nuance, in his mind, it will all likely boil down to some kind of tradeoff based on the simple notion of “value for privacy”: how much privacy are we willing to give up to unlock the potentially enormous value in our data? Or, put another way, how much insight into ourselves and our world are we willing to forego?


Setting (and obsessively plotting) the curve

To understand Max’s relationship to this issue, you first have to understand his relationship to data and how he uses it in his own life, which, as we’ve noted, is obsessively.

In discussing his accretion of data on his cycling habit, Max says, “One thing leads to the other. You start asking, what shouldn’t I be tracking? What if one day I want to improve something, even if I’m not in a mood to improve it now? Might as well track that, too. There’s kind of an attitude of glut.”

Max’s brand of competitiveness reminds me of close friends who do endurance races like Ironman triathlons or ultra-marathons. These athletes are certainly highly competitive with each other, but oftentimes their biggest motivation is to beat their own times and improve their own splits. If you look at Max’s life and career, you begin to see all the ways in which he’s always tried to beat his own times, incrementally improving his splits until he leaves the competition in the dust.

Born in 1975 in Soviet-controlled Kiev, Ukraine, Max emigrated with his family to Chicago when he was 16. There, he discovered cycling and computer science when he enrolled at the University of Illinois at Urbana-Champaign. This is where his obsession with data began, as well as his lifelong desire to found companies.

In the summer of 1995, he and a couple of college friends, Luke Nosek and Scott Banister, founded SponsorNet New Media, an ads business. Though the company didn’t go very far, it gave Max and his co-founders a taste of the entrepreneurial life. With just an idea, some code, data, and a lot of hard work, they could build a company from scratch.

Soon enough, Max heard the siren song of Silicon Valley, and after graduating in 1997, he followed the American mandate to start over again and go west, young man. On the verdant fringes of Stanford University in Palo Alto, California, he met another data-driven striver, Peter Thiel. Before long, the two joined forces and started the company that eventually became PayPal.

Max’s involvement in PayPal is notable for many reasons (such as how his experience with fraud detection set him up to take on consumer finance today), but perhaps one of the most underrated was his first-hand experience with data security and privacy for consumers. PayPal collected massive amounts of sensitive consumer data, and Max and his team had to constantly balance using that data to build better algorithms with preserving privacy.

Fifteen years later, though, our appetite for privacy over access to data is perhaps shifting. For example, PayPal’s subsidiary Venmo, the payments app more popular with younger consumers, has been very successful largely by allowing its users to publicly post what they’re paying their friends for, often with emojis and comments from others — it’s as if an older generation had started publishing their checkbooks, minus the dollar amounts.

As many of the things Max has been doing in his businesses and on his own — tracking every aspect of his cycling, experimenting with nutrition — go mainstream and grow beyond fitness and finance, the debate over privacy and the quantified self will only grow more heated.


The revolution of the self will be quantified

One of the reasons for the current explosion in quantified self apps, businesses, and services is that the technology itself has become consumer-friendly and easier to connect.

“For a long time, the availability, collection, processing, and organization of data was hard,” Max says. “You couldn’t really do any of the necessary work without being at some level an expert on data. Not quite a programmer, but somewhere on the spectrum toward that.”

In fact, in devices like that Fitbit, the hard part wasn’t the gyroscope measuring movements; it was connecting and processing all of the data. The gyroscope feeds a machine learning model that sorts through a ton of data to find and map steps, which then uses Bluetooth to connect back to an app, which then gives you a percentage against a goal. In devices like a Fitbit, Max says, “There’s a tremendous amount of very low level data gathering, which is completely simplified down to ‘Did you take a thousand steps today?’”
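To make that pipeline concrete, here is a toy sketch of the idea: raw motion samples go in, and a single friendly number comes out. This is not Fitbit’s actual algorithm (which is proprietary and far more sophisticated); it is just a minimal threshold-crossing step counter under assumed sample values.

```python
def count_steps(accel_magnitudes, threshold=1.2):
    """Count steps as rising edges where acceleration (in g) crosses the threshold."""
    steps = 0
    above = False
    for sample in accel_magnitudes:
        if sample > threshold and not above:
            steps += 1      # rising edge: treat each spike as one footfall
            above = True
        elif sample <= threshold:
            above = False
    return steps

def goal_progress(steps, goal=10000):
    """Reduce all the low-level data down to the one number users see."""
    return round(100 * steps / goal, 1)

# Simulated accelerometer magnitudes (in g): each spike above 1.2 g
# represents a footfall in this toy model.
samples = [1.0, 1.3, 1.0, 0.9, 1.4, 1.1, 1.0, 1.5, 1.0]
steps = count_steps(samples)
print(steps)                 # 3
print(goal_progress(5000))   # 50.0
```

Real trackers do far more (filtering noise, distinguishing walking from driving, learning per-user gait), but the shape is the same: a mountain of low-level samples compressed into “Did you take a thousand steps today?”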

He continues, “As this stuff gets monetized, or the interface on top becomes really user-friendly and really clean, then I’m sure it is going to become more and more commonplace. Among other things, people aren’t going to be thinking so much ‘I’m a data junkie’, but instead they will say ‘I’m a personal improvement junkie’ or ‘I’m a fitness fanatic.’”

Max is betting big in this area. His umbrella organization, HVF Labs (which stands for “Hard. Valuable. Fun.”), is focused on “data-as-commodity” products. In addition to Affirm, its other big success to date has been Glow, a suite of fertility apps that use data science to help women track everything from their periods to pregnancy and even the first year of their baby’s life. In fact, Glow reports that it’s already used by 3 million women, who have attributed 200,000 pregnancies to the apps’ data-driven approach to fertility.

Yet, all of this empowerment and excitement around using data in our everyday lives has a catch: the risk to privacy. The more data we hand over about ourselves to companies, the more at risk our personal lives are to unscrupulous groups, both outside and inside of those companies.

“There is no clear right and plenty of well-known wrongs,” Max says of privacy concerns. “I think what is ultimately going to matter is the notion of value for data, and for a lot of these large systems, it will be value for privacy, if you will.”

He draws the analogy of a possible “smart mattress” company: “Soon enough, my mattress will know how well I sleep, and it will also know exactly what my bedroom life looks like. The company behind it, if we’re not careful, could blackmail me. I think that there will undoubtedly be a semi-scandalous situation where people will find out that their data was used against them in a way they did not anticipate, and probably people will start gauging what the trade-offs are.” (In fact, it’s already happening in the world of academia: a group of Danish researchers recently came under fire for releasing data on 70,000 users of the dating site OkCupid.)

But, while Max acknowledges that this pessimistic view of the risks to privacy has validity, he believes more strongly in another tenet of humanity: our adaptability. “In most situations, though, we typically, as humans, are very adaptable. We adjust our behavior, we adjust our expectations as soon as we know what is happening around us. We learn very quickly.” And we adjust most quickly when we get more concrete value than abstract risk.

“People are sort of realizing, here’s the scary version of this story, but there is also a lot of value. My mattress story can be sold in a very scary way, but the flip side is, if I’m having a hard time sleeping, or if I’ve got sleep apnea, I don’t know what to do about it or when it happens. Having a bunch of sensors in my mattress and something that actually nudges me a little bit to stop snoring, that could be a lifesaver. Then, it’s positive to me.” It gets back to that idea of “value for privacy”.

“Nobody wants to give up their privacy, even incrementally, for vanity purposes, but if you’re giving it up and realize that you’re giving up 10 pounds of unwanted fat, or regaining three hours of sleep, then I think people will just figure out how to cope with it and move on.”

Just as in his PayPal days, the real issue Max sees is not in data collection but in data security: “Everything that can be collected will eventually be collected. Two things, then, are essential. One, data that is collected needs to be secured, and there need to be some assurances around how secure it is. And two, by far the most important thing is there needs to be transparency in what is being collected [and] from whom, so that a person can find out what is known about them, what has been recorded about them, even protectively.”

“Everything that can be collected will be collected.” - Max Levchin, CEO, Affirm

In parallel with innovations that expose and connect new streams of data, companies and consumers need to push for innovations in data security and documentation on how data gets used by businesses.

As Max sees it, “[Consumers] don’t just have the right, they have the obligation to know what is happening to their privacy.”

A Model of Trust  

Ever the entrepreneur, Max also foresees new industries springing up to help consumers manage their data online. “I think there will be opportunities for companies to build things like personal vaults. As we get to infinite bandwidth or near-zero latency, there will also be opportunities to get a magical set-up where only systems I authorize, for a time, have access to my data in my personal vault.”

Still, the biggest factor here is how companies build and maintain trust with their customers.  It’s one of the reasons he started Affirm: after the 2008 financial crisis, many Millennials and others lost trust in the big banks, and he saw an opportunity to create a new kind of bank that tried to cultivate trust, not fear, with its customers.

He sees that trust modeled already in our everyday lives. To return to the smart mattress example, “At some point, I don’t really care if the company knows what happens in my bedroom. My doctor knows what happens in my bedroom, with great detail, more than any of my friends, and yet, I’m very comfortable with it. I know they are bound by the Hippocratic oath or just a sense of decency not to reveal my secrets.” The list extends to lawyers, accountants, therapists, and so on.

At the risk of oversimplifying something incredibly complicated, he’s pointing out that we already engage in many business interactions and relationships built on trust. “There’s plenty of precedent for human relationships where trust is essential. It’s more about understanding what you’ve revealed than what it can be used for, how it can be used rather than the actual act of revealing,” Max says.

While the worst-case stories of trust misplaced regularly make the rounds on Reddit and fuel our favorite TV dramas, our lives are still filled with examples of how we trust people and companies with our most important information and nothing bad happens. Indeed, our society wouldn’t be able to function without trust. And, though concerns about data security and unscrupulous people are entirely valid and useful, Max seems to be saying that the potential for good in unlocking our data with companies is much bigger.

Data + Honesty = Trust

The quantified self may or may not be our brave new world, but honesty will increasingly be the most valuable asset a company can have. To build trust in our data and the companies that handle it for us, though, we need one critical component: education.

“Teaching people how to trust data involves us teaching them how to understand data,” Max told me while we discussed Affirm, but the idea is relevant for any endeavor or company utilizing user data to provide a service. Whether at Affirm, Glow, or other projects, Max is building companies that try to deeply empathize with their customers, using data and honesty to build trust.  

For someone who has wrestled with some of the worst of humanity while fighting fraud (ask him about the real mafiosos in Eastern Europe), Max is still optimistic that data will ultimately help us shape a healthier, wealthier, and better future. His drive to continuously measure, refine, improve is ultimately bent towards creating more good in the world. And his experience with technology over the last 20 years leads him to believe that the majority of us are trying to create more good, too.  

