
Ask an expert: Anubhav Shrivastava of Viacom18 Media on how data sophistication feeds a cycle of user value and growth

Alana Tees Content & Communications @ Mixpanel

As Head of Data Science, Digital Ventures at Viacom18 Media—a Mumbai-based joint venture between ViacomCBS and Network18 and one of India’s fastest-growing entertainment networks—Anubhav Shrivastava has led the company’s shift toward data-informed decision making.

We sat down with Anubhav to learn:

  • Tips for achieving data sophistication
  • The surprising viewership trends associated with Covid-19
  • What makes a good PM vs. a great PM

 

Can you tell us a little bit about the company and product, and where you’re focused?


I work with the Digital Ventures arm of Viacom18, one of India’s media powerhouses. Viacom18 proactively forayed into the OTT business in late 2015 and went live with Voot—our mobile-first video-on-demand platform—in mid-2016.

We have three distinct products: the ad-supported video streaming platform, which is what we started with four years ago; a subscription version of the business with ad-free content; and exclusive original content. We also have a kids’ product, with ebooks, video games, audio stories, videos, movies, etc.

We’re constantly focused on evolving the user experience and expanding our reach to newer formats such as set-top boxes, connected TV apps, etc. Before going digital, Viacom18 took a lot of pride in the fact that as a broadcaster we reached more local homes, with more diverse stories, than broadcasters could achieve anywhere else. Today, we take additional pride in the fact that we’re aiming to provide our entertainment to every connected user in the country.

 

What comprises a typical day as Head of Data Science at an OTT company? 


They say the day never ends in a media company. It’s true. Typically, my day starts with a quick scan of the previous day’s data on performance across metrics, and a search for anomalies. The rest of the day goes to troubleshooting, reviewing ongoing A/B test stats, discussing the progress of tactical and strategic projects and product development, struggling to find time for individual contributions in areas of interest, jumping in just in time on some issues, and feeling guilty about ignoring others, all in service of shifting priorities. On the bright side, all of our meetings are data-driven and progress is measured by data outcomes.

Spending time judiciously is important, so one of our key focus areas has been to promote a culture of self-driven data access and analysis across teams. Mixpanel data is at the core of this effort, and its dashboards earn a seat in most meetings. Most people on our teams, irrespective of professional background, are hands-on with Mixpanel.

When I can, before calling it a night, I try to contribute to more strategic efforts—such as refining our data management platform or working on a new machine learning application.

 

It sounds like you have a very data-informed approach to product prioritization. How do you keep data at the forefront?


Data-driven product prioritization is evolving quickly. When we were conceptualizing the online streaming business five years ago, Viacom18 was a traditional broadcast media organization; we had embraced the digital medium as an advertising and marketing tool, but we hadn’t jumped on that bandwagon as a publisher. Traditional media had always been data-driven, but data sources were mostly offline channels, with periodic data. Culturally, decisions were strongly based on hunches, gut feelings, and early-indicator deductions.

Digital media flipped this paradigm. We don’t have to wait a week to get television ratings. We can analyze user affinity in real-time—perhaps the biggest game-changer in terms of how data has changed the way content will be created in the future.

Data is also a strong enabler in decision-making, especially when key voices in a room have different views on which path to take.

Today, we dabble in a multitude of products. Prioritization happens at all levels: among products; within products; and at the level of features, backlog, bug fixes, etc. A/B testing smooths decision making, as does all the real-time user data that we get from past implementations. Each decision impacts a segment of users. The call is always what percentage of users will be impacted, what’s urgent, and what’s a priority, all backed by data. The important thing is to ensure parity among users across products. Differences exist, both in approaches and expectations (e.g., a product made for kids has to be viewed through a kids’ lens).
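The kind of A/B test readout described above can be sketched as a two-proportion z-test in a few lines of Python. The event names and the 12%-to-18% conversion lift below are purely hypothetical, and this is a bare statistical sketch, not a stand-in for a real experimentation platform:

```python
import math

def two_proportion_z_test(conversions_a, n_a, conversions_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a = conversions_a / n_a
    p_b = conversions_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference
    p_pool = (conversions_a + conversions_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: variant B lifts signup conversion from 12% to 18%
z, p = two_proportion_z_test(120, 1000, 180, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

Once a test like this has a clear winner, the decision is settled by the data rather than by whoever argues loudest, which is the dynamic described above.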

 

With your focus on the data as a single source of truth, have there been instances where the data told a story that caused a change in the product strategy?


This happens often, and thankfully so. In the most recent example, we launched our subscription product in February 2020. When COVID led to a complete lockdown soon after, all our marketing and content release plans were thrown into chaos: out-of-home advertising lost its value, and most to-be-released content needed final touches that lockdown prevented.

But then, the data started showing us that housebound users were much more inclined to sign up for the subscription service than we had ever imagined, and the trend kept climbing with each passing day. Some of the just-in-time originals we dropped on the platform did help swing things our way, but satisfying the content hunger of these new users was a challenge. We adopted several strategies, and data patterns indicated that resurfacing some of the cult content from our library was picking up steam. And there it was: our winning solution for turning the tide during those difficult times.

 

What specific metrics do you use to track the health of the product? What’s the one thing you want users to do?


Metrics evolve with the tech landscape, user habits, and monetization models. We’re careful to separate metrics on business health from those on product health. From a product perspective, the key metric is user engagement, for both new and returning users. Each functional team, be it product, tech, or marketing, has one top-line metric for user engagement, plus independent sub-metrics for each team member that contribute to achieving the overall health metric.

We want anyone who opens the app to find something interesting to watch. And then to come back again, time permitting, to watch more content. That is engagement for us. Product and tech focus on making that experience as seamless and enriched as possible and other teams focus on the content interaction.

 

How do you communicate these metrics across the organization and make them top of mind for stakeholders?


Essentially, we’ve created a culture in which everyone is data-driven in terms of their own metrics. For example, someone in publishing could track how many clicks or conversions happen for them every day. For marketing, it could be cost per install—for example, “What’s the lifetime value that you’re generating out of a user?” (something we capture with Mixpanel in terms of videos and ads that they have watched). Someone in product could measure engagement time—“What is the percentage of users that are consuming the top five features that you’ve launched this year?” Or someone in content would measure the completion rates of their recently launched show. This also guides individuals in defining their goals: something they can achieve. That’s the culture that data helps propagate within the organization.

 

How is the way you track metrics different in your industry versus other industries? How does Viacom18 differ from others in the OTT space?


Tracking started with e-commerce and was primarily focused on conversion funnels. But in the video streaming space, users aren’t transacting every day. It’s important that they be able to find something they want to watch and, aided by the product features, have a good experience watching that content. When we started, we tried tracking the touchpoints that enabled, eased, or interfered with the user’s ability to watch content in each session. Then we built additive strategies on that. As a pioneer in this arena, we didn’t have many reference points, but we were able to put it all together through hit-and-miss trials. We still follow this method: we start by categorizing all user touchpoints as either enablers or disablers for the viewing experience, and then devise data-led strategies to improve the user experience.
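The enabler/disabler categorization described above can be sketched as a simple per-session scoring pass. All event names here are hypothetical illustrations of the idea, not Viacom18’s actual event schema:

```python
# Hypothetical touchpoint categories, mirroring the enabler/disabler split
ENABLERS = {"search_used", "recommendation_clicked", "playback_started"}
DISABLERS = {"buffering_spike", "playback_error", "login_wall_shown"}

def session_experience_score(events):
    """Crude per-session score: enablers add a point, disablers subtract one.

    Events outside either category are ignored.
    """
    score = 0
    for event in events:
        if event in ENABLERS:
            score += 1
        elif event in DISABLERS:
            score -= 1
    return score

session = ["app_open", "search_used", "playback_started", "buffering_spike"]
print(session_experience_score(session))  # 2 enablers - 1 disabler = 1
```

Aggregating a score like this across sessions gives a rough signal of where disablers concentrate, which is where the data-led improvement strategies would focus.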

 

So as the analytics space becomes more sophisticated, you’re changing and expanding your tracking plan?


Changing, yes. But expanding? Not really. Tracking everything isn’t the best approach.

We track only what would give us the necessary knowledge about what worked and how we can make it better for the user.

We’re aware that, whenever we launch a feature, or every time we’re tracking something, we’re also contributing to global warming. So every quarter or every two months, we look at the data we track and don’t consume. For example, we might’ve launched an A/B test to track something, but once we’ve decided the test’s winner, further tracking is unnecessary and just contributes to data that we won’t use. We classify that as dark data and discontinue its tracking. One of my favorite features within Mixpanel is Lexicon, which lets us store descriptions of events and their properties. By removing what’s no longer useful, it helps us keep our interface very clean.
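The quarterly dark-data audit described above boils down to diffing the set of events still being tracked against the set actually consumed in analysis. This is a minimal sketch with hypothetical event names; a real audit would pull these inventories from the analytics tool’s metadata (e.g., Lexicon) and query logs:

```python
def find_dark_data(tracked_events, queried_events):
    """Return events still being tracked but never consumed in analysis."""
    return sorted(set(tracked_events) - set(queried_events))

# Hypothetical inventory: the A/B test events linger after the experiment
# concluded, so they surface as candidates for removal.
tracked = ["video_play", "signup", "ab_test_banner_v1", "ab_test_banner_v2"]
queried = ["video_play", "signup"]
print(find_dark_data(tracked, queried))
```

Anything the audit surfaces gets reviewed and, if genuinely unused, retired from the tracking plan.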

 

Startups or businesses that are trying to become data-led often track too much off the bat and wind up with a lot of data that’s not actionable. What’s your advice on the right balance for a company that’s trying to achieve data sophistication?


Think about it like this: when you envision a product feature, you write a user story around it, right? What is it that you want to achieve when this feature goes to market? Once you do that, you add tracking to build a funnel around it. I would track two things: first, the key steps toward conversion (and maybe one intermediate step); and, second, whether this new feature could adversely impact existing user behavior. Tracking should be a very thoughtful and involved process. You should spend a lot of time ideating and deciding what to track, how to track it, and how to measure it.
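The advice above, write the user story first, then track only the key conversion steps, can be sketched as a small funnel computed from per-user event logs. The "watchlist" feature and its event names are hypothetical examples:

```python
def funnel_conversion(user_events, funnel_steps):
    """Count how many users reached each funnel step, in order."""
    counts = [0] * len(funnel_steps)
    for events in user_events.values():
        step = 0
        for event in events:
            if step < len(funnel_steps) and event == funnel_steps[step]:
                counts[step] += 1
                step += 1
    return dict(zip(funnel_steps, counts))

# Hypothetical funnel for a new "watchlist" feature: view it, add a title,
# then start playback from it. Only these steps are tracked, nothing more.
funnel = ["watchlist_viewed", "title_added", "playback_from_watchlist"]
user_events = {
    "u1": ["watchlist_viewed", "title_added", "playback_from_watchlist"],
    "u2": ["watchlist_viewed", "title_added"],
    "u3": ["watchlist_viewed"],
}
print(funnel_conversion(user_events, funnel))
```

The second concern, whether the new feature hurts existing behavior, would be monitored separately by watching the pre-existing top-line metrics before and after launch, not by adding more events.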

 

Customer churn can be a problem in a price-conscious market like yours. Can you speak to some of the retention strategies that you employ?


Retention is the new acquisition. Because losing an acquired user is a colossal loss, we’ve built our own lifetime value model. We feed all our data on user content consumption, install attribution, and ads viewership, along with our subscriber behavior, into one data warehouse and send it back to the attribution tool. Then we can decide which user segments we should definitely re-target, which we should perhaps re-target less, and which we shouldn’t re-target at all. Mixpanel is central to that strategy, because it stores all the user interaction data that feeds into feature engineering.
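The three-tier re-targeting decision described above can be sketched as a comparison of predicted lifetime value against the cost of re-targeting a user. The thresholds, currency values, and user IDs here are hypothetical; the actual model feeds on the consumption, attribution, and subscriber data mentioned above:

```python
def retarget_segment(predicted_ltv, retarget_cost):
    """Tier a user by predicted lifetime value relative to re-targeting cost.

    Hypothetical rule: 3x cost or more -> definitely re-target;
    at least break-even -> maybe; below cost -> skip.
    """
    if predicted_ltv >= 3 * retarget_cost:
        return "definitely"
    if predicted_ltv >= retarget_cost:
        return "maybe"
    return "skip"

# Hypothetical LTV predictions against a fixed cost per re-targeted user
cost = 10.0
users = {"u1": 45.0, "u2": 15.0, "u3": 4.0}
segments = {uid: retarget_segment(ltv, cost) for uid, ltv in users.items()}
print(segments)  # {'u1': 'definitely', 'u2': 'maybe', 'u3': 'skip'}
```

The segments would then flow back to the attribution tool to drive which audiences actually receive re-targeting spend.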

In a similar vein, we need to know which content and what ads to show to which users. We look at ad consumption data and see how different user segments respond to ad intensity or consumption or content—or even to product-marketing push notifications. That allows us to build user personas that we can use with a recommendation engine to guide us on what kind of (and how many) ads we should show. Learnings from such strategies help us plan and build AI-driven interventions that circumvent user issues and carry forward into channel monetization strategies.

 

Finally, what’s the difference between a good PM and a great PM?


You should follow your instincts. If you’re trying to build something new, you won’t find much existing evidence for defending your decisions, so it’s always going to be your call versus somebody else’s. That said, you should always seek empirical evidence to prove yourself wrong while you’re making those decisions. If what you tried didn’t work, you should learn from it and move on swiftly. Remember that failure happens to everyone. It’s important to stop doing things when they don’t work well, instead of sticking to the past or having just one line of thought. If you’re stuck in a time capsule, you’ll stunt the product’s evolution.

 

 
