9 important user engagement metrics for apps
User engagement metrics for apps help teams understand how much value users find in the app. Each user’s time is a limited resource, and if users find enough value in the app to spend time there, app creators can monetize that experience. User engagement metrics put a number on how much of that time and attention the app earns.
As the aphorism goes, what gets measured gets managed. When product teams discover the right user engagement metrics for their app, they can improve the app’s usage, utility, and profitability. Product teams can also better understand the needs of their users and cater to the app’s most valuable user groups.
To properly measure user engagement metrics, teams need an analytics tool. Most apps aren’t built to measure themselves and offer limited reporting. Only a product analytics platform can extract granular enough insight to measure true engagement.
For example, if app creators are only able to view their app’s total number of downloads in the app store, they can’t tell which users were retained and which churned. They can’t tell which users purchased and which didn’t, or which users open the app daily and which hardly ever touch it. Product analytics allows teams to drill down into the demographics of these users to view their behavior and measure their engagement.
Learn how Mixpanel can help you track user engagement metrics.
The top 9 user engagement metrics for apps
Active users (ADAU, MAU)
The best place to begin measuring user engagement is with the total number of active users. Not all users who download an app continue to use it. Statista reports that 24 percent of mobile apps are only used once. By calculating the total number of remaining active users, product teams can get a sense of how engaged their user base remains over time.
Activity is typically measured for a given time period. Average daily active users (ADAU), for example, is the average number of users active on any given day, averaged over one month. Many companies confuse ADAU with daily active users (DAU), which is simply the number of users active on one specific day. When most entrepreneurs talk about DAU, they mean ADAU.
ADAU = total number of active users in a month / days in the average month
Monthly active users (MAU) is similar to ADAU but is the number of active users in a given month, averaged over the twelve months of the year.
MAU = total number of active users in a year / 12
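The two formulas above can be sketched in code. This is a minimal illustration, assuming a hypothetical activity log of (user_id, date) pairs rather than any particular analytics API:

```python
from datetime import date

# Hypothetical activity log: one row per (user, day) the user was active.
events = [
    ("u1", date(2024, 3, 1)), ("u1", date(2024, 3, 2)),
    ("u2", date(2024, 3, 1)), ("u3", date(2024, 3, 15)),
]

def adau(events, days_in_month=30):
    # Count distinct users active on each day, then average over the month.
    daily_users = {}
    for user, day in events:
        daily_users.setdefault(day, set()).add(user)
    total_active_user_days = sum(len(users) for users in daily_users.values())
    return total_active_user_days / days_in_month

def mau(monthly_active_counts):
    # Average the distinct-active-user count of each month over a year.
    return sum(monthly_active_counts) / 12
```

In practice an analytics platform computes these directly; the sketch just shows that both metrics are averages of distinct-user counts over different windows.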
Some apps, such as business tools, are only used during weekdays. These companies may choose to measure weekly active users (WAU).
Like many app metrics, user activity varies between businesses. The analytics team for a social media app might aspire to daily usage, while the team for an enterprise software product might settle for weekly usage. Measurements of activity like ADAU and MAU matter most to advertising-driven apps like news publications, which can’t measure exact revenue. If an app can measure direct sales the way an e-commerce app can, measuring activity is much less important.
Stickiness

Many app product teams want to know how addictive or “sticky” their app is among users. They can approximate this as a percentage by dividing ADAU by MAU.
Stickiness = average daily active users (ADAU) / monthly active users (MAU)
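Once ADAU and MAU are known, the stickiness ratio is a one-line calculation. A sketch with made-up figures:

```python
def stickiness(adau, mau):
    # Fraction of monthly active users who show up on an average day.
    return adau / mau

# Hypothetical example: 2,500 average daily actives, 10,000 monthly actives.
ratio = stickiness(2500, 10000)  # 0.25, i.e. 25% stickiness
```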
Financial news sites and investors commonly tout stickiness as a predictor of a product’s long-term prospects. The more users engage with the app, the more likely it is to become habit-forming. This gives the company more opportunities to monetize the app and cultivate a stronger, more loyal user base.
Session interval

Session interval is the measure of the time between app sessions. It examines whether users spread out their sessions or group them together. A music app, for example, might find that the session intervals of its users are short during weekdays, when they’re commuting, but long during weekends, when they’re entertaining themselves at home.
If a product team knows the session intervals of its users, it can discover what users are doing during that time and tweak the product to be more useful. The product team for the aforementioned music app might integrate the app into users’ home speaker systems to increase its at-home relevance.
Session interval = the time between the end of one session and the start of the next
Session length

Session length is the inverse of session interval: it measures how long user sessions last. Session length analysis often reveals that session lengths fall into distinct categories that connote different app use cases. A ride-hailing app, for example, might discover that session lengths are short during the day as business users travel between meetings, but long at night as they commute home. Users have different needs and interests in each type of session. They may not want to be bothered during the day but may be more interested in conversation and ride-sharing at night. The product team can tweak the app accordingly.
Session length = the duration of a single session, from app open to app close
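Both session length and session interval fall out of the same session log. A minimal sketch, assuming hypothetical (start, end) timestamp pairs for a single user:

```python
from datetime import datetime

# Hypothetical session log for one user, ordered by time: (start, end) pairs.
sessions = [
    (datetime(2024, 3, 1, 8, 0), datetime(2024, 3, 1, 8, 30)),
    (datetime(2024, 3, 1, 18, 0), datetime(2024, 3, 1, 18, 45)),
    (datetime(2024, 3, 2, 8, 5), datetime(2024, 3, 2, 8, 20)),
]

def session_lengths_minutes(sessions):
    # How long each session lasted.
    return [(end - start).total_seconds() / 60 for start, end in sessions]

def session_intervals_hours(sessions):
    # Gap between the end of one session and the start of the next.
    return [
        (sessions[i + 1][0] - sessions[i][1]).total_seconds() / 3600
        for i in range(len(sessions) - 1)
    ]
```

Bucketing the resulting lengths and intervals (for example, by weekday versus weekend) is what surfaces the distinct use cases described above.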
Time in-app

Time in-app is similar to session length except that it averages the total time in-app across all users. It’s commonly expressed as a number of hours per day, week, or month. The more time users spend in-app, the more revenue each user typically generates, so most product teams work to increase time spent in-app.
Time in-app = the average number of hours users spend in-app per day, week, or month
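As a sketch, time in-app is a simple average over per-user totals for the chosen period (the data below is hypothetical):

```python
def avg_time_in_app_hours(hours_by_user):
    # hours_by_user maps each user to their total hours in-app for the period.
    return sum(hours_by_user.values()) / len(hours_by_user)

# Hypothetical weekly totals for three users.
weekly_hours = {"u1": 2.0, "u2": 4.0, "u3": 0.5}
avg = avg_time_in_app_hours(weekly_hours)  # average hours per user per week
```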
Screenflow

Screenflow is a visualization of users’ paths through the app’s various screens. It can show the journey of individuals, groups, or all users. By analyzing screenflow, product teams can look for common paths that users take through the app and then improve the experience accordingly.
One of the first places product teams should look for screenflow issues is user drop-offs. If a particular path or screen precedes a great number of users exiting the app, it’s often due to a bug or unclear design. If product teams can tweak the design, they can often increase time in-app.
Some product teams call the ideal user screenflow a golden path. Golden paths end in a desired outcome for both the user and brand such as a purchase, a signup, or an ad-click. Product teams can measure what percentage of users followed the golden path to completion and can tweak the app to increase it.
Golden path completion = the % of users that complete the golden path screenflow
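Golden path completion can be approximated by checking whether each user’s screenflow contains the golden path’s screens in order. A minimal sketch with hypothetical screen names:

```python
# Hypothetical golden path: the ordered screens that end in a purchase.
GOLDEN_PATH = ["home", "product", "cart", "checkout"]

def follows_golden_path(screens, path=GOLDEN_PATH):
    # True if the user's screens contain the path's screens in order
    # (other screens may appear in between).
    it = iter(screens)
    return all(step in it for step in path)

def golden_path_completion(flows, path=GOLDEN_PATH):
    # flows: one list of visited screens per user.
    completed = sum(follows_golden_path(f, path) for f in flows)
    return completed / len(flows)
```

The in-order (subsequence) check is deliberately lenient: a user who detours through a search screen but still reaches checkout in sequence is counted as completing the path.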
Retention

Retention is a measure of how many users remained customers over a period of time, such as five days, eight weeks, or one year. Retention is the opposite of churn, the percentage of customers who have quit the app. The retention equation is simple; defining and benchmarking retention is not.
Retention rate = (# of customers at end of period – # of customers acquired during period) / # of customers at start of period
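The formula translates directly into code. A sketch with made-up customer counts:

```python
def retention_rate(start_count, end_count, acquired):
    # (customers at end - customers acquired during period) / customers at start
    return (end_count - acquired) / start_count

# Hypothetical quarter: 200 customers at the start, 40 acquired, 220 at the end.
rate = retention_rate(start_count=200, end_count=220, acquired=40)  # 0.9
```

Subtracting newly acquired customers matters: without it, a period of heavy acquisition could mask heavy churn.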
The method by which app creators count their total number of users varies. For some apps, it’s quite clear: it’s the number of subscribed, paying customers. But for advertising-driven or freemium-model apps, it’s less obvious. Customers that don’t have a contract to cancel are free to use or not use the app. They may disappear for months at a time. These apps must define their number of active users as those that have used the app within a certain period of time, such as one week.
Conversions

Within every app, there are successful actions that product teams want users to take. These range from downloading the app to signing up for a newsletter, enabling notifications, completing the welcome tour, adding a credit card, making a purchase, completing a profile, and more. Every product team can measure the percentage of users who complete each of these goals as conversions. If product teams increase their conversions, they increase user engagement.
Conversion rate = # of users who completed an event / total # of users
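A sketch of the conversion rate over a hypothetical event log, counting each user at most once per goal:

```python
def conversion_rate(events, goal_event, total_users):
    # events: (user_id, event_name) pairs; count distinct users who hit the goal.
    converted = {user for user, event in events if event == goal_event}
    return len(converted) / total_users
```

Deduplicating by user is the key detail: a user who makes three purchases still counts as one conversion for this metric.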
Customer satisfaction (CSAT)
To determine engagement, product teams can simply ask users about their experience. Users want to find value in the apps they use, and if they aren’t finding it, they’ll often want to share their feedback.
There are several methodologies for determining customer satisfaction, one of the most popular of which is the net promoter score (NPS). To measure NPS, product and marketing teams send users a survey asking, on a scale of 0-10, how likely they are to recommend the app to others.
Users that respond with 9-10 are considered promoters, 7-8 passives, and 0-6 detractors. Product teams can ignore the passives and subtract the percentage of detractors from the percentage of promoters. The result is a net promoter score.
NPS = % of promoters – % of detractors
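The NPS calculation above can be sketched directly from a list of 0-10 survey scores:

```python
def nps(scores):
    # Promoters score 9-10, detractors 0-6; passives (7-8) are ignored.
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return 100 * (promoters - detractors) / len(scores)

# Hypothetical survey: two promoters, two passives, two detractors.
score = nps([10, 9, 8, 7, 6, 3])  # 0.0
```

Note that passives still appear in the denominator, so NPS ranges from -100 (all detractors) to +100 (all promoters).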
Start measuring user engagement
Try Mixpanel for free