7 analysis skills to teach your AI client (and the prompts behind them)
Analytics

Nick Lin
Senior Product Marketing Manager @ Mixpanel
Published: Apr 22, 2026
Last Edited: Apr 23, 2026
A note on framing

The examples here focus on Claude and Claude Skills, which let you define, through simple prompting, reusable instructions for Claude to follow across sessions. Other AI clients have similar capabilities—ChatGPT's Custom GPTs work on a similar principle—so the underlying logic applies broadly, even if the setup differs.
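To make that concrete: in Claude, a skill is typically packaged as a folder containing a SKILL.md file, with YAML frontmatter naming the skill followed by plain-language instructions. A hypothetical sketch (the name and description are illustrative, and the exact layout may vary by client version):

```markdown
---
name: funnel-breakdowns
description: Conventions for how funnel analyses should be segmented by default.
---

When running any funnel analysis, automatically break results down by
platform, acquisition channel, and user cohort. Include these dimensions
by default; do not wait to be asked.
```

Once saved, the instructions apply to every future session without being re-typed, which is the difference between a prompt and a skill that the table below spells out.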

A framework for prompts that actually work

Tip

You don't need every element of the framework to get a sharper answer—adding just one usually makes a meaningful difference. If a result comes back too broad, the missing ingredient is almost always population (which users?) or timeframe (when?). For example, "Why is retention down?" sharpens to "Why is 30-day retention down for new mobile users who signed up in the last 60 days?"


Prompt vs. skill: what's the difference?

Prompts are how you get started. Skills are how your team scales.

| | Prompt | Skill |
|---|---|---|
| What it is | A one-time question you ask in the moment | A standing instruction that shapes every future analysis automatically |
| Best for | New questions, exploration, one-off investigations | Recurring analysis patterns your team runs regularly |
| How you use it | Type it fresh each time | Save it once in Claude; it applies automatically going forward |
| What it produces | One answer | Consistently structured answers, every time |
| Analogy | Asking a colleague a question | Briefing a colleague on how to work with your team |

Skill 1: Understand your data model

Starter prompt: List all events and properties in this project, grouped by category. Flag any that haven't fired in the last 90 days.
Skill instruction: For all my analyses, only reference events and properties that are standardized—used in at least two other reports and verified by the data team. If you're unsure whether an event meets that bar, flag it rather than using it.
Tip

Run the data model orientation prompt at the start of any analysis sprint—before you get into funnels or retention. Knowing which events are stale or unverified upfront prevents an entire analysis from being built on unreliable signals.

Skill 2: Diagnose conversion and drop-off

Starter prompt: Where are new mobile users dropping off in our onboarding funnel in the last 7 days? Show me a step-by-step breakdown.
Skill instruction: For every funnel analysis, automatically break results down by platform, acquisition channel, and user cohort. Don't wait for me to ask—include these dimensions by default.

Skill 3: Find what's driving growth

Starter prompt: Which acquisition channels drove the most activated users in the last 90 days? Rank them in a table.
Skill instruction: When analyzing acquisition channels, always rank by 30- and 90-day retention, not volume. Channels that drive high traffic but low retention should be flagged, not celebrated.

For more on how product teams are using MCP in practice, see how Mixpanel uses its own MCP server and what makes a good analytics prompt.

Skill 4: Connect behavior to retention

Starter prompt: Which in-app behaviors are most strongly correlated with 30-day retention for users who signed up in the last 60 days?
Skill instruction: Always separate retention analysis between free and paid users and run them individually. Never combine them into a single aggregate unless I explicitly ask you to.

Skill 5: Trace individual user journeys

Starter prompt: Why did user 12345 submit negative feedback on April 23, 2026? Summarize what you find.
Skill instruction: When investigating individual users, always pull from all available signals: event activity, session replays, feature flags, support tickets, and any frustration indicators. Summarize findings in order of most likely relevance to the issue.

Skill 6: Govern your data at scale

Starter prompt: We just added the high-value purchase event. Can you write a description for it, using our existing documented events as a guide?
Skill instruction: When writing or updating event descriptions, use existing documented events to infer naming conventions and intent. Highlight any events that may be duplicates of existing ones. Always show me a preview of proposed changes before applying them.
Tip

When using MCP for governance at scale, share your naming conventions up front—paste a few examples of well-documented events at the start of the session. This lets MCP infer your patterns rather than guess from scratch, and the descriptions it writes will be far more consistent.

Skill 7: Build reporting that runs itself

Starter prompt: Generate a product performance report for all active users covering the last 7 days. Format it as a Slack update with key metrics, trends, and one recommended action.
Skill instruction: For my weekly product report, always include: top-line activation and retention metrics, the biggest week-over-week movement (up or down), one segment that's behaving differently from the average, and a single recommended action. Keep it to five bullets or fewer.

The real unlock: Chaining prompts into workflows

01 > Which acquisition channels drove the most activated users in the last 90 days? Rank by 30-day retention.
02 > Break down the lowest-retention channel by platform and user cohort.
03 > Pull session replays for users from that channel who churned within 7 days and summarize common friction points.
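If you drive an AI client programmatically (for example through a script connected to an MCP server), the same chain can be expressed as a list of prompt templates, each parameterized by a finding extracted from the previous answer. A minimal sketch, where the step wording mirrors the chain above and the extracted finding ("paid social") is a hypothetical example value:

```python
# A prompt chain as data: later steps reference findings pulled out of
# earlier answers via named placeholders.
CHAIN = [
    "Which acquisition channels drove the most activated users in the "
    "last 90 days? Rank by 30-day retention.",
    "Break down the lowest-retention channel ({channel}) by platform "
    "and user cohort.",
    "Pull session replays for users from {channel} who churned within "
    "7 days and summarize common friction points.",
]

def build_prompts(chain, findings):
    """Fill each template with findings extracted from earlier answers.

    `findings` maps placeholder names (e.g. 'channel') to values your
    code pulled out of the previous step's response.
    """
    return [step.format(**findings) if "{" in step else step for step in chain]

# Hypothetical finding from step 1's answer:
prompts = build_prompts(CHAIN, {"channel": "paid social"})
```

In practice the extraction step ("which channel had the lowest retention?") is itself something you can ask the client to return in a structured form, so each link in the chain stays machine-readable.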

Summary reference

7 skills at a glance

Each skill starts with a prompt. The instruction turns it into something that scales.

| Skill | Starter prompt | Skill instruction |
|---|---|---|
| Understand your data model | List all events and properties, grouped by category. Flag any that haven't fired in 90 days. | Only reference events used in at least two reports and verified by the data team. Flag unknowns rather than using them. |
| Diagnose conversion and drop-off | Where are new mobile users dropping off in our onboarding funnel in the last 7 days? Show me a step-by-step breakdown. | For every funnel analysis, automatically break results down by platform, acquisition channel, and user cohort. |
| Find what's driving growth | Which acquisition channels drove the most activated users in the last 90 days? Rank them in a table. | Always rank acquisition channels by 30-day retention, not volume. Flag channels with high traffic but low retention. |
| Connect behavior to retention | Which in-app behaviors are most strongly correlated with 30-day retention for users who signed up in the last 60 days? | Always separate retention analysis between free and paid users. Never combine them into a single aggregate. |
| Trace individual user journeys | Why did user [ID] submit negative feedback on [date]? Summarize what you find. | Always pull from all available signals: event activity, session replays, feature flags, support tickets, and frustration indicators. |
| Govern your data at scale | We just added the [XYZ] event. Write a description for it using our existing documented events as a guide. | Use existing events to infer naming conventions. Flag potential duplicates. Always show a preview before applying changes. |
| Build reporting that runs itself | Generate a product performance report for all active users covering the last 7 days. Format it as a Slack update. | Always include top-line activation and retention, the biggest week-over-week movement, one outlier segment, and one recommended action. Five bullets max. |

Mixpanel's MCP server is available now. Start with a prompt. Build toward a skill.
