Retention curves that build themselves.
Doe automates cohort retention analysis: it queries your Postgres or MySQL database, segments users by signup cohort, and models retention curves in a code sandbox. Results come back broken down by segment, showing which cohorts are improving, which segments retain best, and what changed cohort over cohort. Doe connects to Neon, Supabase, and PlanetScale. Change the retention definition and rerun in minutes. All SQL and sandbox code is inspectable.
What changes
| Dimension | Before | With Doe |
|---|---|---|
| Analysis frequency | Quarterly, when someone asks | Monthly, delivered automatically |
| Definition changes | "Can you rerun with a different retention event?" adds two days | Change the definition, rerun in minutes |
| Segmentation | Each cut adds another half-day of SQL and formatting | Every segment computed in the same run |
| Traceability | A number in a slide with no source query attached | Every retention figure links to the SQL and code that produced it |
How Doe runs cohort retention analysis
Doe groups users into weekly or monthly cohorts based on the event you define as first touch. Every SQL query is logged with row counts and execution time.
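A minimal sketch of the bucketing step, assuming monthly cohorts keyed by signup month. The data shapes and function names here are illustrative, not Doe's internals:

```python
from collections import defaultdict
from datetime import date

def monthly_cohort(signup: date) -> str:
    # Cohort key is the signup month, e.g. "2024-03".
    return f"{signup.year}-{signup.month:02d}"

def build_cohorts(signups):
    # signups: iterable of (user_id, signup_date) pairs.
    cohorts = defaultdict(set)
    for user_id, signup in signups:
        cohorts[monthly_cohort(signup)].add(user_id)
    return dict(cohorts)

signups = [
    ("u1", date(2024, 3, 5)),
    ("u2", date(2024, 3, 28)),
    ("u3", date(2024, 4, 2)),
]
cohorts = build_cohorts(signups)
# u1 and u2 land in the March cohort, u3 in April.
```

Weekly cohorts work the same way with a different key function (e.g. ISO year and week number).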
Doe calculates retention at every interval for each cohort. The logic is editable: adjust how reactivations or plan changes are handled and rerun without starting over.
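The per-interval calculation can be sketched in plain Python. This is a hedged illustration of the kind of logic that runs in the sandbox, not Doe's actual implementation; the function and data shapes are assumptions:

```python
from collections import defaultdict
from datetime import date

def month_index(d: date) -> int:
    # Absolute month number, so adjacent months differ by exactly 1.
    return d.year * 12 + d.month - 1

def retention_matrix(signups, events):
    # signups: {user_id: signup_date}
    # events:  [(user_id, event_date)] for whatever event counts as "retained"
    cohort_users = defaultdict(set)
    for uid, d in signups.items():
        cohort_users[month_index(d)].add(uid)
    active = defaultdict(set)  # month -> users with a retention event that month
    for uid, d in events:
        active[month_index(d)].add(uid)
    last = max(active, default=0)
    matrix = {}
    for cohort, users in cohort_users.items():
        # One retention rate per interval since signup, starting at interval 0.
        matrix[cohort] = [len(users & active[m]) / len(users)
                          for m in range(cohort, last + 1)]
    return matrix

signups = {"u1": date(2024, 1, 10), "u2": date(2024, 1, 20)}
events = [("u1", date(2024, 1, 15)), ("u2", date(2024, 1, 25)),
          ("u1", date(2024, 2, 3))]
# January cohort: both users active at interval 0, only u1 at interval 1.
```

Editing the definition means swapping which events feed the `active` sets, which is why a rerun doesn't require rebuilding the cohorts.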
Doe reruns the model across every dimension in your data and compares recent cohorts against older ones to surface whether retention is trending up or down.
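The cohort-over-cohort comparison can be reduced to a simple delta at a fixed interval: average retention of the most recent cohorts minus the average of everything older. Again an illustrative helper under assumed data shapes, not Doe's model:

```python
def cohort_trend(matrix, interval=1, recent=3):
    # matrix: {cohort_key: [retention at interval 0, 1, ...]}, keys sortable.
    # Positive result means recent cohorts retain better at `interval`.
    rates = [row[interval] for _, row in sorted(matrix.items())
             if len(row) > interval]
    if len(rates) <= recent:
        return None  # not enough history to compare
    older, newer = rates[:-recent], rates[-recent:]
    return sum(newer) / len(newer) - sum(older) / len(older)

trend = cohort_trend({"2024-01": [1.0, 0.50], "2024-02": [1.0, 0.60],
                      "2024-03": [1.0, 0.70], "2024-04": [1.0, 0.80]},
                     interval=1, recent=2)
# Positive: the March/April cohorts retain better at month 1 than earlier ones.
```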
Doe posts which cohorts improved, which segments retain best, and where the biggest shifts are — with the full cohort matrix and source queries attached.
Someone asks "what's our retention?" and no two people give the same answer.
Product says 82% from Amplitude (anyone who opened the app). Finance says 74% from Stripe churn (paying users who cancelled). The analyst's cohort query says 69%, but it references a table that was renamed in January and silently returns zero rows for anything after December. Three different numbers from three different definitions.
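To make the definition gap concrete, here is a toy illustration (hypothetical event names and a made-up three-user cohort, not the figures above) of how the same data yields different retention numbers under different definitions:

```python
from datetime import date

# Hypothetical February activity for a three-user January cohort.
events = [("u1", "app_open", date(2024, 2, 1)),
          ("u2", "app_open", date(2024, 2, 3)),
          ("u1", "core_action", date(2024, 2, 5))]
cohort = {"u1", "u2", "u3"}

def retention(cohort, events, is_retention_event):
    # Fraction of the cohort with at least one qualifying event.
    active = {uid for uid, name, _ in events if is_retention_event(name)}
    return len(cohort & active) / len(cohort)

opened = retention(cohort, events, lambda e: e == "app_open")      # 2 of 3
engaged = retention(cohort, events, lambda e: e == "core_action")  # 1 of 3
```

Same users, same events, two defensible retention rates. The disagreement is in the predicate, not the data.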
The deeper problem is the iteration cycle. Building a proper cohort analysis takes a full day. Then someone asks "can you re-cut by plan tier?" Another two hours. "By acquisition channel?" Another two. "What if retained means completing a core action, not just logging in?" Start over.
Get started in under 10 minutes
Connect your tools
One-click OAuth for each integration. No API keys, no engineering.
Describe what you need
“Build monthly retention cohorts based on signup week. A user counts as retained if they log in at least once in the period. Break it down by plan tier and acquisition channel.”
It runs on schedule
Updated findings land in your team channel on the second business day of each month.
Cohort Retention Analysis FAQ
What counts as "retained"?
You choose the retention event: a login, a product action, a transaction, or any event in your database. Doe applies that definition consistently across every cohort. You can define multiple retention metrics, and Doe tracks them all in the same analysis.
Stop doing the work your tools should do for you.
Set it up once. Doe runs it every time.