Duolingo Onboarding Teardown: 7 A/B Tests Behind Their 9% Conversion Rate (2026)

Case Study Relaunch Team · April 11, 2026 · 8 min read

Duolingo converts 8.9% of monthly active users to paid subscribers — in a market where 2% is average and 4% is considered elite. That's not a rounding error. It's 10.3 million paying users on a base of 116M+ MAUs, generating $607.5M in subscription revenue in FY2024 alone.

Every growth blog has a post about Duolingo's onboarding. They all describe the same things: the mascot, the streaks, the delayed sign-up wall. What none of them explain is that every one of those features was discovered through controlled experiments — and that Duolingo kills anything that boosts revenue at the cost of engagement, including features that work.

This post covers the specific A/B tests, their measured results, and the counterintuitive decisions that make Duolingo's funnel actually work.

TL;DR

  • Duolingo's delayed sign-up test increased DAUs by 20% — but only when paired with a specific soft-wall → hard-wall sequence that primes user commitment
  • Free-to-paid conversion grew from 3% to 8.9% over five years through compounding small experiments (~2-3% lift each), not any single design change
  • Duolingo killed a feature that increased revenue because it decreased daily active users — proving their real north star is engagement, not conversion
  • 300+ experiments per quarter is the actual moat — most teams run 2-5 per quarter and wonder why copying Duolingo's UI doesn't produce Duolingo's results
  • The transferable lesson: you can't copy outputs without copying the process that discovered them

The Setup: Duolingo and Why Their Onboarding Is Worth Studying

Duolingo crossed 50M daily active users in Q3 2025, up 36% year-over-year. Paid subscribers hit 10.3M in Q1 2025 — a 40% jump. ARPU grew 7% as users migrated to Duolingo Max (AI-powered features) and family plans.

But the number that matters most for this teardown is the MAU-to-paid conversion rate: 8.9%. In 2020, it was 3%. That near-tripling didn't come from a redesign or a pricing change. It came from running roughly 1,200 experiments per year and compounding hundreds of small wins.

8.9%
MAU-to-paid conversion (vs. 2% industry avg)

Most onboarding teardowns treat Duolingo's funnel like a design artifact — something you can screenshot and replicate. It's not. It's a testing artifact: the current state of thousands of experiments, most of which individually moved metrics by single-digit percentages.
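To make the compounding concrete, here's a back-of-envelope sketch. The per-test lift is an illustrative assumption (the post's own ~2-3% range), not Duolingo's internal data:

```python
import math

# Illustrative arithmetic: how many independent small wins does it
# take to compound a 3% conversion rate into 8.9%?
start, target = 0.03, 0.089
per_test_lift = 0.025  # assume a +2.5% relative lift per winning experiment

wins_needed = math.ceil(math.log(target / start) / math.log(1 + per_test_lift))
print(wins_needed)  # → 45
```

Roughly 45 compounding wins. At ~1,200 experiments per year, even a modest win rate clears that bar comfortably; at 2-5 tests per quarter, it takes years.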

What We're Looking At: A Walk-Through

Duolingo's onboarding doesn't follow a traditional SaaS funnel. Instead of landing page → sign-up → product, the sequence is:

  1. Goal selection (why are you learning?)
  2. Language picker + proficiency assessment
  3. First lesson (before any sign-up)
  4. Soft walls (optional sign-up prompts after value delivery)
  5. Hard wall (forced account creation after investment)
  6. Free trial + monetization prompts (reverse trial of Super Duolingo)

The critical insight: users complete a lesson before they create an account. This is not standard practice — and it wasn't the original design. It was an experiment result.

Element 1: Delayed Sign-Up

The most cited feature in Duolingo teardowns is also the most misunderstood. Moving the sign-up screen behind the first lesson increased DAUs by 20%.

"People thought we were crazy — why would you let users in without capturing their email? But the data was unambiguous." — Duolingo growth team, via First Round Review

This works because of the endowed progress effect: once users invest effort (completing a lesson), abandoning that progress feels like a loss. The sign-up wall converts better after investment than before it.

Element 2: Soft Wall → Hard Wall Sequence

This is the part nobody talks about. Duolingo doesn't just delay sign-up — it uses a two-stage commitment sequence:

  • Soft wall: suggests sign-up to save progress. Appears after the first lesson; the user can dismiss it.
  • Hard wall: requires account creation. Appears after 2-3 lessons; the user must sign up to continue.

The sequence matters. Testing showed that soft walls followed by hard walls drove an 8.2% DAU increase. But critically: neither wall type performed as well in isolation.

Soft walls prime users for hard walls. The optional prompt reframes sign-up as "saving your progress" rather than "giving us your email." By the time the hard wall appears, the user has already mentally committed.

Element 3: Personalization Before Sign-Up

Before you create an account — before you even start a lesson — Duolingo asks you why you're learning, which language, and how much time you want to spend daily.

This does two things:

  • Increases perceived value by customizing the experience before any commitment
  • Creates sunk cost through micro-decisions that feel like investment

Element 4: Gamification as Retention Architecture

Streaks, leaderboards, and badges aren't decoration. Each was tested independently, and each has measured impact:

  • Optimized notification copy: +5% DAU
  • Weekend Amulet (streak protection): D7 retention +2.1%, D14 +4%
  • Growth-mindset coach copy: D14 retention +7.2%
  • Badges v1: session starts +4.1%, completions +4.5%, friend adds +116%

None of these individually look like a breakthrough. Compounded across 1,200+ annual experiments, they're the entire business.

Element 5: Reverse Trial Monetization

Duolingo gives new users 14 days of Super Duolingo for free, then converts them to paid. This is the opposite of most freemium funnels, where you start free and upsell later.

The reverse trial works because users experience the premium product before they experience the free one. Losing ad-free lessons and unlimited hearts feels worse than never having them.

4 Things Duolingo Gets Right

1. Engagement Is the North Star, Not Conversion

This is the single most important thing Duolingo does that competitors don't copy.

Duolingo once tested promoting offline lessons — it increased subscription signups but decreased DAUs. They killed it. Revenue up, engagement down = discontinued.

Most growth teams would celebrate a revenue increase and ship it. Duolingo's framework says: if it hurts daily engagement, it doesn't ship, regardless of revenue impact. This is why their paid conversion grows sustainably instead of spiking and churning.

2. They Manage States, Not Stages

Duolingo doesn't optimize a linear funnel. They manage users across six behavioral states: new, current, at-risk, dormant, reactivated, and resurrected. The growth team's job is to optimize transition rates between states.

This means they're not just asking "how do we convert more free users?" — they're asking "which state transition compounds DAU most efficiently?" That's a fundamentally different optimization problem.
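A minimal sketch of what "optimizing transitions" means in practice: measure the rate of each state-to-state move rather than funnel-stage conversion. The state labels follow the post; the data and function are hypothetical, not Duolingo's internal tooling:

```python
from collections import Counter

# The six behavioral states named above; thresholds for assigning them
# are left out of this sketch.
STATES = ["new", "current", "at-risk", "dormant", "reactivated", "resurrected"]

def transition_rates(weekly_states: list[str]) -> dict[tuple[str, str], float]:
    """Share of each observed state→state transition in a user's history."""
    pairs = Counter(zip(weekly_states, weekly_states[1:]))
    total = sum(pairs.values())
    return {pair: n / total for pair, n in pairs.items()}

history = ["new", "current", "at-risk", "dormant", "resurrected", "current"]
rates = transition_rates(history)
print(rates[("at-risk", "dormant")])  # → 0.2
```

Aggregated across users, a matrix like this tells you which transition (say, at-risk → dormant) is bleeding the most DAU, which is a different question than "where does the funnel leak?"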

3. Experiment Velocity Is the Moat

Duolingo runs 300+ experiments per quarter company-wide, with 20-80+ running simultaneously at any given time. The growth team alone runs 5-8 in parallel.

The average SaaS team runs 2-5 experiments per quarter. Duolingo runs at least 60x more. Even if each test produces a modest +2-3% lift, the compounding math is brutal for competitors who can't match that cadence.

Relaunch's autonomous CRO agents run continuous experiments around the clock — so teams running 2-5 tests per quarter can start compounding results at Duolingo-scale velocity.

Close the experiment velocity gap →

4. Sequential Priming Over Isolated Optimization

Every major onboarding element — delayed sign-up, soft walls, hard walls, reverse trial — was tested as part of a sequence, not in isolation. The order matters as much as the elements themselves.

3 Conversion Leaks We'd Fix

1. The Web-to-App Handoff Is Rough

Duolingo's web landing page pushes users to download the mobile app. The transition loses context — users who started a lesson on web don't seamlessly continue on mobile. This is a measurable drop-off point that a deep-link with session state would fix.
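The fix is to serialize the web session into the deep link so the app can resume mid-lesson. The scheme, host, and field names below are invented for illustration; this is not Duolingo's actual deep-link format:

```python
import base64
import json
from urllib.parse import urlencode

# Hypothetical sketch: encode web session state into a deep-link query
# parameter that the mobile app can decode on first open.

def continue_on_mobile_link(session: dict) -> str:
    state = base64.urlsafe_b64encode(json.dumps(session).encode()).decode()
    return "https://example.app.link/continue?" + urlencode({"state": state})

link = continue_on_mobile_link({"lesson": "basics-1", "step": 4, "lang": "pt"})
print(link)
```

On the app side, decoding the `state` parameter drops the user into lesson step 4 instead of the generic home screen, closing the measured drop-off.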

2. Pricing Page Anchoring Is Weak

The pricing page shows monthly vs. annual plans, but the anchoring is subtle. Showing the per-day cost (e.g., "$0.37/day") next to a coffee emoji — a tactic Headspace and Calm use effectively — could increase annual plan selection by an estimated 10-20%.

3. Social Proof Is Buried

Duolingo has 50M+ daily active users and plenty of success stories, but social proof appears late in the funnel. Moving specific learner outcomes ("Maria became conversational in Portuguese in 4 months") above the fold on the pricing page follows the principle that specificity converts better than scale.

5 Experiments an AI CRO Agent Would Run

This is what autonomous experimentation agents would generate if pointed at Duolingo's funnel:

Experiment 1: Soft Wall Timing Optimization

  • Hypothesis: Moving the first soft wall from post-lesson-1 to mid-lesson-1 (after the first correct answer) will increase sign-up rate because the user has proven competence but hasn't yet received full closure
  • Expected impact: +5-12% on soft wall sign-up conversion
  • Segment differences: New-to-language-learning users likely respond more than experienced learners who are less impressed by early success

Experiment 2: Dynamic Hard Wall Threshold

  • Hypothesis: Personalizing when the hard wall appears based on engagement signals (lesson completion speed, streak of correct answers) will outperform a fixed trigger point
  • Expected impact: +3-8% DAU retention through reduced premature churn
  • Segment differences: Power users tolerate later hard walls; casual users need earlier commitment points
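The variant arm of this experiment could look like the following sketch, where the hard wall triggers on engagement signals instead of a fixed lesson count. The weights and thresholds are invented for illustration:

```python
# Hypothetical dynamic hard-wall trigger: score engagement from lesson
# speed and answer streaks, then let engaged users go further before
# forcing account creation.

def engagement_score(avg_seconds_per_lesson: float, correct_streak: int) -> float:
    speed = max(0.0, 1 - avg_seconds_per_lesson / 300)  # faster lessons score higher
    streak = min(correct_streak, 10) / 10
    return 0.5 * speed + 0.5 * streak

def show_hard_wall(lessons_completed: int, score: float) -> bool:
    # Power users tolerate a later wall; casual users get it sooner.
    threshold = 4 if score >= 0.6 else 2
    return lessons_completed >= threshold

print(show_hard_wall(2, engagement_score(90, 8)))   # engaged user → False (not yet)
print(show_hard_wall(2, engagement_score(280, 1)))  # casual user → True (wall now)
```

The control arm is the fixed trigger; the hypothesis is that the dynamic arm reduces premature churn among power users without losing casual users' commitment point.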

Experiment 3: Reverse Trial Length by Segment

  • Hypothesis: Shortening the reverse trial from 14 days to 7 days for high-engagement users (3+ lessons/day) will increase conversion without hurting retention, because these users have already experienced enough value
  • Expected impact: +8-15% paid conversion among high-engagement segment
  • Segment differences: Low-engagement users (≤1 lesson/day) likely need the full 14 days

Experiment 4: Loss-Framed Push Notifications

  • Hypothesis: Replacing "Continue your lesson!" with "You'll lose your 3-day streak in 4 hours" will increase notification-driven session starts because loss aversion outperforms positive framing by 2-3x in behavioral research
  • Expected impact: +4-9% notification click-through rate
  • Segment differences: Users with streaks >7 days respond most to loss framing

Experiment 5: Social Proof on Pricing Page

  • Hypothesis: Adding "47,000 users upgraded to Super Duolingo this week" as a dynamic counter on the pricing page will increase paid conversion through social proof at the decision point
  • Expected impact: +6-12% pricing page conversion
  • Segment differences: Users from referral sources respond most to social proof; organic users respond more to feature comparison

6 Lessons You Can Steal for Your Own Onboarding

  • Delay sign-up until after the user has invested effort. Let them experience your product's core value loop before asking for an email. The endowed progress effect makes the sign-up wall convert better after investment than before it; Duolingo's version of this change lifted DAUs by 20%.

  • Use soft walls to prime hard walls. An optional sign-up prompt reframes the required one. Test the sequence, not just the individual screens.

  • Kill features that boost revenue but hurt engagement. If a change increases paid conversion but decreases daily usage, it's borrowing from your future. Engagement compounds; short-term conversion doesn't.

  • Measure state transitions, not just funnel stages. Map your users into behavioral states (new, active, at-risk, dormant) and optimize the transitions that compound your north star metric most efficiently.

  • Compound small experiments instead of chasing big wins. A +2% lift per test across 50 tests/quarter produces dramatically better results than one big redesign. The math favors velocity over ambition. Tools like Relaunch.ai's autonomous CRO agents exist specifically to close the experiment velocity gap — running continuous tests without requiring a 100-person growth team.

  • Test sequences, not elements. The order of your onboarding steps matters as much as the steps themselves. A/B test the arrangement, not just the content.

Frequently Asked Questions

What makes Duolingo's onboarding so effective at converting free users?

Duolingo's onboarding converts at 8.9% (vs. 2% industry average) because it delays sign-up until after value delivery, uses a soft-wall → hard-wall commitment sequence, and treats engagement — not conversion — as the north star. Every element was discovered through controlled experiments, not designed by intuition.

What is a good free-to-paid conversion rate for apps in 2026?

The industry average for freemium apps is 2%. Top performers hit 4-5%. Duolingo's 8.9% is exceptional and largely attributable to their reverse trial model and extreme experimentation velocity. Lenny Rachitsky's benchmark data suggests that anything above 5% puts you in the top decile.

How many A/B tests does Duolingo run?

Duolingo runs 300+ experiments per quarter company-wide, with 20-80 running simultaneously at any time. The growth team specifically manages 5-8 parallel experiments. This translates to roughly 1,200 experiments per year — a pace most companies can't match without dedicated experimentation infrastructure or autonomous testing tools.

What should I test first if I want to improve my app's onboarding?

Start with sign-up wall placement. Test moving your account creation screen behind your product's first value moment. This single change produced Duolingo's largest measured lift (+20% DAU). It's high-impact, easy to implement, and the results are typically visible within one experiment cycle.

Did Duolingo ever kill a successful experiment?

Yes. Duolingo tested promoting offline lesson downloads — it increased subscription sign-ups but decreased daily active users. They discontinued it. This decision reveals their core principle: never trade long-term engagement for short-term revenue, even when the revenue numbers look good.