Analytics SaaS CRO · Pricing Design
+23% trial-to-paid conversion

Pulse — Turning Browsers into Buyers

Pulse Analytics had 40,000 monthly visits to their pricing page — but only 6.4% of trial users were converting to paid. The page had high-intent traffic and a conversion problem. I led a structured CRO programme that identified the friction points and redesigned the page to convert.

Role
Senior UX Designer (CRO Lead)
Timeline
10 weeks
Team
Growth PM, 2 Engineers, Copywriter
Tools
Figma, Hotjar, Optimizely, Heap

A page with high traffic and a conversion ceiling

Pulse is a product analytics platform for SaaS companies. Their growth team had built a strong SEO and paid acquisition engine — 40,000 monthly visits to the pricing page, with high-intent traffic from comparison searches and review sites. Yet trial-to-paid conversion sat at 6.4%, well below the 10–12% benchmark for this category.

The commercial stakes were direct: a 3% lift in conversion would represent approximately $780K in additional ARR at their current pricing and traffic levels. The pricing page was, by a wide margin, the highest-leverage UX problem in the business.

Monthly visits: 40,000 · Trials started: 2,560 · Converted to paid: 163
6.4% trial-to-paid conversion — 4 pts below the category benchmark
The conversion gap — strong traffic, a healthy trial rate, but significant drop-off at the payment decision
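The funnel above reduces to two simple rates; a quick sketch to reproduce them, using the numbers from the stat band:

```python
# Reproduce the funnel rates from the stat band above.
visits = 40_000   # monthly pricing-page visits
trials = 2_560    # free trials started
paid = 163        # trials converted to paid

trial_start_rate = trials / visits   # share of visits that become trials
trial_to_paid = paid / trials        # share of trials that become paying customers

print(f"Trial start rate:   {trial_start_rate:.1%}")   # 6.4%
print(f"Trial-to-paid rate: {trial_to_paid:.1%}")      # 6.4%
```

Both rates land at 6.4% — a healthy top of funnel, with the loss concentrated at the payment decision.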

Friction at the moment of commitment

An initial heuristic review and heatmap analysis surfaced a cluster of compounding issues on the existing pricing page.

Hierarchy confusion: The most popular plan (Growth) was not visually distinguished from the others. No plan was recommended. Users reported feeling "unsure which to pick" in session replay comments.
Feature table anxiety: A 28-row feature comparison table below the pricing cards introduced too much information at the decision moment, creating analysis paralysis rather than confidence.
No social proof at the CTA: The "Start Free Trial" buttons appeared with no supporting trust signals — no testimonials, no company logos, no review scores — in the most critical scroll zone.
Pricing opacity on annual plans: The monthly/annual toggle existed, but the annual savings were buried. 78% of users never toggled it, despite annual plans being 40% more profitable for the business.
Unclear free trial terms: "14-day free trial" appeared once, in small text. Exit survey data showed 31% of non-converters thought they'd be charged immediately.
"I genuinely didn't know which plan I needed. I spent 15 minutes on the page and left to 'think about it.'"
— Exit survey respondent (non-converter, SMB)

Finding the real conversion killers

Before running any experiments, I spent three weeks building a rigorous evidence base. CRO without research is just guessing with nicer tools.

Behavioural Data
Hotjar heatmaps + scroll maps on 10,000 sessions
Session recordings: 80 non-converters, 40 converters
Heap funnel: pricing page → trial start → upgrade
Exit survey: 200 responses from non-converters
Qualitative
8 user interviews (4 converters, 4 non-converters)
Competitive teardown: 14 SaaS pricing pages
Sales team interviews on common objections

The most telling data point: session replays showed converters spent an average of 2.1 minutes on the page before clicking "Start Trial." Non-converters spent 4.8 minutes — more time, more confusion, no conversion. More information was working against us.

A hypothesis-led approach

I structured the work as a prioritised hypothesis backlog — ranking each change by confidence (evidence strength), impact (conversion potential), and ease (implementation cost). This prevented us from running nine simultaneous changes and not knowing which one worked.

H1 · Recommended plan treatment: Adding a "Most Popular" badge and visual elevation to the Growth plan will reduce decision paralysis and increase Growth plan selection.
H2 · Trust signals at CTA: Adding social proof (logos, review score, quote) adjacent to the primary CTA will increase trial start rate by reducing perceived risk.
H3 · Annual savings surfacing: Showing the annual savings prominently (not behind a toggle) will increase annual plan selection and improve LTV from first conversion.
H4 · Trial terms prominence: Making "No credit card required. Cancel anytime." a first-class UI element (not fine print) will reduce abandonment driven by commitment anxiety.
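The confidence/impact/ease ranking can be sketched as a simple ICE-style score. The scores below are illustrative placeholders, not the values used on the project:

```python
# Hypothetical ICE-style prioritisation of the four hypotheses.
# All scores (1-10) are illustrative, not the project's actual ratings.
hypotheses = {
    "H1 Recommended plan treatment": {"confidence": 8, "impact": 7, "ease": 9},
    "H2 Trust signals at CTA":       {"confidence": 7, "impact": 8, "ease": 8},
    "H3 Annual savings surfacing":   {"confidence": 6, "impact": 9, "ease": 6},
    "H4 Trial terms prominence":     {"confidence": 8, "impact": 6, "ease": 9},
}

def ice_score(scores):
    # Multiplicative ICE: confidence x impact x ease.
    return scores["confidence"] * scores["impact"] * scores["ease"]

ranked = sorted(hypotheses.items(), key=lambda kv: ice_score(kv[1]), reverse=True)
for name, scores in ranked:
    print(f"{ice_score(scores):>4}  {name}")
```

The point of the score is less the arithmetic than the discipline: it forces each hypothesis to be argued on evidence strength, conversion potential, and build cost before it earns a test slot.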

What changed and why

Starter: $29/month · Get started
Growth: $79/month · Start free trial · Most Popular
Scale: $149/month · Get started
Redesigned pricing cards — recommended plan treatment with clear visual hierarchy and CTA prominence

The feature comparison table was replaced with a three-column "feature summary" format — the 8 most-cited decision factors from user research, displayed as simple checkmarks. Users who wanted the full comparison could expand it.

Trust signals were redesigned as a horizontal band immediately below the pricing cards: five company logos, a G2 review score (4.8/5 with 340 reviews), and a single customer quote. These moved from the page footer to the decision zone.

A staged test-and-ship process

Rather than redesigning the whole page and shipping it all at once, I advocated for a sequenced A/B testing programme — running one hypothesis at a time so we could attribute impact clearly.

Test 1 (weeks 3–5): Recommended plan treatment (H1). 2-week runtime, 50/50 split. Result: +11% increase in Growth plan selection, +7% overall trial start rate.
Test 2 (weeks 5–7): Trust signals at CTA (H2). 2-week runtime. Result: +9% trial start rate improvement. Statistically significant at 95% confidence.
Test 3 (weeks 7–9): Annual savings surfacing + trial terms (H3 + H4 combined). Result: 38% increase in annual plan selection, +6% on overall conversion.
Week 10: All winning variants rolled out as the permanent page. Full redesign of feature comparison section shipped as a follow-on in week 12.
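The "significant at 95% confidence" call behind each test implies a proportion comparison between variants. A minimal sketch using a two-proportion z-test — the sample sizes and conversion counts below are illustrative, since the raw test data isn't in this write-up:

```python
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Pooled two-proportion z-test for an A/B conversion comparison."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Illustrative numbers: control at 6.4% vs a variant at 7.2%,
# roughly 10,000 visitors per arm over a two-week runtime.
z = two_proportion_z(conv_a=640, n_a=10_000, conv_b=720, n_b=10_000)
print(f"z = {z:.2f}  (|z| > 1.96 means significant at 95%, two-tailed)")
```

Running one hypothesis at a time keeps this test clean: each z-score measures exactly one change, which is what makes the attribution in Tests 1–3 credible.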

Conversion gains that compounded

23% lift in trial-to-paid conversion (6.4% → 7.9%)
$180K additional ARR (annualised projection)
38% increase in annual plan selection
2.1min average time on page (down from 4.8min for non-converters)
31% reduction in exit rate from the pricing page
19% uplift in LTV for the first-month cohort

What I learned about CRO

The most important lesson from this project: the highest-impact changes were the simplest ones. The recommended plan badge and trust signal placement — both changes that took under 2 hours to design and 4 hours to build — drove roughly 70% of the total conversion lift.

CRO isn't about clever design. It's about removing doubt at the moment of decision. The feature comparison table wasn't the problem — anxiety about commitment was. Every change that addressed the "is this safe?" feeling outperformed every change that addressed "does this look better?"

The annual plan toggle finding was unexpected: users didn't toggle it because they didn't know they wanted annual until we made it feel like the obvious choice. Framing and defaults matter more than offering choice. We could have run the whole programme as a single big-bang redesign, but the staged approach gave us something more valuable than a converted page — a repeatable methodology.
