The checkout and pricing flows are business-critical, but also complex. The goal was to find design changes that could increase conversions without hurting user trust.
At CyberGhost, pricing and checkout experiments were a joint effort between the CRO team and design. Together we studied user behaviour in these critical flows and tested design changes against conversion and revenue metrics.
+19.6%
Add-on conversion
uplift (Exp. 1)
+6%
Average order value
(Exp. 1)
Every experiment followed the same structured loop — data-informed hypothesis, design, build, measure, and learn.
Hypothesis — The CRO team defined hypotheses based on analytics, heatmaps, and survey feedback. For example: "Users lose the checkout button when scrolling" or "Expired users ignore generic pricing pages."
Designing the variations — I created the layouts, pricing blocks, and checkout flows to test, each grounded in the CRO research inputs. My role was to translate hypotheses into UI that felt natural and usable, while staying consistent with the brand and feasible for developers.
Implementation — Developers built the variations, and traffic was split between control and variation through Google Optimize.
Measurement — We tracked Conversion Rate (CR), add-on sales, and Average Order Value (AOV).
Analysis — We combined quantitative data (uplift / loss, significance, segmentation) with qualitative insights (heatmaps, survey quotes) to understand both what happened and why.
This process helped us quickly validate ideas, learn from user behaviour, and decide which design changes were worth rolling out globally.
Tools
Google Optimize
Hotjar / Mouseflow
Mixpanel
Control
Checkout button disappears when scrolling
Variation ↑ Winner
Sticky checkout button always visible
What the test was about: We wanted to see if keeping the checkout button always visible would reduce drop-offs and make it easier for users to complete their purchase.
Why we did it: Analytics showed many users scrolled down and lost sight of the button, especially on mobile. When that happened, many dropped off without buying.
The sticky button didn't change how many users bought the main VPN plan — but it did make more people add extras like Dedicated IP or Antivirus.
Out of ~60K visitors in each group, 334 users in the variation bought an add-on vs. 305 in the control — that's 0.55% vs. 0.51%, a +19.6% relative increase.
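The uplift and its significance can be sanity-checked with a standard two-proportion z-test. A minimal Python sketch, assuming roughly 60,000 visitors per arm (the exact per-arm counts aren't stated, so these are illustrative round numbers, and the resulting z and p values are only approximations of whatever the testing tool reported):

```python
from math import erf, sqrt

def relative_uplift(p_control: float, p_variant: float) -> float:
    """Relative change of the variant rate over the control rate."""
    return p_variant / p_control - 1

def two_proportion_ztest(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)               # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_b - p_a) / se
    # Two-sided p-value via the normal approximation
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Illustrative call with the reported add-on counts and ~60K visitors per arm
z, p = two_proportion_ztest(305, 60_000, 334, 60_000)
uplift = relative_uplift(305 / 60_000, 334 / 60_000)
print(f"z = {z:.2f}, p = {p:.3f}, uplift = {uplift:+.1%}")
```

Tools like Google Optimize apply their own statistical models, so the significance figure they report will differ from this normal-approximation sketch.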
Add-on conversion rate — variation vs. control
The purple line (sticky CTA) stayed above the orange (control) most days — the uplift was steady, not a one-day spike
What we learned
Keeping the CTA visible gave users more confidence.
That confidence appears to have translated into more add-on purchases.
Small UI changes can impact revenue without redesigning the whole flow.
💡 Impact — +19.6% more add-ons and +6% higher AOV: a clear business win from a single UI tweak.
Control
Standard pricing page with small "All plans include" section
Variation — No significant change
Added extra benefits (banners, screenshots, video) mostly below the fold
What the test was about: We tested if showing more product benefits on the pricing page would give users reassurance, increase trust, and lead to more conversions.
Why we did it: Surveys showed that 11.6% of users dropped off because they weren't sure what they'd get. Typical comments: "I don't know what I get" and "Is it router compatible?" We wanted to reassure users by showing benefits more clearly.
What happened: Overall conversion rate was down 2.5% versus control (not statistically significant). The extra benefits didn't change overall user behaviour.
Overall conversion rate — variation vs. control
Heatmaps confirmed most users stopped scrolling before reaching the benefits — even when they saw them, clicks and interaction were minimal
What we learned
Adding content at the bottom didn't increase sales.
Users decide fast, often before scrolling — key info must be at the top.
Placement matters more than quantity.
💡 Impact — We learned that benefits must be placed at the top of the page, next to the plans, where users actually make decisions.
Control — Winner
Hero plan (2 Years) centred and visually highlighted
Variation ↓ –17.5% CR
Reordered plans — hero plan removed from the spotlight
What the test was about: We wanted to see if changing the order of the pricing plans would help users compare better and buy more.
Why we did it: The idea was that a different order might highlight value better and improve conversions by making it easier for users to compare plans.
What happened: Changing the order confused users. Fewer people picked the 2-year plan (the best value for the business), and more switched to shorter plans. This led to a 17.5% drop in conversions and reduced revenue.
Conversion rate — variation vs. control
Purple = all users · Orange = control (7.01% converted) · Turquoise = variation (8.8% converted). Despite the higher headline CR, deeper analysis showed users chose shorter, lower-value plans.
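This is why a higher headline conversion rate can still lose money: revenue per visitor is the conversion rate times the mix-weighted order value. A sketch of that check, with hypothetical plan prices and buyer mixes (not the real CyberGhost figures), chosen only to illustrate how a plan-mix shift toward shorter subscriptions can outweigh a CR gain:

```python
def revenue_per_visitor(cr: float, plan_mix: dict, prices: dict) -> float:
    """Overall conversion rate times the mix-weighted average order value."""
    aov = sum(share * prices[plan] for plan, share in plan_mix.items())
    return cr * aov

# Hypothetical prices and buyer mixes, purely for illustration
prices = {"2y": 56.94, "1y": 47.88, "1m": 12.99}
control = revenue_per_visitor(0.0701, {"2y": 0.60, "1y": 0.25, "1m": 0.15}, prices)
variant = revenue_per_visitor(0.0880, {"2y": 0.25, "1y": 0.30, "1m": 0.45}, prices)
# With these mixes the variant converts more visitors but earns less per visitor
print(f"control: {control:.2f} / visitor, variant: {variant:.2f} / visitor")
```

The design takeaway is the same either way: segment conversions by plan before declaring a winner, because the metric that matters is revenue per visitor, not headline CR.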
What we learned
Visual hierarchy matters: the Hero plan in the centre naturally drives attention and choices.
Moving it broke that flow — users picked less profitable plans.
Even small layout changes can cause big business losses.
💡 Impact — This test showed that even small layout changes can hurt business results. Moving the hero plan out of the spotlight led to fewer long-term subscriptions and a –17.5% drop in conversions.
Running experiments at scale on a live product is humbling. Even small UI changes can move revenue significantly — in both directions.
01
Visibility drives confidence
Keeping the CTA in sight at all times removed decision friction and made it easier for users to commit — even to add-ons they hadn't planned for.
02
Placement beats volume
Adding more content below the fold didn't help. Users make decisions fast, near the top of the page. Information that isn't seen doesn't convert.
03
Layout encodes intent
The central position of the Hero plan wasn't arbitrary — it guided users toward the most valuable choice. Disrupting that hierarchy confused users and hurt revenue.
04
Negative results are wins too
Two of three experiments didn't produce a conversion uplift. But they produced clear strategic insights that shaped future design decisions across the product.