The Problem
Growth was the goal — but the team couldn't move fast enough to pursue it, and users weren't converting at the rate the business needed.
Business Growth
- Home loan research is non-linear — users arrive from search, product pages, and calculators at different points in their decision
- The site wasn't building enough confidence at each touchpoint to carry users forward to an enquiry
Operation & Delivery
- Without a shared component system, every change required design or engineering involvement — making experimentation slow and time to market unpredictable
Design Consistency
- Pages were built independently with no shared visual language — running experiments confidently was difficult when the baseline kept varying
- Without a consistent foundation, A/B testing results were harder to interpret — and every iteration required more rework than it should
All of this sat within a regulated environment. Legal and compliance requirements — around what could be displayed, collected, and communicated — shaped design decisions throughout.
How I Approached It
This wasn't a linear project. Multiple initiatives ran in parallel — page redesigns, component builds, research, and experimentation happening concurrently across the home loan experience. I structured the work into three milestones, sequenced deliberately: understand the problem first, build the infrastructure to solve it, then use that infrastructure to drive conversion.
Milestone 1 — Understand & Plan
Strategy & Audit
Before any design work began, I ran a UX and content audit across the full home loan experience to understand what existed, where the gaps were, and what to prioritise. From there I built a roadmap linked to business outcomes and aligned it with the product owner before work started — so priorities were agreed, not assumed.
- UX and content audit — reviewed all home loan pages for consistency gaps, flow breakdowns, and content issues
- Prioritised roadmap — ranked initiatives by business impact with clear rationale for sequencing
- Stakeholder alignment — presented the plan to the product owner and key stakeholders before any design began
Research
I worked with the internal CX team and an external research agency to run a continuous mixed-methods practice. The internal team provided quantitative scale — analytics on what users did, surveys on what they said. The external agency provided qualitative depth — focus groups and interviews to understand why.

Milestone 2 — Build the Foundation
Two types of infrastructure needed to be in place before the conversion work could be meaningful: a design system for consistency and speed, and an experimentation capability for evidence-based decisions. Both ran in parallel with other workstreams.
Design System
The design system was built using atomic design principles — from base elements up to full page sections. The goal was to create a shared visual language that could scale across multiple concurrent projects without losing consistency. I worked with front-end engineers to implement components in AEM, carefully defining what content managers could control versus what required a code change.
- Components built: product panels, comparison tiles, contact blocks, hero sections, sub-navigation, FAQs, and most standard page sections
- Design → test → validate loop — key components were tested before handoff, ensuring the best-performing version went to engineering
- Accessibility uplift — all components built to WCAG 2.2, exceeding the AA conformance baseline
- Two development layers — AEM content manager (CMS) and front-end platform — each with different constraints on what could be changed
- Influenced adoption — worked with the product owner and design team to make the library the default way pages were built, not just a resource that existed
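The CMS-versus-code split above can be sketched as a simple component contract. This is an illustrative sketch only — the schema shape, field names, and `authorableFields` helper are hypothetical, not the actual AEM implementation:

```typescript
// Hypothetical sketch: a component contract that makes explicit which fields
// content managers can edit in the CMS and which are locked to code releases.
// Names (productPanelSchema, authorableFields) are illustrative.

type FieldControl = "cms" | "code";

interface FieldSpec {
  control: FieldControl;
  description: string;
}

// Example contract for a product panel component.
const productPanelSchema: Record<string, FieldSpec> = {
  heading:       { control: "cms",  description: "Panel title, editable in AEM" },
  rateText:      { control: "cms",  description: "Rate copy, compliance-reviewed" },
  ctaLabel:      { control: "cms",  description: "Button label" },
  layoutVariant: { control: "code", description: "Grid layout, changed via release" },
  trackingId:    { control: "code", description: "Analytics hook" },
};

// Which fields a content manager can change without an engineering ticket.
function authorableFields(schema: Record<string, FieldSpec>): string[] {
  return Object.entries(schema)
    .filter(([, spec]) => spec.control === "cms")
    .map(([name]) => name);
}
```

Making the boundary explicit per component is what let routine copy changes bypass the design and engineering queue entirely.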
Experimentation Practice
Alongside the design system, I introduced a structured approach to experimentation — moving the team from opinion-based decisions to evidence-based iteration. Every significant design decision became a testable hypothesis with a clear metric. This capability ran through all subsequent work.
- Established testing process — hypothesis → variant → measure → iterate
- Tests run: [X ← fill in]
- Covered components, page layouts, calculator flows, and lead form — anywhere a design decision had a measurable conversion impact
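The hypothesis → variant → measure → iterate loop can be illustrated in miniature. This is a generic sketch, not the team's actual testing platform; the types and hashing scheme are assumptions for illustration:

```typescript
// Generic sketch of the experimentation loop: a testable hypothesis, a
// deterministic variant assignment, and a single conversion metric.

interface Experiment {
  hypothesis: string;  // e.g. "Results-first hierarchy lifts completion"
  variants: string[];  // "control" plus one or more treatments
  metric: string;      // the one conversion metric being measured
}

// Deterministic bucketing: the same visitor always sees the same variant.
function assignVariant(visitorId: string, exp: Experiment): string {
  let hash = 0;
  for (const ch of visitorId) hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  return exp.variants[hash % exp.variants.length];
}

// Measure: conversion rate per variant from raw events.
function conversionRate(
  events: { variant: string; converted: boolean }[],
  variant: string,
): number {
  const seen = events.filter((e) => e.variant === variant);
  if (seen.length === 0) return 0;
  return seen.filter((e) => e.converted).length / seen.length;
}
```

The point of the structure is the discipline, not the tooling: every test names its hypothesis and its metric up front, so "iterate" means acting on a number rather than an opinion.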
Milestone 3 — Drive Conversion
With the foundation in place, the focus shifted to the full conversion journey — from a user landing on the site to submitting an enquiry. Research and market insights informed the hypotheses; experimentation determined what shipped. Each touchpoint in the journey had a different design challenge.
Page Experience
Category pages, product pages, and navigation were redesigned based on research findings and market insights. Rather than committing to a single direction, I tested multiple design approaches — layouts, content hierarchies, component arrangements — and used the highest-converting versions as the standard going forward.
- Research-informed hypotheses — market insights and user research shaped which variants were worth testing
- Multiple variants tested per page type — layout, content hierarchy, and visual treatment all explored
- Best-performing versions shipped — experimentation, not opinion, determined what went live
Calculators
The calculators — borrowing power, repayment, and stamp duty — were the most important confidence-building touchpoint. Research surfaced a key reframe: users weren't coming to calculate. They were coming to answer one question: 'Can I afford this?' That insight changed the entire design approach.
- Before: [fill in — what made the calculators hard to use ← overwhelming inputs, jargon, no clear result]
- Progressive disclosure — stepped inputs so users weren't confronted with complexity upfront
- Results-first hierarchy — the answer is immediately visible; breakdowns available on demand
- Real-time feedback — results update as users adjust inputs, making it exploratory rather than transactional
- Contextual CTAs — each result page surfaces the most relevant next step based on what was just calculated
- Compliance: ASIC required results to be marked as indicative — designed a compact disclaimer pattern that kept the key number prominent while meeting the legal visibility requirement
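The results-first, real-time behaviour rests on the standard amortisation formula. The sketch below shows the core calculation only — the production calculator's inputs, rounding, and disclaimer handling are not reproduced, and all figures are indicative:

```typescript
// Standard amortisation formula: monthly repayment on a principal P at
// monthly rate r over n repayments is P·r / (1 − (1 + r)^−n).

function monthlyRepayment(
  principal: number,
  annualRatePct: number,
  years: number,
): number {
  const r = annualRatePct / 100 / 12; // monthly interest rate
  const n = years * 12;               // number of repayments
  if (r === 0) return principal / n;  // zero-rate edge case
  return (principal * r) / (1 - Math.pow(1 + r, -n));
}

// Real-time feedback: recompute on every input change, answer first.
const indicative = monthlyRepayment(500_000, 6.0, 30);
console.log(`Indicative repayment: $${indicative.toFixed(2)}/month`); // ≈ $2997.75/month
```

Because the formula is cheap to evaluate, the result can update on every slider move, which is what turns the calculator from a form-and-submit tool into an exploratory one.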
Lead Submission Form
The lead form was the final step — and the point where friction had the most direct impact on conversion. I redesigned the layout and field order to reduce cognitive load, then ran experimentation to validate what actually moved the needle. Responsible lending obligations required specific financial data collection before a lead could progress — I worked with the product owner to find an approach that satisfied the legal requirement without creating unnecessary drop-off.
- Redesigned layout and field order — reduced complexity upfront, reframed the ask to feel lower-stakes
- Ran [X] experiments ← fill in — testing CTA copy, field labels, step count, and trust signal placement
- Most significant result: [what you tested and what happened ← fill in]
- Compliance: worked with the product owner to map the UX impact of different data collection approaches, informing the conversation with legal about what was feasible
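The reduced-friction structure can be sketched as a stepped field model: a low-stakes opener first, with the responsible-lending financial questions deferred until the user is invested. The step grouping and field names below are hypothetical, not the production field set:

```typescript
// Hypothetical step model: fields grouped into short steps so the perceived
// ask stays small, with required financial data collected last.

const enquirySteps: { title: string; fields: string[] }[] = [
  { title: "Your goal",     fields: ["purpose", "propertyValue"] },     // low-stakes opener
  { title: "About you",     fields: ["name", "email", "phone"] },
  { title: "Your finances", fields: ["income", "expenses", "debts"] },  // responsible-lending data
];

// Per-step progress keeps momentum visible without exposing total length upfront.
function progress(stepIndex: number): number {
  return Math.round(((stepIndex + 1) / enquirySteps.length) * 100);
}
```

Step order itself was a testable variable: moving the financial questions earlier or later changes where drop-off occurs, which is exactly the kind of question the experimentation practice was built to answer.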
Outcome
The primary metrics we tracked were calculator completion rate, lead form submission rate, and content team velocity. Specific performance data is confidential — available on request.
- Calculator completion rate improved — users were getting to a result and taking a next step more often than before
- Lead form submission rate increased following redesign and experimentation — specific uplift available on request
- Experimentation shifted the team from assumption-based decisions to evidence-based iteration — [X] tests run across calculator and form flows
- Content managers could update page content directly in AEM without design or engineering involvement — reducing turnaround time for routine updates
- Component library established a shared visual language across home loan pages — used by [X] designers across [X] pages
- Roadmap and audit gave the team a clear view of what to work on next — reducing ad hoc requests and improving planning visibility
Reflection
[Fill in when ready — what would you do differently? Think: earlier research, more aggressive experimentation, better cross-team documentation, or a specific decision you'd revisit.]

