Background
Suncorp Bank's home loan website is a primary channel for customer acquisition.
Post-COVID, customer behaviour shifted. Rate volatility drove more refinancing, more comparison shopping, and more digital-first research before any human contact. The site needed to build confidence at every touchpoint and convert visitors into loan enquiries.
The work is structured around three focus areas: diagnose the problem, enable the team to move faster, then convert. One goal: improve conversion from visit to enquiry submission. Each focus area addressed a different barrier to getting there.
In practice, all three ran in parallel. Research was ongoing, the design system evolved alongside delivery, and conversion work started as soon as the first components were ready.
The Problem
Growth through conversion was the goal, but three things were blocking it.
- Customers weren't converting - Visitors arrived from search, comparison sites, and calculators at different stages. The site lacked the clear comparison, plain-language content, and confidence signals needed to move them from research to enquiry.
- The team couldn't move fast enough - Existing components weren't scalable or connected to AEM (Adobe Experience Manager). Most page changes still required design and engineering. Content managers couldn't update independently, and compliance review added weeks to each cycle.
- Results couldn't scale - Components and patterns weren't consistent enough across pages to scale. A/B results were unreliable because the baseline varied page to page, and winning patterns had no efficient way to roll out.
Diagnose
Strategy & Audit
The strategy started with existing data. Working with data analysts, I reviewed conversion funnels, previous research, and site analytics to understand where customers were dropping off and why.
A UX and content audit across 30+ pages surfaced the gaps. From there, I identified where the biggest opportunities for improvement were.
The same friction points existed across other products. Conversations with 5 UX designers working across other banking products confirmed a broader pattern: slow delivery cycles, conversion gaps with limited ways to test solutions, and UI inconsistency between designers' deliveries. Home loans became the starting point, but the patterns built here were designed to scale across products.
Priorities were set before any design or research began. I built a roadmap ranked by business and customer impact, then aligned product owners and key stakeholders on the direction. Priorities were agreed, not assumed.

Research
Research was structured around business goals and customer objectives. I set the direction, briefing CX researchers and the external research agency on priorities aligned with product owners. The CX team delivered quantitative scale through analytics and surveys. The external agency brought qualitative depth through focus groups and interviews.

This built a continuous mixed-methods approach that informed every design decision that followed. Insights were shared back to product owners, content managers, and content producers, aligning the whole team around what customers actually needed.

Different segments had fundamentally different priorities. The research surfaced a clear pattern: the site treated them all the same. For example:
- First home buyers needed confidence and education. They didn't know what to compare or what questions to ask.
- Existing homeowners wanted to understand their options for their next property. Speed and clarity mattered most.
- Refinancers were comparison shopping across banks. They needed to see rates, fees, and switching costs upfront.

Product comparison was the biggest friction point. Even within the bank's own products, customers found it hard to understand the differences. This shaped the core design direction: restructure information hierarchy around what each segment pays attention to first.
Each segment moved through the site differently. First home buyers explored broadly before narrowing. Refinancers went straight to rates and calculators. These user flows informed the page structure and navigation decisions that followed.
Enable
Before conversion work could be meaningful, the consistency and delivery problems needed solving. The priority was a design system for speed and consistency, supported by a collaboration model that kept the team aligned as it grew. Both developed alongside ongoing delivery.
Design System
The existing library was dated and disconnected from AEM. Organisational changes had left gaps in resourcing, so rebuilding wasn't straightforward. I scoped what was feasible within those constraints and started evolving the library using atomic design principles, from base elements up to full-page sections.
A key challenge was AEM's two development layers — what content managers could control in the CMS versus what required front-end engineering. Working with front-end engineers, I defined clear boundaries between the two, so components could be updated by content managers without design or engineering involvement.
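To make that boundary concrete, the split can be sketched as a component contract in which only a declared set of fields is open to CMS authors. This is an illustrative TypeScript sketch under assumed names and fields, not the actual AEM implementation.

```typescript
// Illustrative sketch of the CMS/engineering boundary for one component.
// All names and fields are hypothetical, not the actual AEM setup.
type HeroProps = {
  heading: string;                        // authorable in the CMS
  bodyCopy: string;                       // authorable in the CMS
  imageRef: string;                       // authorable in the CMS
  layoutVariant: "primary" | "campaign";  // requires a front-end release
  analyticsId: string;                    // requires a front-end release
};

// The explicit allow-list is the boundary: content managers can change
// these fields without design or engineering involvement.
const AUTHORABLE: (keyof HeroProps)[] = ["heading", "bodyCopy", "imageRef"];

// Apply a CMS update, silently ignoring any attempt to touch
// engineered fields.
function applyCmsUpdate(current: HeroProps, update: Partial<HeroProps>): HeroProps {
  const next = { ...current };
  for (const key of AUTHORABLE) {
    if (key in update) {
      (next as any)[key] = (update as any)[key];
    }
  }
  return next;
}
```

Drawing the line as data rather than convention is what lets content updates ship without a design or engineering cycle.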
30+ components and sections were delivered over time — product panels, comparison tiles, contact blocks, hero sections, sub-navigation, FAQs, and most standard page sections. New components were built with accessibility in mind, and improvements were delivered progressively alongside feature work. The result: turnaround for page updates dropped from weeks to days.

Hero section
The hero section is a good example. The site's information architecture defined five page types — primary, secondary, tertiary, campaign, and supporting. The atomic components adapted across all five, with a guideline playbook for each defining content rules, visual treatment, and usage standards for teams to follow.
Collaboration & governance
Alignment across the team was built into the process from the start. I facilitated bi-weekly UX alignment meetings across the web team, reviewing components, surfacing discrepancies, and maintaining consistency across deliveries.
- Component governance — owned the contribution process for the library. New components or changes proposed by other designers went through review, team discussion, and approval before merging to the main branch
- Drove adoption — worked with product owners and the design team to make the library the default way pages were built, not just a resource that existed
- Coaching — upskilled other designers on the system's principles, contribution workflow, and quality standards so the library could grow without bottlenecking through one person
Convert
With the design system in place, the team could move faster. Each touchpoint in the conversion journey had a different design challenge.
Experimentation
Experimentation was built as a practice, not a one-off. Partnering with the product owner, we tested design hypotheses with clear metrics to determine what shipped. The design system made it possible to ship variants in days rather than weeks.
Testing priorities came from the roadmap. Components that appeared across multiple pages and sat closest to conversion were tested first.
- Component experimentation — CTA banners, campaign banners, hero sections, and personalisation variants (location, segment, browsing context) were tested to find what converted best
- Page-level experimentation — tested component ordering, content hierarchy, and navigation patterns to find the highest-converting layouts across page types
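The mechanics behind this kind of testing can be sketched as deterministic variant assignment: hash the visitor and experiment together so the same visitor always sees the same variant. This is a generic illustration; the hash choice and all names are assumptions, not the team's actual experimentation tooling.

```typescript
// Generic sketch of deterministic A/B bucketing. The FNV-1a hash and
// all names are illustrative; the actual tooling isn't described here.
function fnv1a(input: string): number {
  let hash = 0x811c9dc5;
  for (let i = 0; i < input.length; i++) {
    hash ^= input.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193) >>> 0; // keep it unsigned 32-bit
  }
  return hash;
}

// Hashing experiment + visitor together keeps exposure stable across
// sessions and pages, which is what makes results comparable.
function assignVariant(experiment: string, visitorId: string, variants: string[]): string {
  return variants[fnv1a(`${experiment}:${visitorId}`) % variants.length];
}
```

Because assignment is a pure function of experiment and visitor, a returning visitor never flips between variants mid-test.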

Calculators
Calculators were the most critical confidence-building touchpoints: borrowing power, repayment, refinance, multi-product comparison, and stamp duty. They sit at the self-qualification stage, where customers ask 'Can I afford this?' For the bank, a customer who completes a calculator and sees a positive result is far more likely to request a callback or start an application.
Customers weren't coming to calculate. They were coming for reassurance. Research showed that most customers already had a rough idea of what they could afford. What they needed was confirmation, not a blank form. That shifted the design from data collection to confidence building.
Each calculator served a different scenario with different logic. The challenge was that each had different inputs and financial requirements. A first home buyer with a 10% deposit has a fundamentally different flow from a refinancer comparing their current rate. I worked through the logic for each, deciding which inputs to show upfront, which to reveal progressively, and how results should adapt to the customer's situation.
Compliance added a design tension. ASIC required all results to be marked as indicative, which risks undermining the confidence the calculator is meant to build. I designed disclaimer patterns that met the legal requirement without burying the key number. I partnered with a third-party engineering agency, translating the UX decisions, financial logic, and compliance constraints into a clear specification.
- Progressive disclosure — stepped inputs so customers weren't confronted with complexity upfront
- Results-first hierarchy — the answer is immediately visible; breakdowns available on demand
- Contextual CTAs — each result page surfaces the most relevant next step based on what was just calculated
The result: calculator completion rates and lead submissions improved significantly, with strong positive customer feedback.

Multi-product comparison calculator. Redesigned as a reusable pattern embedded across pages within the customer's search journey. Customers input their financial situation (deposit, property value, loan type) and the calculator filters to relevant products. Expanding each result reveals features and details, giving customers the confidence to apply, request a call, or continue researching.
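The filtering step described above can be sketched as a simple eligibility function over the customer's inputs. Product data, field names, and the loan-to-value rule here are illustrative assumptions, not Suncorp's actual product logic.

```typescript
// Hypothetical sketch of the comparison calculator's filtering step.
// Field names and the LVR rule are illustrative, not real product logic.
type Product = {
  name: string;
  maxLvr: number;                           // max loan-to-value ratio, e.g. 0.9
  loanPurposes: ("owner" | "investment")[];
};

type CustomerInputs = {
  deposit: number;
  propertyValue: number;
  loanPurpose: "owner" | "investment";
};

// Narrow the product set to what the customer could plausibly get,
// before showing the side-by-side comparison.
function eligibleProducts(products: Product[], inputs: CustomerInputs): Product[] {
  const lvr = (inputs.propertyValue - inputs.deposit) / inputs.propertyValue;
  return products.filter(
    (p) => lvr <= p.maxLvr && p.loanPurposes.includes(inputs.loanPurpose)
  );
}
```

Filtering first keeps the comparison short and relevant, which is the point of the pattern: fewer, more applicable products to expand and compare.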

Repayment and refinance calculators. The repayment calculator turns a loan amount into a concrete monthly figure, reducing uncertainty. The refinance calculator quantifies potential savings, creating an immediate incentive for existing borrowers to switch. Both designed to convert research into action.
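Under the hood, both calculators reduce to the standard amortised-repayment formula. The sketch below illustrates that maths only; the real calculators also handled fees, product specifics, and ASIC's indicative-result requirements.

```typescript
// Standard amortised monthly repayment: P * r(1+r)^n / ((1+r)^n - 1).
// Minimal illustration only; fees and product specifics are omitted.
function estimateMonthlyRepayment(principal: number, annualRate: number, years: number): number {
  const r = annualRate / 12;          // monthly interest rate
  const n = years * 12;               // number of monthly repayments
  if (r === 0) return principal / n;  // interest-free edge case
  const factor = Math.pow(1 + r, n);
  return (principal * r * factor) / (factor - 1);
}

// The refinance calculator's headline number follows directly: the gap
// between repayments at the current rate and at the new rate.
function estimateMonthlySaving(balance: number, currentRate: number, newRate: number, yearsRemaining: number): number {
  return (
    estimateMonthlyRepayment(balance, currentRate, yearsRemaining) -
    estimateMonthlyRepayment(balance, newRate, yearsRemaining)
  );
}
```

For example, a $500,000 loan at 6% p.a. over 30 years works out to roughly $2,998 a month.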
Lead touchpoints
Three channels needed to work together as one experience at the final stage of the funnel, where research and evaluation turned into action: an online enquiry form, a request-a-callback flow, and the entry point to online origination.
Experimentation validated what moved the needle. I redesigned the layout and interaction across all three to reduce cognitive load, then ran experimentation to find what increased submission rates. Responsible lending obligations required specific financial data before a lead could progress. I worked with the product owner to satisfy the legal requirement without creating unnecessary drop-off.
- Reduced upfront complexity — reframed the ask to feel lower-stakes and restructured field order based on what customers expected to provide first
- Tested across CTA copy, field labels, and trust signals — experimentation determined which combination increased submission rates
- Three channels unified — form, callback, and origination entry point redesigned as one cohesive experience rather than three separate flows

The redesigned form leads with the key concerns surfaced in research: whether the service is free, what to expect from the call, and how to choose a callback time that fits the customer's schedule. Personalisation was applied here too, with imagery adapted based on location and loan type to increase relevance.
Page experience
Category pages, product pages, and navigation were redesigned to reflect the user flows surfaced in research: customers don't follow a linear path. They arrive at different stages, compare across products, and need clear signals to keep moving forward.
Design system, experimentation, and research converged here. The design system provided the components. Experimentation determined which variants converted best. Research shaped the information hierarchy for each segment.
- Information hierarchy restructured by segment — first home buyers, refinancers, and existing homeowners each saw content prioritised around what mattered most to them
- Navigation and product comparison simplified — addressing the biggest friction point from research: customers couldn't easily compare products, even within the bank's own range
- Consistent visual language across 30+ pages — every page built from the same design system, tested through experimentation, and informed by research

Home loan category page redesigned with a consistent visual language from the design system, components shaped by experimentation outcomes, and content restructured around customer needs surfaced in research, including an integrated calculator to help customers find the right product for their financial situation.
Reflection
What worked well
- Systems built in parallel with delivery — Building the design system and experimentation practice alongside delivery meant the team never stopped shipping while the systems matured. By the end, both were embedded into how the team worked, not dependent on any one person.
- Evidence-based direction from day one — Starting with data and a prioritised roadmap before any design work meant every initiative had a clear rationale. Stakeholder alignment was faster because the direction was evidence-based, not opinion-based.
What I'd do differently
- Earlier concept testing — Some design directions went too far into production before being validated with customers.
- Document the experimentation playbook — So the process could scale beyond the people who built it.
- Optimise the design system workflow for offshore teams — Working with offshore front-end teams and limited budgets slowed the discovery and implementation cycle. Next time, I'd establish a more efficient workflow earlier, streamline contribution governance, and invest more in coaching to improve team velocity.

