CHALLENGE
Support isn’t a nice-to-have. It’s the difference between a working public service and a broken one.
Roughly one in five people need help to access digital government services. If that help doesn’t exist — or worse, if it’s badly designed — you don’t just get a frustrated user. You get exclusion. You get policy failure. You build something that technically “launched” but functionally doesn’t work.
We were tasked with designing a support model for a brand-new compliance service — one with national implications. No legacy system, no subject matter experts, and limited access to users. Just a high-stakes alpha assessment on the horizon, and a tight deadline.
The challenge? Conceptualise and de-risk an end-to-end support experience from scratch — one that could handle the needs of everyone from smallholder farmers to compliance officers at global FMCGs. Digital confidence varied wildly across this user base, and we had to anticipate what might trip people up.
At the same time, we had to design within the operational constraints of a support team that hadn’t even been onboarded yet. That meant every solution had to be pragmatic, adaptable, and ready to plug into real delivery. We rapidly prototyped support journeys and stress-tested them with users and government leads — enough to give confidence at alpha without sacrificing fidelity or ambition.
APPROACH
A departure from headline metrics that don’t say enough: product and service owners need meaningful insights
1. Insight-Driven Scenarios
The real test of a support system isn’t what happens when everything works — it’s what happens when it doesn’t. Error messages. Confusing guidance. Escalations. Digital dead ends. So we went deep into existing user research to build a full-spectrum view of the support needs we’d have to design for.
I created a scenario bank that mapped out every possible friction point along the journey — from initial misunderstandings and access barriers to backend failures and urgent compliance queries. Each scenario was grounded in real user insight pulled from discovery, so we weren’t designing around guesswork. This became a keystone artefact — aligning delivery teams, policy leads, and service owners on the true scope of what support needed to do.

Fig.1 – Sample visual of scenario cards we developed

Fig.2 – Sample visual of scenario cards across the end-to-end journey
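For illustration, a scenario bank like the one described above can be thought of as a filterable collection of structured cards. The fields, example values, and severity scale below are hypothetical, not taken from the actual project artefact:

```python
from dataclasses import dataclass, field
from enum import Enum


class Severity(Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"  # e.g. risk of missing a compliance deadline


@dataclass
class ScenarioCard:
    """One friction point in the end-to-end journey, grounded in research evidence."""
    title: str
    journey_stage: str                 # e.g. "registration", "submission"
    user_type: str                     # e.g. "smallholder farmer", "compliance officer"
    severity: Severity
    evidence: list[str] = field(default_factory=list)   # discovery findings it rests on
    support_channels: list[str] = field(default_factory=list)


# A scenario bank is then just a list you can slice by stage, user type, or risk:
bank = [
    ScenarioCard("Cannot verify identity", "registration", "smallholder farmer",
                 Severity.HIGH, ["Discovery round 2, P7"], ["phone", "assisted digital"]),
    ScenarioCard("Unclear error on upload", "submission", "compliance officer",
                 Severity.MEDIUM, ["Usability test 3"], ["IVA", "email"]),
]

high_risk = [card for card in bank if card.severity is Severity.HIGH]
```

Structuring the cards this way makes the "full-spectrum view" queryable: delivery teams can pull every high-severity scenario for a given journey stage rather than re-reading raw research notes.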
2. Reducing the ‘Time Tax’
Support shouldn’t waste your time. Especially when you’re on a compliance deadline. Our users — farmers, traders, supply chain managers — don’t have hours to spend wrestling with a broken screen or unclear guidance. Worse, delays could lead to regulatory penalties.
Using our scenario bank, I led co-creation workshops with policy leads, service teams, and future support staff to build out practical support journeys. We mapped escalation paths and backstage actions, and identified which parts of the process could be automated, such as using intelligent virtual assistants (IVAs) to triage queries fast. We also designed for digital exclusion, making sure even the least digitally confident user could still get what they needed without delay.

Fig.3 – Sample visual of support user journeys for different scenarios
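The triage-and-escalation logic described above can be sketched as a simple routing function. The channel names, keyword list, and thresholds here are illustrative assumptions, not the project's actual playbook:

```python
# Hypothetical keywords an IVA could resolve without human involvement
ROUTINE_KEYWORDS = {"password", "login", "upload", "format"}


def triage(query: str, days_to_deadline: int, digitally_confident: bool) -> str:
    """Route a support query to the lowest-cost channel that can actually resolve it.

    Rules are illustrative only; in practice they would come from the
    support team's escalation playbook.
    """
    if not digitally_confident:
        # Assisted digital route: never leave excluded users at a dead end
        return "phone"
    if days_to_deadline <= 2:
        # Compliance deadline imminent: escalate straight to a human agent
        return "agent"
    if any(word in query.lower() for word in ROUTINE_KEYWORDS):
        # Routine query: let the IVA handle triage automatically
        return "iva"
    return "email"
```

The ordering encodes the priorities in the text: digital exclusion is checked first so no one is pushed into an automated channel they can't use, and deadline pressure outranks cost savings from automation.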
3. De-Risking the Concept (Just Enough)
There were still unknowns: how the support team would be staffed, what tooling they’d need, and how assisted digital support would be delivered. But waiting for certainty wasn’t an option.
So we tested what we could. We zoomed into high-risk assumptions and prototyped just enough to give confidence. We tested the placement of support entry points, the clarity of email communications, and escalation flows — not just with users, but with the internal teams likely to own support. What mattered was demonstrating that support wasn’t a bolt-on — it was built in. Users knew help was available, knew how to access it, and trusted it would work.
And critically, government leads could see a path to delivery. What we designed looked a lot like what they’d actually be able to run.
IMPACT/OUTCOMES
An inclusive, resilient support experience
Passed Alpha, Entered Private Beta:
The service passed its GDS Alpha assessment with confidence in the support model — a critical enabler for compliance at scale.
Accelerated Future Delivery:
By validating key elements early, we saved 3–4 weeks of research and testing in Beta — giving the team a head start on implementation.
Reduced Risk of Policy Failure:
The support offer gave users a clear path to success and surfaced operational blockers early — reducing exclusion risk and strengthening the policy’s real-world impact.