Wellcome Collection
— Mobile Exhibition Companion App
Designing an end-to-end exhibition experience that integrates spatial audio guidance into a unified mobile platform.
My Contribution
Defined the end-to-end app experience (information architecture → key flows).
Designed core UI + component system for scalable screens and handoff-ready specs.
Prototyped and validated key tasks (discover → navigate → save/revisit).
Who is Wellcome Collection?
Wellcome Collection is a museum in central London built around the belief that everyone's experience of health matters. Its exhibitions aim to be inclusive across different visitor needs.
01 Make the visit more accessible and inclusive
Specifically, reduce/remove barriers across the building, services, and programme.
02 Support diverse access needs with clear, on-demand help
Keep improving through visitor feedback.
03 Offer accessible ways to experience exhibitions
Including options like audio descriptions (and, for some content, optional directions between stops).
The Research
To ground the project, I audited Wellcome Collection's physical and digital touchpoints through field observation and competitive scanning.
The Friction
The audit surfaced a consistent pain point: information fragmented across print, web, and borrowed devices. This fragmentation creates significant navigational barriers, particularly for blind and low-vision visitors.
The Solution
These insights led to a single mobile companion that integrates exhibition content and audio guides into one flow. Layering in spatial-audio wayfinding supports a seamless, confident visitor journey.
From the research, "accessible" isn't only about compliant content; it's about:
01 Independent exploration
Let visitors preview, navigate, and revisit exhibition content in one place, so the museum experience doesn't break when the visit starts or ends.
02 Low-friction museum navigation
Support effortless movement through the museum by minimising the need to stop, ask for help, or switch tools.
03 Understanding in real time
Give visitors the right information in context (where they are, what's nearby, what to do next), without searching, borrowing devices, or switching channels.
For blind and low-vision visitors, accessibility means the experience works while moving, supports orientation when the environment changes, and lets visitors re-engage with what they explored afterward.
Embedding spatial audio inside a mobile app
Guidance breaks down when it lives outside the main experience. At Wellcome, exhibition info is mostly web-based, and using the audio guide can mean borrowing a device or hunting for the right page, exactly when visitors are moving and need clarity.
So I designed a mobile companion app that brings exhibition content and the audio guide into one flow, then layers optional spatial-audio wayfinding inside that same journey. This keeps accessibility lightweight (no new infrastructure), makes support instant and self-directed, and helps visitors navigate confidently while staying connected to what they're encountering.
How did I validate and refine the solution?
01 Wireframes + Sketches
Tested guided discovery on mobile (browse exhibitions + optional spatial-audio tour).
02 Audit + Observation + Feedback
Web-first content + separate audio guide created friction during movement → pivot to one app.
03 v1 → v2 Prototypes + Feedback
Figma prototypes with usability testing, then refined navigation, content access, and the guidance flow, locking in the app's core feature set.
Key Features
Integrated Gallery Discovery
Converts broad web content into a native, scannable interface for instant access to "What's On" lists, exhibition summaries, and current locations.
In-App Audio Guide
Visitors can start the audio guide directly inside the app. A simplified numerical interface lets them look up objects instantly by entering room or item numbers, bypassing deep menu navigation.
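To make that lookup behaviour concrete, here is a minimal TypeScript sketch of how a number could resolve straight to an audio stop. All names here (AudioStop, STOPS, lookupStop) and the sample data are hypothetical illustrations, not the shipped implementation.

```typescript
// Hypothetical shape of an audio-guide stop, keyed by the number
// printed next to each exhibit or room.
interface AudioStop {
  id: string;
  title: string;
  audioUrl: string;
}

// Illustrative data; in practice this would come from the exhibition CMS.
const STOPS: Map<number, AudioStop> = new Map([
  [101, { id: "stop-101", title: "Gallery introduction", audioUrl: "/audio/101.mp3" }],
  [102, { id: "stop-102", title: "First exhibit", audioUrl: "/audio/102.mp3" }],
]);

// Entering a number resolves directly to a stop, so visitors skip
// deep menu navigation entirely.
function lookupStop(entered: number): AudioStop | undefined {
  return STOPS.get(entered);
}
```

The design choice is the flat numeric keyspace: one input maps to one stop, which keeps the interaction workable with a screen reader and a single text field.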
Centralized Visitor Utility
Consolidates essential logistics (live opening times, step-free access routes, and direct contact links) into a single, accessible hub.
Multi-Level Wayfinding
Provides an interactive floor plan across all levels, offering real-time spatial context to help visitors locate galleries and amenities effortlessly.
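As a rough illustration of the data behind such a plan, each level can be modelled as a list of typed points of interest. The names below (FloorLevel, PointOfInterest, nearest) are assumptions for this sketch, not the app's actual model.

```typescript
// Hypothetical categories of things a visitor looks for on a floor plan.
type PoiKind = "gallery" | "lift" | "toilet" | "cafe" | "step-free-route";

interface PointOfInterest {
  name: string;
  kind: PoiKind;
  position: { x: number; y: number }; // metres on the floor-plan grid
}

interface FloorLevel {
  level: number;
  pois: PointOfInterest[];
}

// Find the nearest amenity of a given kind on the visitor's current level,
// e.g. the closest lift or step-free route from where they stand.
function nearest(
  floor: FloorLevel,
  from: { x: number; y: number },
  kind: PoiKind
): PointOfInterest | undefined {
  return floor.pois
    .filter((p) => p.kind === kind)
    .sort(
      (a, b) =>
        Math.hypot(a.position.x - from.x, a.position.y - from.y) -
        Math.hypot(b.position.x - from.x, b.position.y - from.y)
    )[0];
}
```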
Concept Extension
Add optional spatial-audio wayfinding when navigation breaks down.
When tactile cues aren't consistent across floors, the app offers an optional spatial-audio guidance mode that helps blind and low-vision visitors move between stops.
Here's how the spatial audio guide might sound
Click to listen (earphones recommended):
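Under the hood, a directional cue like this could be produced by panning the guidance audio toward the next stop. The following is a minimal sketch assuming simple x/y indoor coordinates and a compass heading; a real build would rely on an indoor-positioning SDK and head tracking, and every name here is hypothetical.

```typescript
interface Point { x: number; y: number; }

// Angle from the visitor to the next stop, relative to where they face.
// Assumes y points "north" on the floor-plan grid and heading is in radians.
function relativeBearing(user: Point, heading: number, stop: Point): number {
  const absolute = Math.atan2(stop.x - user.x, stop.y - user.y);
  // Normalise to (-π, π] so left/right mapping is symmetric.
  let rel = absolute - heading;
  while (rel > Math.PI) rel -= 2 * Math.PI;
  while (rel <= -Math.PI) rel += 2 * Math.PI;
  return rel;
}

// Map the bearing onto a stereo pan (-1 = hard left, 1 = hard right),
// so the cue appears to come from the direction of the stop. Stops behind
// the visitor pan toward centre; a real system would add a rear cue.
function panForBearing(rel: number): number {
  return Math.max(-1, Math.min(1, Math.sin(rel)));
}

// Browser Web Audio usage (assumed runtime environment):
// const ctx = new AudioContext();
// const panner = new StereoPannerNode(ctx);
// panner.pan.value = panForBearing(relativeBearing(user, heading, stop));
```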
Even as a concept, the work validated the experience structure and reduced friction at key moments of movement.
Defined an end-to-end journey
Pre → during → post visit flow that keeps content, guidance, and "revisit later" in one system.
Reduced "search + borrow" friction
Audio guide and exhibition content move from web/device dependency into a single mobile flow.
Made wayfinding support self-directed
Visitors can start/stop guidance instantly without staff mediation or extra hardware.
Created a scalable foundation
A component-driven UI system supports future exhibitions, tours, and accessibility modes, as sketched below.
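As one possible illustration of that foundation, a typed component contract lets the same card render any exhibition or access mode. The names below (ExhibitCardProps, AccessibilityMode) are hypothetical, not the delivered design system.

```typescript
// Hypothetical access modes a piece of content can support.
type AccessibilityMode = "standard" | "audio-description" | "spatial-guidance";

// One card contract serves every exhibition; only the data changes.
interface ExhibitCardProps {
  title: string;
  summary: string;
  location: { floor: number; gallery: string };
  audioStopNumber?: number;   // links the card to the audio guide
  modes: AccessibilityMode[]; // which access modes this content supports
}

// Illustrative data for a single card.
const example: ExhibitCardProps = {
  title: "Being Human",
  summary: "A gallery about what it means to be human.",
  location: { floor: 1, gallery: "Being Human" },
  audioStopNumber: 101,
  modes: ["standard", "audio-description"],
};
```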
Supporting Materials
Process Document Route ↗
Next step for Wellcome Collection App
My Takeaways
This project reshaped how I approach accessibility across information hierarchy, user flow logic, and interaction patterns. When accessibility leads the structure, the experience becomes clearer for everyone.
Designing for "in-the-moment" use forced me to prioritise clarity, timing, and cognitive load over feature completeness.
Even though the spatial-audio mode was a future-facing concept, designing it reframed how I think about interactions beyond the screen. Prototyping became a tool for thinking, alignment, and decision-making, not just validation.