Wellcome Collection

— Mobile Exhibition Companion App

Designing an end-to-end exhibition experience that integrates spatial audio guidance into a unified mobile platform.

Wellcome Collection app screens

Role

Product Designer
UI/UX Designer
Researcher

Timeline

April - May 2025

Team

Solo work

Tools

Figma
Illustrator
Miro

My Contribution

Defined the end-to-end app experience (information architecture → key flows).

Designed core UI + component system for scalable screens and handoff-ready specs.

Prototyped and validated key tasks (discover → navigate → save/revisit).

What is Wellcome Collection?

Wellcome Collection is a museum in central London built around the belief that everyone's experience of health matters. Its exhibitions aim to be inclusive across different visitor needs.

Wellcome Collection photos and logo

What did the client want?

01 Make the visit more accessible and inclusive

Specifically, reduce/remove barriers across the building, services, and programme.

02 Support diverse access needs with clear, on-demand help

Keep improving through visitor feedback.

03 Offer accessible ways to experience exhibitions

Including options like audio descriptions (and, for some content, optional directions between stops).

The Research

To ground the project, I audited the Wellcome Collection's physical and digital touchpoints through field observation and competitive scanning.

The Friction

I discovered a consistent pain point: information fragmented across print, web, and borrowed devices, which creates significant navigational barriers, particularly for blind and low-vision visitors.

The Solution

These insights led to a single mobile companion that integrates exhibition content and audio guides into one flow. By featuring spatial-audio wayfinding, the design ensures a seamless and confident visitor journey.

What does "Accessible" mean for the user?

From the research, "accessible" isn't only about compliant content; it's about:

01 Independent exploration

Let visitors preview, navigate, and revisit exhibition content in one place, so the museum experience doesn't break when the visit starts or ends.

02 Low-friction museum navigation

Support effortless movement through the museum by minimising the need to stop, ask for help, or switch tools.

03 Understanding in real time

Give visitors the right information in context (where they are, what's nearby, what to do next), without searching, borrowing devices, or switching channels.

For blind and low-vision visitors, accessibility means the experience works while moving, supports orientation when the environment changes, and lets visitors re-engage with what they explored afterward.

How did I design the solution?

Embedding spatial audio inside a mobile app

Guidance breaks down when it lives outside the main experience. At Wellcome, exhibition info is mostly web-based, and the audio guide can mean borrowing a device or hunting for the right page, right when visitors are moving and need clarity.

So I designed a mobile companion app that brings exhibition content and the audio guide into one flow, then layers optional spatial-audio wayfinding inside that same journey. This keeps accessibility lightweight (no new infrastructure), makes support instant and self-directed, and helps visitors navigate confidently while staying connected to what they're encountering.

How did I validate and refine the solution?

01 Wireframes + Sketches

Tested guided discovery on mobile (browse exhibitions + optional spatial-audio tour).

02 Audit + Observation + Feedback

Web-first content + separate audio guide created friction during movement → pivot to one app.

03 v1 → v2 prototypes + Feedback

Figma prototypes with usability testing. I then refined navigation, content access, and the guidance flow, locking in the core features for the app.

Key Features

Integrated Gallery Discovery

Converts broad web content into a native, scannable interface for instant access to "What's On" lists, exhibition summaries, and current locations.

Exhibition list browsing demo
Exhibition page and audio guide demo

Integrated Audio Guide

Visitors can start the audio guide directly inside the app. A simplified numeric keypad lets them perform instant object lookups by entering room or item numbers, bypassing deep menu navigation.

Centralized Visitor Utility

Consolidates essential logistics (live opening times, step-free access routes, and direct contact links) into a single, accessible hub.

Visitor utility hub
Multi-level floor plan wayfinding

Multi-Level Wayfinding

Provides an interactive floor plan across all levels, offering real-time spatial context to help visitors locate galleries and amenities effortlessly.

Concept Extension

Add optional spatial-audio wayfinding when navigation breaks down.

When tactile cues aren't consistent across floors, the app offers an optional spatial-audio guidance mode that helps blind and low-vision visitors move between stops.
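One way such a guidance mode could derive its directional cue is by mapping the angle between the visitor's heading and the next stop to a stereo pan. This is a conceptual sketch only; the function name and mapping are my own illustration, not part of the prototype:

```python
import math

def pan_for_target(user_heading_deg: float, bearing_to_target_deg: float) -> float:
    """Map the angle between the visitor's heading and the next stop
    to a stereo pan value in [-1.0, 1.0] (-1 = full left, +1 = full right)."""
    # Relative bearing, normalised to (-180, 180]
    rel = (bearing_to_target_deg - user_heading_deg + 180) % 360 - 180
    # Sine mapping: target dead ahead -> centre; 90 degrees right -> full right
    return math.sin(math.radians(rel))
```

A cue dead ahead stays centred, while a stop off to one side pulls the audio toward that ear, so the visitor can steer by sound alone without reading the screen.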

Spatial audio wayfinding concept

Here's how the spatial audio guide might sound

Click to listen (earphones recommended):

Spatial Audio Guide — Being Human exhibition
Spatial Audio Guide — 1880 exhibition

What was the outcome/impact?

What was validated / what changed

Even as a concept, the work validated the experience structure and reduced friction at key moments of movement.

Defined an end-to-end journey

Pre → during → post visit flow that keeps content, guidance, and "revisit later" in one system.

Reduced "search + borrow" friction

Audio guide and exhibition content move from web/device dependency into a single mobile flow.

Made wayfinding support self-directed

Visitors can start/stop guidance instantly without staff mediation or extra hardware.

Created a scalable foundation

A component-driven UI system supports future exhibitions, tours, and accessibility modes.

Where do we go from here?

Next step for Wellcome Collection App

Pilot in-gallery usability tests

Test wayfinding + exhibit comprehension across key transitions (entrance → lifts → galleries → exits), and capture where confidence drops.

Validate accessibility modes end-to-end

Screen reader flow, text scaling, reduced motion, high contrast, haptics, and "no-audio" fallback so the journey still works.

Prototype the "spatial layer" with lightweight triggers

NFC/QR start points + directional audio cues; measure time-to-arrive, wrong turns, and how often users need help.

Connect content to the visit lifecycle

Pre-plan a route, in-gallery "now playing" content, and post-visit saved moments, so the app supports before, during, and after the visit, not just navigation.

My Takeaways

This project reshaped how I approach accessibility: information hierarchy, user flow logic, and interaction patterns. When accessibility leads the structure, the experience becomes clearer for everyone.

Designing for "in-the-moment" use forced me to prioritise clarity, timing, and cognitive load over feature completeness.

Even though the spatial-audio layer is a future concept, designing it reframed how I think about interactions beyond the screen. Prototyping became a tool for thinking, alignment, and decision-making, not just validation.