How do you communicate
when you can’t rely on sound?

Play Beyond Sound

Play Beyond Sound is an accessible communication system for Deaf and hard-of-hearing players in multiplayer games. It introduces a new communication layer that translates urgency, tone, and intent into multi-sensory signals, making real-time interaction more perceivable, expressive, and inclusive.

services

Product Design
Interaction Design
System Design

timeline

Nov 2025 – Apr 2026

team

Individual Project

tools

Figma
Cursor
Claude Code

The Problem

This project started from a question I came across on Reddit:

This revealed a broader issue:
Multiplayer games rely heavily on voice communication,
limiting participation for Deaf and hard-of-hearing players.

Interview

To understand this gap, I spoke with Deaf and hard-of-hearing players.

If a game relies too much on voice chat, I usually just avoid it.

Sometimes I can’t tell if they’re joking… or warning me.

Captions aren’t where I need them. I can’t keep up during gameplay.

Voice-first communication
leads to exclusion.

Tone is lost in text-based communication.

Existing caption solutions aren’t integrated into gameplay.

Insights

Communication isn’t just what is said—it’s how it’s said.

Urgency

Timing

Intent

Feature 1

I started with captions.

An Emotion-Augmented Caption System that makes tone, urgency, and nuance visible at a glance.

Loudness

Intensity

Pacing

Emotion

Delivery

size

weight

spacing

color tone

motion
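The attribute-to-style mapping above can be sketched in code. This is an illustrative sketch only, not the project's implementation; the attribute names, value ranges, and scaling factors are all assumptions.

```python
# Hypothetical sketch of the caption system's mapping layer: normalized
# speech attributes (0.0-1.0) drive visual caption properties.
# All ranges and coefficients here are illustrative assumptions.

def style_caption(loudness: float, intensity: float, pacing: float) -> dict:
    """Map speech attributes to caption styling values."""
    return {
        "size_px": round(14 + loudness * 10),        # louder speech -> larger text
        "weight": 400 + round(intensity * 3) * 100,  # more intense -> bolder
        "letter_spacing_em": round(pacing * 0.2, 2), # slower pacing -> wider spacing
    }
```

Under these assumed coefficients, a shout at full loudness and intensity renders at 24 px in bold (700) weight, while a calm aside stays at the 14 px regular-weight baseline.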

user testing

One of the biggest design shifts emerged from user testing with Deaf players.

1

AI emotion labels don’t hold up in gameplay.

From

Emotion labels added noise
without improving understanding

Removed emotion layer
for clarity and simplicity

to

2

Speaker identification should be instant.

That was smart, nice job

Speaker 1:

from

Requires reading to identify speaker

Instant visual identification

That was smart, nice job

color-coding

avatar/icon

to

3

One system doesn’t fit all players.

Different play styles, different needs

So I introduced a customizable system.

Players can adjust caption intensity, speaker indicators, and placement based on their own needs and play style.

Overlay interface that can be adjusted in real time without leaving gameplay.
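A minimal sketch of what such a per-player settings model might look like. The field names and option values are assumptions for illustration, not the project's actual data model.

```python
from dataclasses import dataclass

@dataclass
class CaptionSettings:
    """Per-player caption preferences, adjustable from an in-game overlay.

    Field names and option values are illustrative assumptions.
    """
    intensity: float = 0.5            # 0.0 = plain captions, 1.0 = full emphasis styling
    speaker_indicator: str = "color"  # "color", "avatar", or "none"
    placement: str = "bottom"         # "bottom", "top", or "near_speaker"

    def cycle_indicator(self) -> str:
        # Step through indicator modes, e.g. bound to one overlay button
        modes = ["color", "avatar", "none"]
        self.speaker_indicator = modes[(modes.index(self.speaker_indicator) + 1) % len(modes)]
        return self.speaker_indicator
```

Keeping the model this small is what makes real-time adjustment plausible: each field maps to one overlay control, so players can retune captions without leaving gameplay.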

Feature 2

Even with a flexible visual system, communication still relied heavily on sight.

In fast-paced gameplay,
this quickly becomes overwhelming…

So I introduced haptic feedback as a complementary communication channel.

This translates critical signals into physical cues, allowing players to notice alerts without looking away and reducing overload by distributing attention across senses.
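As a rough sketch, the signal-to-haptics translation might look like the table below. The signal names and the (duration, amplitude) pattern encoding are assumptions for illustration.

```python
# Illustrative signal -> vibration-pattern table. Each pattern is a list of
# (duration_ms, amplitude) pairs; amplitude 0.0 encodes a pause between pulses.
# Signal names and encodings are assumptions, not the project's actual spec.

HAPTIC_PATTERNS = {
    "danger":  [(200, 1.0), (100, 0.0), (200, 1.0)],  # strong double pulse
    "request": [(120, 0.5)],                          # single medium pulse
    "praise":  [(80, 0.3), (40, 0.0), (80, 0.3)],     # light double tap
}

def to_haptics(signal: str) -> list:
    """Return the vibration pattern for a signal, falling back to a soft tick."""
    return HAPTIC_PATTERNS.get(signal, [(60, 0.2)])
```

The point of the encoding is that urgency survives the channel change: a danger alert is physically stronger and longer than a social signal, so priority is felt before anything is read.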

Feature 3

While communication became perceivable, responding remained limited.

In fast-paced gameplay, players need to react just as quickly as they receive information.
But existing systems rely on simple pings, which are limited in what they can express.

What is the fastest way to express intent without voice?

What if emotion could be combined with simple inputs?

What if players could define those expressions themselves?

I designed the system along a spectrum of speed and clarity.

Speed ←→ Clarity

[Drag UI]
Instant selection

[Distance UI]
Precise intensity

[Chaining UI]
Rapid combinations

[Highlight UI]
Clear visibility

The interface is intentionally minimal to avoid distracting from gameplay,
prioritizing speed and intuitive expression.
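The drag interaction could be prototyped along these lines: the drag angle selects a signal from a radial menu and the drag distance sets its intensity. The signal names, menu layout, and normalization radius are all hypothetical.

```python
import math

SIGNALS = ["help", "go", "wait", "danger"]  # hypothetical radial-menu order

def select_signal(dx: float, dy: float, max_radius: float = 100.0):
    """Resolve a drag vector into (signal, intensity).

    The angle picks a menu sector; the clamped distance sets intensity 0.0-1.0.
    """
    angle = math.atan2(dy, dx) % (2 * math.pi)
    sector = 2 * math.pi / len(SIGNALS)
    index = min(int(angle // sector), len(SIGNALS) - 1)
    intensity = min(math.hypot(dx, dy), max_radius) / max_radius
    return SIGNALS[index], round(intensity, 2)
```

A short drag to the right would select "help" at low intensity; dragging further in the same direction raises the intensity without changing the signal, which is how one gesture can carry both intent and urgency.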

iteration

Improving Interaction Visibility

From

Blends into gameplay

To

Separated from gameplay

Clarifying Active Signals

From

No clear state indication

To

Clear active state feedback

Clarifying Intensity Feedback

From

Hard to perceive intensity

To

Intensity becomes visible and controllable

Final Prototype

Clearer communication in real time.

Powered by gesture-based selection and combined signals in one motion.

Less noise

Better visual tracking

Stronger contrast

Customize signals

See them in action

Fast, expressive communication in gameplay

System Summary

Together, these three features form the Social Communication Accessibility Framework.

A multi-modal communication system that translates voice into signals that can be seen, felt, and expressed.

impact

User testing showed clear improvements in how players perceived and responded to communication.

The one with bigger font felt like a clear command,

while the smaller one felt more like a suggestion.

It would make communication much more accessible and help me stay in sync with teammates without relying on voice.

It helps me react faster and understand what’s happening without relying on voice chat.

And importantly, all six participants said they would use this in a real game.

Reflection

Accessibility is not a layer on top of a system; it is the system.

This project started as a caption problem, but evolved into a communication system.
I learned that accessibility is not about adding features, but about rethinking how information flows.

Designing for real-time interaction required not just adding more signals,
but deciding what actually matters in the moment.

Design Decision

Through testing, I learned that clarity is more important than complexity,

and that users need control over how they receive and express information.

User Centered

With Thanks ❤︎

This project would not have been possible without the people who shaped it along the way.
