directional siren indicator for DHH drivers
INDUSTRY:
PRODUCT DESIGN; HUMAN-CENTERED DESIGN
TOOLS:
SOLIDWORKS, KEYSHOT
YEAR:
2024
EXPERIENCE:
HCD, RAPID PROTOTYPING, UR, PHYSICAL COMPUTING

about.
For a week-long group project, we designed a proof-of-concept assistive device to help deaf and hard-of-hearing (DHH) drivers stay safe on the road. It detects emergency sirens and translates them into visual cues that show the direction the sound is coming from. Through user research and iterative testing, we shaped it into an intuitive, inclusive concept that reduces stress and improves driver confidence.
*To ensure high-fidelity presentation, some project images have been refined using AI for resolution and clarity. Please note that no project concepts, CAD data, or design solutions were AI-generated*

challenge.
For drivers with hearing disabilities, detecting the direction of an approaching emergency vehicle can be challenging. A siren provides key context about where an emergency vehicle is coming from, so missing this audible warning can delay emergency response times and put hard-of-hearing drivers at risk.
Four parties are affected: the DHH driver, the emergency vehicle driver, the person awaiting assistance, and other drivers on the road. With one product for one person, could we reduce the risk to the other three?
This problem calls attention to creating equitable situational awareness while driving, specifically around emergency vehicles.
How might we design an assistive device that helps deaf drivers not only detect emergency vehicles before seeing them but also understand their direction quickly and intuitively?
| User Research | Key Findings |
|---|---|
| We interviewed 4 drivers (3 hard-of-hearing, 1 hearing control) across age groups to understand their experiences on the road. | Hearing aids alone are insufficient for spatial awareness. |





We went through several ideations on how we should mount this device on a vehicle. Some included a steering wheel mount, a light indicator as a part of the steering wheel, a dashboard mount, and a stiff mount behind the rearview mirror.
We needed to think about where this device could sit that would not obstruct the vision of the driver, would be a universally accessible point, and would be most straightforward to understand.
A steering wheel mount is not the most viable option, since the wheel is constantly being rotated and the directional element would be unreliable. A dash mount could work, but dashboards vary widely between cars.
One thing that is consistent across all cars is the rearview mirror, so we went with that mounting option. The mount would be stiff to prevent any swinging from motion.




design.
While sound-recognition technology is a solved problem in the tech industry, the delivery of that information in a high-stress driving environment is not. Our study focused on the ergonomics of the rearview mirror placement and the cognitive load of the directional LED indicators.
This project successfully validated the visual interface and its impact on driver reaction times, albeit within a controlled simulation. The main limitation was the lack of an integrated acoustic sensor array to automate sound detection in real-world traffic. Future iterations would move beyond the "Wizard of Oz" testing phase by implementing machine learning to isolate and classify emergency sirens amid ambient road noise.
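To illustrate the kind of acoustic sensing a future iteration could use, the sketch below shows one conventional approach: estimating a siren's bearing from a pair of microphones via time-difference-of-arrival (TDOA) cross-correlation, then mapping that bearing to a discrete LED segment. This is a minimal illustration, not the project's actual implementation; the 0.2 m microphone spacing, the 8-LED layout, and all function names are assumptions made for the example.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s at roughly 20 °C
MIC_SPACING = 0.2       # assumed spacing between the two mics, in metres

def estimate_bearing(left, right, sample_rate):
    """Estimate a sound source's bearing from two microphone signals.

    Finds the time difference of arrival via cross-correlation, then
    converts the delay into an angle (negative = left, positive = right).
    """
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)   # delay in samples
    tdoa = lag / sample_rate                   # delay in seconds
    # Clamp to the physically possible range before taking arcsin.
    ratio = np.clip(tdoa * SPEED_OF_SOUND / MIC_SPACING, -1.0, 1.0)
    return float(np.degrees(np.arcsin(ratio)))

def bearing_to_led(angle_deg, num_leds=8):
    """Map a bearing in [-90, 90] degrees onto one of `num_leds` indicators."""
    idx = int(round((angle_deg + 90.0) / 180.0 * (num_leds - 1)))
    return max(0, min(num_leds - 1, idx))

# Simulated check: delay the right channel by 10 samples, so the sound
# reaches the left mic first and the source should appear to the left.
fs = 48_000
t = np.arange(0, 0.1, 1 / fs)
siren = np.sin(2 * np.pi * (500 * t + 1000 * t**2))  # simple 500-700 Hz sweep
delay = 10
left = siren
right = np.concatenate([np.zeros(delay), siren[:-delay]])
angle = estimate_bearing(left, right, fs)  # negative angle: source on the left
led = bearing_to_led(angle)                # LED index in the left half
```

In a real device the cross-correlation would run continuously on short windows of microphone input, and only windows that also pass a siren classifier would light the indicator.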





