Duration
Fall 2019 - Dec 2019
My Role
UX/UI Designer
Team
Cathy Xu
Initial Goal
How might we help people who are hard of hearing and people who are hearing have conversations more easily?
I wanted to explore how users living with hearing impairment use assistive technology, for two major reasons.
2.5 billion
people will have some degree of hearing loss by 2050
Many countries are facing issues associated with increased longevity and an aging population. As populations age, accessibility in the digital world will become increasingly important.
3 family members
are currently experiencing hearing loss
Personally, I wanted to learn more about the lived experience of hearing impairment because some of my family members are going through age-related hearing loss.
Research Methods
Problem Discovery
I used a combination of formal and informal research methods to gain insights into the deaf and hard of hearing community and existing communication tools other than sign language.
Online Research
I reached out to the Reddit communities r/deaf and r/HOH (Hard of Hearing) to understand how people with lived experience of hearing loss communicate with the hearing people in their lives.
User Interview
For this preliminary round of research, I conducted interviews with one research question in mind: "How do hearing people communicate with hard of hearing people?" My intention was to narrow down the problem scope.
Findings
For people who slowly lose hearing later in life, the transition into communicating with a majority-hearing social circle can be challenging.
"I am hearing, with a grandpa who is slowly losing his hearing and refuses to wear hearing aids! Whenever I talk to him I make sure I am looking directly at him and nothing covering my mouth."
For people who slowly lose hearing later in life, the transition period of learning sign language or adapting to hearing aids is a difficult process.
"I don’t want him [HoH] to feel excluded from our conversation. He’s always asking what are we talking about and I feel bad that he can’t participate like he used to."
Support from family and friends makes the transition easier. However, being unable to participate in conversations makes socializing harder.
“After her hearing loss become more significant, she becomes more reserved. We are learning sign language together but the transition is really tough on her."
Engaging in group conversations poses more challenges due to loud background noise and multiple people talking at the same time.
Persona
Contextualize using targeted-scope persona
A persona based on the previous user research captures the feelings of isolation and embarrassment experienced by the newly hard of hearing.
People who suffer from hearing loss later in life
Since I recruited interview participants from Reddit, the following insights are skewed toward Reddit users, who are mainly younger adults from the U.S. These participants shared their experiences with older adults in their family circles who suffer from age-related hearing loss.
The feeling of isolation during the transition
The hearing loss community is a large population with drastically different lived experiences. Users need a way to ease into the process of adapting to hearing aids or learning sign language, since this is a difficult transition.
Pivot
How might we help those who are losing their hearing later in their lives to communicate with family and friends who are hearing?
Before conducting user interviews, my assumption and hypothetical solution was a digital tool that translates American Sign Language (ASL) to speech. After talking to community members, I narrowed down the scope and target population: I wanted to focus on users who are newly hard of hearing rather than the general hearing impairment community.
Pain Points
1. Difficulty understanding what people are saying
2. Inability to tell who is speaking
User Goals
1. I can see the transcript of the conversation in real time
2. I can easily distinguish the speaker from the transcript
Wireframe & Flow Chart

A speech to text transcription app

With group chat functionalities that allow users to follow conversations with multiple speakers. Based on the persona, the app needs to be accessible to older users who might not be tech-savvy. That’s why the design of the app follows a minimum viable product (MVP) strategy to keep the functionality simple.

Iterations

Home page with clear CTA to start a transcription

1. Unclear group members
Users can’t tell exactly how big the group chat is by looking at the initials on the thumbnail picture.
2. Confusing CTA on bottom navigation bar
Users were confused about where the Start Chat button on the bottom navigation leads. Most users older than 50 thought the button would transform the current transcript page into a recording page.

Transcribe with speaker detection and large font text

1. Add contact from chat
One pain point from the usability testing was that older users struggled to add contacts due to limited hand dexterity. Thus, speakers from a chat are now automatically added to the contact list.
2. Automatic speaker detection
Another major struggle was for users to tell who is speaking when engaged in a multi-person conversation. The design added easily identifiable icons and colors to reduce cognitive load.
3. Enlarge text size for readability
Five out of the six participants reported that the small font size and faded gray color were hard to read when people talked faster than they could read. Thus, the font size was changed from 12 pt to 22 pt, with a darker black to increase contrast.
4. Clear status indicator
One user story: when transcribing a conversation in a noisy area, the user wants to stop transcribing once the conversation fizzles out. Thus, a clear status indicator with start and pause buttons was added to the thumb zone.
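To make the automatic speaker detection above concrete, here is a minimal sketch of one way speakers could be mapped to stable, distinguishable colors as they first appear in a transcript. The palette, speaker names, and function name are illustrative assumptions, not the app’s actual implementation.

```python
# Sketch: give each speaker in a transcript a stable, easily
# distinguishable color, cycling a small palette if there are more
# speakers than colors. Palette values are illustrative.
from itertools import cycle

PALETTE = ["#1565c0", "#2e7d32", "#c62828", "#6a1b9a", "#ef6c00"]

def assign_colors(speaker_turns):
    """Map each unique speaker (in order of first appearance) to a color."""
    colors = cycle(PALETTE)
    mapping = {}
    for speaker in speaker_turns:
        if speaker not in mapping:
            mapping[speaker] = next(colors)
    return mapping

# Each list item is one turn in the conversation.
turns = ["Grandpa", "Cathy", "Grandpa", "Mom"]
print(assign_colors(turns))
# {'Grandpa': '#1565c0', 'Cathy': '#2e7d32', 'Mom': '#c62828'}
```

Assigning colors by order of first appearance (rather than hashing names) keeps a speaker’s color consistent for the duration of one chat, which is what matters for following the conversation.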
Final Design

Transcribe with speaker detection and large font text

A mobile transcription app that provides a chat-like experience for the newly deaf or hard of hearing.
Reflection

What I learned

For the future, I would love to explore more advanced transcription features, since those weren’t a priority for the MVP. I think that adding some sort of text-to-speech feature, or playing a sound to draw people’s attention, would allow hard of hearing users to participate in the conversation more actively.

I would also like to conduct a comprehensive accessibility audit based on the WCAG guidelines to ensure that people with disabilities other than hearing impairment can use the app as well.
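One piece of such an audit can be automated: checking text contrast, the same issue the faded-gray transcript text ran into during testing. Below is a sketch of the WCAG contrast-ratio calculation; the hex colors are illustrative, not the app’s actual palette.

```python
# Sketch: WCAG 2.x contrast-ratio check. The channel linearization and
# luminance weights follow the WCAG definition of relative luminance;
# AA requires at least 4.5:1 for normal body text.

def _linearize(channel_8bit: int) -> float:
    """Linearize one sRGB channel (0-255) per the WCAG formula."""
    c = channel_8bit / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(hex_color: str) -> float:
    """Relative luminance of a #RRGGBB color."""
    h = hex_color.lstrip("#")
    r, g, b = (int(h[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _linearize(r) + 0.7152 * _linearize(g) + 0.0722 * _linearize(b)

def contrast_ratio(fg: str, bg: str) -> float:
    """WCAG contrast ratio, ranging from 1:1 up to 21:1."""
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# A faded gray on white fails the 4.5:1 AA threshold for body text,
# while a near-black passes comfortably.
print(round(contrast_ratio("#9e9e9e", "#ffffff"), 2))
print(round(contrast_ratio("#212121", "#ffffff"), 2))
```

A check like this could run over every text/background pair in the design system, flagging any combination below 4.5:1 before it reaches usability testing.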