EARIE

#Product design

#User Experience design

Contrary to popular belief, a large proportion of hearing-impaired people are not completely deaf and can follow conversation by reading other people's lip movements. Many hearing-impaired people therefore rely on other senses, such as sight and touch, to compensate for their loss of hearing.

However, hearing-impaired people still face many inconveniences in scenarios where this compensation fails: when the speaker is outside their field of view, or when the sound source, such as a car horn or a loudspeaker, cannot be perceived by sight or touch.

Having lived in a silent environment for a long time, hearing-impaired people are accustomed to perceiving sound through vibration. In this project, I therefore design an interactive wearable device that uses the sense of touch to help them overcome the problems above.

Background Research

Since severe SHL has no definitive cure and its impact is ultimately profound, this project aims to help the people affected by it.

Case study

This study examines how inconvenient daily life is for people with severe hearing impairment.

User Journey Map

Persona

Brainstorming

Technology development

Part 1. Sound source detection

1.A Sensor spatial design

1.B Communication process design

1.C Vibration Interface design

Part 2. Hand gesture detection

2.A Camera & screen spatial design

2.B Communication process design

Iterations

Final Showcase

My app cannot yet receive the Bluetooth signal, so a third-party app is used here to receive the data.

Chinese character "你"

Chinese character "好"

My PyTorch model cannot yet be deployed on mobile, but it runs well on PC; I am still working on it.

01 Sound Gathering

The device gathers sound with four microphones, so the same sound wave reaches each microphone with a slight delay relative to the others. The sound wave is transmitted over Bluetooth, and the dev board calculates the delay times and transmits them as well.
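As a sketch of the delay-estimation step, the offset between two microphone signals can be found with a cross-correlation. This is a minimal numpy version for illustration; the actual computation runs on the dev board, and the function name and sample rate are assumptions.

```python
import numpy as np

def estimate_delay(sig_a, sig_b, sample_rate):
    """Estimate how long after sig_a the same sound arrives in sig_b.

    Returns the delay in seconds (positive if sig_b lags sig_a).
    Hypothetical helper, not the dev board's actual firmware.
    """
    # Cross-correlate the two signals; the peak marks the best alignment.
    corr = np.correlate(sig_b, sig_a, mode="full")
    # Convert the peak index to a lag in samples, then to seconds.
    lag_samples = int(np.argmax(corr)) - (len(sig_a) - 1)
    return lag_samples / sample_rate
```

With four microphones, running this over each pair gives the set of pairwise delays that the board transmits to the phone.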

02 Processing & Locating

On the phone, the sound is processed by Alibaba Cloud Service, which returns a signal whenever a keyword is mentioned or a very loud sound occurs.

The delay times from the dev board are processed by my TDOA app, and the iPhone then transmits the coordinates of the sound source back to the dev board.
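To illustrate the locating step, here is a far-field sketch of how pairwise delays could be turned into a direction. The 0.1 m microphone spacing and the perpendicular-pair layout are assumptions for the example, not the device's measured geometry (the real TDOA app runs in Swift on the iPhone).

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def azimuth_from_delays(tau_x, tau_y, mic_spacing=0.1):
    """Far-field azimuth (radians) of a sound source.

    tau_x / tau_y are the delays (seconds) measured along two
    perpendicular microphone pairs spaced `mic_spacing` metres apart.
    """
    # Each delay is proportional to the projection of the source
    # direction onto that microphone pair's axis.
    dx = SPEED_OF_SOUND * tau_x / mic_spacing
    dy = SPEED_OF_SOUND * tau_y / mic_spacing
    return math.atan2(dy, dx)
```

A source directly along the x-axis pair produces the maximum delay on that pair and none on the other, yielding an azimuth of 0.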

03 Vibration Responding

The dev board processes the coordinates and notifies the user of the direction of the sound source with vibrations.
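One simple way to map a direction onto discrete vibration feedback is to divide the circle into sectors, one per motor. The eight-motor ring layout below is an assumption for illustration, not the device's actual actuator arrangement.

```python
import math

def motor_for_azimuth(azimuth, n_motors=8):
    """Pick which of n_motors (evenly spaced in a ring) should vibrate
    for a sound arriving from `azimuth` radians. Hypothetical mapping."""
    sector = 2 * math.pi / n_motors
    # Shift by half a sector so each motor covers the arc centred on it,
    # and wrap negative angles into [0, 2*pi).
    return int(((azimuth + sector / 2) % (2 * math.pi)) // sector)
```

Motor 0 then covers sounds from straight ahead, with neighbouring motors covering 45-degree arcs to either side.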

04 Gesture Recognition

The device takes a picture and transmits it to the iPhone every 500 ms.

The PyTorch model deployed on the phone then recognizes the sign and transmits the resulting string back.
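The on-phone recognition step amounts to a single PyTorch inference pass per frame. The sketch below shows the shape of that step; the model and class list are placeholders, not my actual trained network.

```python
import torch

def recognize_sign(model, image_tensor, classes):
    """Run one inference pass and return the predicted character.

    `model` is any torch.nn.Module returning class logits;
    `classes` maps class index -> display string (e.g. a Chinese character).
    """
    model.eval()  # disable dropout/batch-norm updates for inference
    with torch.no_grad():
        logits = model(image_tensor.unsqueeze(0))  # add a batch dimension
        idx = int(logits.argmax(dim=1))
    return classes[idx]
```

Called once per 500 ms frame, the returned string is what gets sent back to the dev board for display.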

05 Letter Display

The dev board on the device receives the string and displays the letter on the screen.
