Deaf-Mute Communication – Part II: Background Research

This part concludes the report for the project “Deaf-Mute Communication”. Previous posts: Part I, Part III.

Background Research

Sign Language

Just like spoken languages, sign languages are built upon rules of grammar and vary across regions and dialects. Unlike spoken languages, however, the mutual intelligibility of sign languages does not follow that of the corresponding spoken languages. For example, British sign language is almost unintelligible to users of American sign language, while American sign language shares roughly 60% of its signs with modern French sign language. This demonstrates that the relationships between sign languages differ from those between the equivalent spoken languages.

Moreover, it should be emphasized that many signs are very similar to each other, which can make them hard to tell apart. American sign language contains several such pairs of easily confused signs.

Technology

To bridge the communication gap between deaf-mute people and those around them, several methods are currently in use. Presented below are some of the technologies used to carry out these methods, together with technologies that have the potential to be used in future communication solutions.

Augmentative and Alternative Communication (AAC) is an umbrella term for all non-oral methods people use to express themselves to others, including body language and facial expressions. AAC is further categorized into unaided communication systems, where the user’s body alone is relied on for communication, and aided communication systems, where devices ranging from pencils to computers that produce voice output are used in addition to the user’s body.

Different kinds of relay services are used to help deaf-mute people communicate with hearing people. Video Remote Interpreting (VRI) and Video Relay Service (VRS) are two similar services: VRI is used for communication between people at the same location, while VRS interprets messages between people at different locations. Both services rely on Internet and video communications technology, since the interpreters are never located in the same place as the people using the service. Text Relay Service (TRS) is similar to VRS, with the difference that the output to the person with hearing impairments is text instead of video; keyboards or special assistive devices are used to send text messages to standard telephones via the telephone line. IP Relay Services are web-based, similar to chats, and do not rely on telephones at all. Callers therefore have to manually supply the operators with their location information in situations such as emergency calls.

The Leap Motion is a sensor device that tracks hand and finger motion and supplies it as input to a computer, e.g. to control different kinds of interfaces. The device was initially created to overcome the cumbersome process of 3D modelling with mouse and keyboard, but is currently used in a wide range of applications, such as controlling computer games, web browsers and virtual musical instruments. Recent attempts have also been made to use the Leap Motion as a gesture-based sign translator for online chat applications.
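To illustrate how hand-tracking data of this kind can be turned into sign input, the sketch below classifies a static hand pose from palm and fingertip positions. The coordinate format and the distance threshold are assumptions for illustration; this is not the actual Leap Motion SDK.

```python
import math

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def extended_fingers(palm, fingertips, threshold=60.0):
    """Count fingertips farther than `threshold` mm from the palm centre."""
    return sum(1 for tip in fingertips if distance(palm, tip) > threshold)

def classify(palm, fingertips):
    """Map the number of extended fingers to a crude static pose label."""
    count = extended_fingers(palm, fingertips)
    return {0: "fist", 5: "open hand"}.get(count, f"{count} fingers extended")

# Invented sample frames: all tips far from the palm vs. all tips close to it.
palm = (0.0, 0.0, 0.0)
open_hand = [(70.0, 10.0 * i, 0.0) for i in range(5)]
fist = [(30.0, 5.0 * i, 0.0) for i in range(5)]
print(classify(palm, open_hand))  # open hand
print(classify(palm, fist))       # fist
```

A real translator would of course need far richer features (finger joint angles, orientation, motion over time), but the principle of mapping geometric features to discrete sign labels is the same.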

The Kinect is also a motion-sensing device, but unlike the Leap Motion, it tracks full-body motion. The sensor is usually positioned on top of a monitor, which displays the interface of whatever is being controlled. Recent efforts have been made to build a sign-language-to-text and text-to-sign-language translator using the Kinect for sign language input.
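A common way to recognise dynamic gestures from skeleton data of the kind the Kinect produces is to compare an observed sequence of joint positions against recorded templates with dynamic time warping (DTW). The sketch below is a minimal, assumed example with invented 2D joint trajectories, not the Kinect SDK itself.

```python
def dtw(seq_a, seq_b):
    """Dynamic-time-warping distance between two (x, y) position sequences."""
    inf = float("inf")
    n, m = len(seq_a), len(seq_b)
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = ((seq_a[i - 1][0] - seq_b[j - 1][0]) ** 2 +
                 (seq_a[i - 1][1] - seq_b[j - 1][1]) ** 2) ** 0.5
            cost[i][j] = d + min(cost[i - 1][j],      # skip a frame in a
                                 cost[i][j - 1],      # skip a frame in b
                                 cost[i - 1][j - 1])  # match frames
    return cost[n][m]

def recognise(observed, templates):
    """Return the template label whose trajectory is closest under DTW."""
    return min(templates, key=lambda label: dtw(observed, templates[label]))

# Invented gesture templates: hand moving side to side vs. straight up.
templates = {
    "wave":  [(0, 0), (1, 1), (0, 2), (1, 3)],
    "raise": [(0, 0), (0, 1), (0, 2), (0, 3)],
}
observed = [(0, 0), (0.9, 1.1), (0.1, 2.0), (1.0, 2.9)]  # a noisy "wave"
print(recognise(observed, templates))  # wave
```

DTW tolerates differences in speed between the observed gesture and the template, which matters because no two signers (or repetitions) move at exactly the same pace.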

Machine translation is a subfield of computational linguistics that investigates how to translate text or speech from one natural language to another. Great efforts are currently being put into making machine translation results more accurate. Corpus linguistics and statistical techniques are used to recognize whole phrases of text or speech rather than isolated words.

Interview

We conducted an interview with a man who is hard of hearing and who uses American sign language. The questions focused on the means of communication deaf-mute people use when communicating with hearing people or with people who do not know sign language. We learned that assistive tools such as pen and paper, computers and text relay services were used to lower the communication threshold. The conclusion was that although these tools are slow to use, they are valuable and better than nothing.
