Wearable Sensors Could Translate Sign Language Into English

These wearable sensors can sense movement and muscle activity. The information is then sent to a program that can translate sign language gestures into English. (Image credit: Texas A&M)

Wearable sensors could one day interpret the gestures in sign language and translate them into English, providing a high-tech solution to communication problems between deaf people and those who don’t understand sign language.

Engineers at Texas A&M University are developing a wearable device that can sense movement and muscle activity in a person's arms.

The device identifies the gestures a person is making using two distinct sensors: one that responds to the motion of the wrist and another that responds to the muscular movements in the arm. A program then wirelessly receives this information and converts the data into an English translation.
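The article doesn't spell out the processing pipeline, but the basic idea — summarizing a window of motion (accelerometer/gyroscope) and muscle-activity (EMG) readings into a feature vector and matching it against stored word templates — can be sketched in a few lines of Python. Everything here (function names, feature choices, the nearest-template classifier) is a hypothetical illustration, not the team's published method:

```python
import numpy as np

def extract_features(imu_window: np.ndarray, emg_window: np.ndarray) -> np.ndarray:
    """Summarize one gesture window from both sensors into a feature vector.

    imu_window: (n_samples, 6) accelerometer + gyroscope readings from the wrist
    emg_window: (n_samples, n_channels) muscle-activity readings from the forearm
    """
    # Simple time-domain statistics per channel; a real system would use richer features.
    imu_feats = np.concatenate([imu_window.mean(axis=0), imu_window.std(axis=0)])
    emg_feats = np.concatenate([np.abs(emg_window).mean(axis=0),  # mean absolute value
                                emg_window.std(axis=0)])
    return np.concatenate([imu_feats, emg_feats])

def classify(features: np.ndarray, templates: dict[str, np.ndarray]) -> str:
    """Return the vocabulary word whose stored template is nearest."""
    return min(templates, key=lambda word: np.linalg.norm(features - templates[word]))
```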

After some initial research, the engineers found existing devices that attempted to translate sign language into text, but their designs were not as intricate.

"Most of the technology ... was based on vision- or camera-based solutions," said study lead researcher Roozbeh Jafari, an associate professor of biomedical engineering at Texas A&M.

These existing designs are not sufficient, Jafari said, because people communicating in sign language often combine hand gestures with specific finger movements.

"I thought maybe we should look into combining motion sensors and muscle activation," Jafari told Live Science. "And the idea here was to build a wearable device."

The researchers built a prototype system that can recognize the words people use most commonly in daily conversations. Once the team starts expanding the program, Jafari said, the engineers will add less frequently used words to build a more substantial vocabulary.

One drawback of the prototype is that the system has to be "trained" to respond to each individual who wears the device, Jafari said. This training process involves asking the user to repeat each hand gesture a couple of times, which can take up to 30 minutes to complete.

"If I'm wearing it and you're wearing it — our bodies are different … our muscle structures are different," Jafari said.

But Jafari thinks the issue is largely the result of the time constraints the team faced in building the prototype. It took two graduate students just two weeks to build the device, so Jafari said he is confident that the device will become more advanced during the next steps of development.

The researchers plan to reduce the training time of the device, or even eliminate it altogether, so that the wearable device responds automatically to the user. Jafari also wants to improve the effectiveness of the system's sensors so that the device will be more useful in real-life conversations. Currently, when a person gestures in sign language, the device can only read words one at a time.

This, however, is not how people speak. "When we're speaking, we put all the words in one sentence," Jafari said. "The transition from one word to another word is seamless and it's actually immediate."

"We need to build signal-processing techniques that would help us to identify and understand a complete sentence," he added.

Jafari's ultimate vision is to use new technology, such as the wearable sensor, to develop innovative user interfaces between humans and computers.

For instance, people are already comfortable using keyboards to issue commands to electronic devices, but Jafari thinks typing on devices like smartwatches is not practical because they tend to have small screens.

"We need to have a new user interface (UI) and a UI modality that helps us to communicate with these devices," he said. "Devices like [the wearable sensor] might help us to get there. It might essentially be the right step in the right direction."

Jafari presented this research at the Institute of Electrical and Electronics Engineers (IEEE) 12th Annual Body Sensor Networks Conference in June.
