27 Nov 2013

Microsoft Kinect Technology Used to Interpret Sign Language

Give a hand to Microsoft as tool translates sign language

Is there anything those brains at Microsoft can’t do? Not content with revolutionising computer technology and ushering in the new information age, they are now having a stab at bridging the gap between the deaf and those who can hear.

Having developed the hugely popular and ground-breaking Kinect motion sensor for its Xbox console, Microsoft eggheads have been looking around for ways to use its motion-sensing technology outside of gaming. That is when they came up with the idea of developing a sign language tool that can translate sign into spoken language and back.

Guobin Wu, research programme manager for Microsoft Research Asia, said that preliminary tests show positive results when in translator mode, but the communication mode option is harder to crack.

“Our system is still a research prototype. It is progressing from recognising isolated words signed by a specific person (translator mode) to understanding continuous communication from any competent signer (communication mode),” he explained.
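To give a rough sense of what “translator mode” involves, the sketch below shows one way isolated-word recognition from a depth sensor’s skeleton stream could work: a captured 3D hand trajectory is compared against stored templates for each vocabulary word using dynamic time warping, and the closest match wins. The matcher, the toy vocabulary and the data here are illustrative assumptions, not Microsoft’s published method.

```python
# Illustrative sketch only: isolated-word sign recognition ("translator mode")
# via nearest-neighbour matching of 3D hand trajectories. The matcher, the
# vocabulary and the toy data are assumptions for illustration, not the
# research team's actual implementation.
from __future__ import annotations

import numpy as np


def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Dynamic time warping distance between two trajectories of 3D points."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return float(cost[n, m])


def recognise_sign(trajectory: np.ndarray, templates: dict[str, np.ndarray]) -> str:
    """Return the vocabulary word whose stored template is closest to the input.

    `trajectory` is an (N, 3) array of hand positions captured from a depth
    sensor's skeleton stream; `templates` maps each known word to one such array.
    """
    return min(templates, key=lambda word: dtw_distance(trajectory, templates[word]))


if __name__ == "__main__":
    # Toy example: two hypothetical one-word templates and a noisy query.
    rng = np.random.default_rng(0)
    templates = {
        "hello": np.cumsum(rng.normal(size=(30, 3)), axis=0),
        "thanks": np.cumsum(rng.normal(size=(30, 3)), axis=0),
    }
    query = templates["hello"] + rng.normal(scale=0.05, size=(30, 3))
    print(recognise_sign(query, templates))  # expected: "hello"
```

Moving from this kind of per-word matching to “communication mode” is much harder, since continuous signing has no clean boundaries between words and varies from signer to signer.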

Microsoft Research demonstrated the technology as it celebrated the 15th year of its Asia division. It began collaborating on the project with the Chinese Academy of Sciences and Beijing Union University in February 2012. Eighteen months down the line, the Kinect translator can now recognise 370 of the most common words in Chinese Sign Language. The team is also building up the system’s vocabulary of American Sign Language gestures, which are different from those of Chinese Sign Language.

Wu said: “Every month, we had a team meeting to review the progress and plan our next steps. Experts from a host of disciplines – language modelling, translation, computer vision, speech recognition, 3D modelling, and special education – contributed to the system design.”

Students from Beijing Union University helped the team collect and label Chinese Sign Language data during the first six months of the project. Researchers at Microsoft say they hope to collaborate with more language experts and to survey deaf people on how best to use the Kinect translator. Initial thoughts are that the device could help deaf users give presentations to audiences who do not know sign language, while deaf users in customer-facing roles could communicate more easily with the public.

Wu said there are around 360 million people across the globe who have difficulty hearing. Because of this, the project has attracted a great deal of attention from other researchers and from the deaf community, especially in the United States.

“We expect that more and more researchers from different disciplines and different countries will collaboratively build on the prototype, so that the Kinect Sign Language Translator system will ultimately benefit the global community of those who are deaf or hard of hearing,” Wu said.

“I think it’s been great. In a year-and-a-half, we have already developed the system prototype,” Wu said. “The results have been published in key conferences, and other researchers have said the results are very good.”

It is not surprising that translation is one of the first uses to which this new technology has been put. In the increasingly fast-paced age of mass communication, those who cannot bridge the language barrier are sadly left at a disadvantage. But technology and communication have always developed hand in hand, and only time will tell what advances will be made. Let us hope that with more projects like this, no-one will be left behind.



