Cell phone technology has been growing by leaps and bounds, and smartphones and their apps have made a significant impact on consumers. Smartphones have revolutionized the transfer of data over the phone, be it voice, photo or video. Engineers at the University of Washington (UW) have now taken this data transfer capability further, developing a system that will enable deaf people to converse in sign language over the phone.
UW engineers are working on a device to transmit American Sign Language over U.S. cellular networks. The tool has just completed its initial field test with participants in a UW summer program for deaf and hard-of-hearing students.
What major limitations have the UW engineers overcome?
iPhones, BlackBerrys, Android phones and the like offer video-conferencing services, but these require high bandwidth. The MobileASL team, as these UW researchers are called, is working on:
- optimizing compressed video signals for sign language
- increasing image quality around the face and hands, which has brought the data rate down to 30 kilobits per second
- developing algorithms to identify and transmit hand signals
- using motion detection to determine whether a person is signing, in order to extend the phone's battery life during video use.
The basic goal is to transmit sign language as efficiently as possible, thereby increasing affordability, improving reliability on slower networks and extending battery life, even on devices that could deliver higher-quality video.
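The signing-detection idea above can be illustrated with a minimal sketch. MobileASL's actual detector is not described in detail here; this hypothetical version uses simple frame differencing on grayscale frames, with the threshold and frame rates chosen arbitrarily for illustration:

```python
import numpy as np

# Assumed threshold on mean absolute pixel difference (tunable)
MOTION_THRESHOLD = 8.0

def is_signing(prev_frame: np.ndarray, curr_frame: np.ndarray) -> bool:
    """Crude signing detector: large inter-frame change suggests active signing."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return float(diff.mean()) > MOTION_THRESHOLD

def choose_frame_rate(signing: bool) -> int:
    """Drop to a low frame rate when idle to save battery (rates are illustrative)."""
    return 10 if signing else 1

# Example with synthetic grayscale frames
still = np.zeros((240, 320), dtype=np.uint8)
moving = still.copy()
moving[60:180, 80:240] = 200  # simulate hands moving into the frame

print(choose_frame_rate(is_signing(still, still)))   # prints 1 (no motion)
print(choose_frame_rate(is_signing(still, moving)))  # prints 10 (motion)
```

When little motion is detected, the encoder can lower the frame rate or skip frames entirely, which is where the battery savings during video use would come from.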
How many trials have been conducted on MobileASL?
The team tested the product with 11 participants for three weeks this summer. The team interviewed the participants, who also filled out occasional surveys about call quality and their experience with the product.
How does it compare with smart phones?
It is a valid question, as the iPhone 4, HTC EVO and other smartphones offer video conferencing. But there are two problems:
- broadband companies have blocked bandwidth-hogging video conferencing from their networks
- heavy data users face high charges.
According to the UW team, MobileASL uses about one-tenth the bandwidth of the iPhone's FaceTime video-conferencing service. An iPhone app to transmit sign language is also planned, but its limitation is that it works only on the iPhone 4 and requires very fast network speeds. The MobileASL system, by contrast, could be integrated with the iPhone 4, the HTC EVO, or any device that has a video camera on the same side as the screen. Read more at UW news.
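To put the bandwidth comparison in perspective, here is a back-of-the-envelope calculation. It assumes MobileASL's reported 30 kilobits per second and, per the team's ten-fold claim, roughly ten times that rate for FaceTime (an approximation, not a measured figure):

```python
MOBILEASL_KBPS = 30                   # data rate reported by the UW team
FACETIME_KBPS = 10 * MOBILEASL_KBPS   # assumed from the "10-fold" comparison

def data_per_minute_kb(kbps: float) -> float:
    """Kilobytes transferred in one minute at a given kilobit/s rate."""
    return kbps * 60 / 8

print(data_per_minute_kb(MOBILEASL_KBPS))  # prints 225.0 (KB per minute)
print(data_per_minute_kb(FACETIME_KBPS))   # prints 2250.0 (KB per minute)
```

At these rates, an hour of MobileASL conversation would move on the order of 13 MB of data rather than 130 MB, which is what makes it viable on capped or slower cellular plans.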
MobileASL is an innovative application for deaf people that delivers these advantages without consuming too much bandwidth, making the experience faster, less costly and more enjoyable. This bandwidth-saving approach could likely be applied to many other video-conferencing applications as well.