Dublin: 30.07.2014 06.12PM
Engineers at the University of Washington are testing a tool that uses motion detection to identify sign language.
The tool, called MobileASL, aims to transmit clear video of American Sign Language over US mobile networks while keeping battery and data usage low.
"We want to deliver affordable, reliable ASL (American Sign Language) on as many devices as possible," said Eve Riskin, a UW professor of electrical engineering.
"It's a question of equal access to mobile communication technology."
While phones like the iPhone 4 and the HTC Evo offer video calls, many broadband companies have blocked bandwidth-heavy video conferencing and introduced tiered pricing plans for heavy data use.
MobileASL aims to solve these issues by optimising compressed video for sign language: the encoder raises image quality around the face and hands and lowers it elsewhere, bringing the data rate down to 30KB per second.
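The idea of spending bits where they matter can be sketched roughly as region-of-interest (ROI) bit allocation. The sketch below is hypothetical, not the MobileASL codebase: the function names, QP values, and bitrate model are illustrative assumptions, using the common video-coding convention that a lower quantization parameter (QP) means higher quality and more bits.

```python
# Hypothetical sketch of ROI bit allocation for sign-language video.
# Macroblocks covering the face and hands get a lower QP (higher
# quality); the background gets a higher QP, trimming the data rate.

def allocate_qp(roi_mask, roi_qp=24, background_qp=38):
    """Return a per-macroblock QP grid from a boolean ROI mask."""
    return [[roi_qp if in_roi else background_qp for in_roi in row]
            for row in roi_mask]

def estimate_bytes_per_sec(qp_grid, bits_at_qp24=96, fps=10):
    """Very rough model: each QP step costs ~12% of the bits."""
    total_bits = 0
    for row in qp_grid:
        for qp in row:
            total_bits += bits_at_qp24 * (0.88 ** (qp - 24))
    return total_bits * fps / 8  # bytes per second

# Toy 4x4 macroblock frame: the centre blocks are face/hands.
mask = [[False, True,  True,  False],
        [False, True,  True,  False],
        [False, True,  True,  False],
        [False, False, False, False]]

qp = allocate_qp(mask)
print(f"{estimate_bytes_per_sec(qp):.0f} B/s for this toy frame")
```

Under this toy model, a frame encoded entirely at the high-quality QP would cost several times more than the mixed grid, which is the trade-off the article describes.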
The motion-detecting technology identifies whether or not the caller is signing, allowing the phone to conserve battery power when they are not.
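One simple way such activity detection can work, sketched here as an assumption rather than the project's actual method, is frame differencing: if consecutive frames barely change, the caller is probably not signing, and the encoder can drop to a low frame rate to save battery and data. The function names and thresholds below are illustrative.

```python
# Hypothetical frame-difference activity detector. Frames are flat
# lists of grayscale pixel values (0-255).

def is_signing(prev_frame, curr_frame, threshold=8.0):
    """Mean absolute pixel difference above threshold => active motion."""
    diffs = [abs(a - b) for a, b in zip(prev_frame, curr_frame)]
    return sum(diffs) / len(diffs) > threshold

def choose_fps(prev_frame, curr_frame, active_fps=10, idle_fps=1):
    """Encode at full rate only while the caller appears to be signing."""
    return active_fps if is_signing(prev_frame, curr_frame) else idle_fps

still  = [100] * 64                   # unchanged 8x8 frame
moving = [100] * 32 + [160] * 32      # half the pixels changed

print(choose_fps(still, still))       # idle: low frame rate
print(choose_fps(still, moving))      # signing: full frame rate
```

Dropping from 10 fps to 1 fps during idle stretches of a call cuts both the encoding work and the transmitted data by roughly an order of magnitude.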
The team estimates that their technology uses one-tenth the bandwidth of the iPhone 4's FaceTime.
MobileASL can be used on any phone with a front-facing camera (one on the same side as the screen), greatly broadening the potential user base.
The engineers are currently testing MobileASL with 11 deaf or hard-of-hearing students, all academically gifted and hoping to pursue careers in computing.
A larger study is planned for the winter, but the benefits of the technology are already becoming apparent.
"It is good for fast communication," said Tong Song, a Chinese national who is studying at Gallaudet University in Washington, D.C.
"Texting sometimes is very slow, because you send the message and you're not sure that the person is going to get it right away.
"If you're using this kind of phone then you're either able to get in touch with the person or not right away, and you can save a lot of time," said Song.
Across the roughly 200 calls made in the study so far, the average call lasted 90 seconds. The team now wants to extend the testing period to analyse how MobileASL could be used in real-life scenarios.
"We know these phones work in a lab setting, but conditions are different in people's everyday lives," Riskin says.
"The field study is an important step toward putting this technology into practice."