AR App Translates Sign Language In Real Time

AR continues to break down the language barrier as a team of NYU students introduces a new form of augmented sign language translation.
 
If the last decade of sci-fi movies has taught us anything, it’s that the near future is going to be full of futuristic language-translating earbuds, implants, or some other form of high-tech speech conversion.
 
And while we may still be a couple of years away from hands-free, zero-effort translation, three NYU students have in the meantime managed to use current smartphone technology to bring real-time translation to one of the most overlooked forms of communication: sign language.


“We make magic when we pair leading students with outstanding mentors in the Envrmnt team at our AR/VR lab,” said Christian Egeler, Director of XR Product Development for Envrmnt, Verizon’s platform for Extended Reality solutions, in a statement. “We discover the next generation of talent when we engage them in leading-edge projects in real time, building the technologies of tomorrow.”
 
“NYC Media Lab is grateful for the opportunity to connect Verizon with technically and creatively talented faculty and students across NYC’s universities,” stated Justin Hendrix, Executive Director of the NYC Media Lab. “We are thrilled to continue to advance prototyping in virtual and augmented reality and artificial intelligence. These themes continue to be key areas of focus for NYC Media Lab, especially with the development of the first publicly funded VR/AR Center, which the Lab is developing in conjunction with the NYU Tandon School of Engineering.”
 
No word yet on whether we’ll be seeing ASLR on Google Play or the App Store anytime soon, but the team has confirmed plans to pursue a commercial release at some point.

 

Source: VRScout
