Mobile Technology

Augmented reality glasses perform real-time language translation

British computer programmer Will Powell has created a prototype real-time translation system that displays subtitles for the interlocutor's speech in a language of choice (Image: Will Powell)
Two Raspberry Pi computers (bottom) are used to power the subtitle interface and TV screen (Image: Will Powell)
This is all a user would need to carry to use the system - minus the camera on the glasses, which was used to record the demo. The rest is handled by remote servers, reducing the load on the user (Image: Will Powell)
Anglo-Spanish flags with the translation kit (Image: Will Powell)
A transcript of the demo conversation is displayed on a TV screen (Image: Will Powell)
The Vuzix Star 1200 augmented reality glasses showing the translated text as subtitles (Image: Will Powell)

Inspired by Google's Project Glass, computer programmer Will Powell has built a prototype real-time translation system that listens to speech, translates it into one of 37 languages, and then displays the resulting text as subtitles directly onto the user's glasses.

In a nutshell, here's how it all works. A Bluetooth microphone picks up the audio signal and connects to a smartphone or tablet to provide a clean, noise-cancelled audio feed. The speech is then sent to the Microsoft Translator service, which detects the foreign language and translates it into the target language of choice. Finally, the translated text is displayed on the lower half of the glasses – effectively providing real-time subtitles for a conversation in a foreign language.
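To make that flow concrete, here is a minimal Python sketch of the pipeline. It is an illustration, not Powell's code: the speech-to-text step is stubbed out, the display hook is a stand-in for the glasses, and the request targets the current v3 Microsoft Translator Text API (the prototype used an earlier incarnation of the service), with the subscription key and region read from hypothetical environment variables.

```python
# Illustrative sketch only: a stubbed speech-to-text step, a call to the
# Microsoft Translator Text API (v3), and a stand-in "display" step.
import os
import requests

TRANSLATOR_ENDPOINT = "https://api.cognitive.microsofttranslator.com/translate"
SUBSCRIPTION_KEY = os.environ["TRANSLATOR_KEY"]                 # hypothetical env var
SUBSCRIPTION_REGION = os.environ.get("TRANSLATOR_REGION", "westeurope")

def transcribe_audio_chunk() -> str:
    """Stand-in for the speech-to-text step handled on the phone/remote servers."""
    return "Hola, ¿cómo estás?"

def translate(text: str, target_lang: str = "en") -> str:
    """Send recognised text to the Translator service; source language is auto-detected."""
    response = requests.post(
        TRANSLATOR_ENDPOINT,
        params={"api-version": "3.0", "to": target_lang},
        headers={
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Ocp-Apim-Subscription-Region": SUBSCRIPTION_REGION,
            "Content-Type": "application/json",
        },
        json=[{"Text": text}],
        timeout=10,
    )
    response.raise_for_status()
    return response.json()[0]["translations"][0]["text"]

def show_subtitle(text: str) -> None:
    """Stand-in for pushing the text to the lower half of the glasses' display."""
    print(f"[subtitle] {text}")

if __name__ == "__main__":
    heard = transcribe_audio_chunk()
    show_subtitle(translate(heard))
```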

The subtitle interface and the pictured TV display (a non-essential component used in the demo to display the text of the entire conversation) are powered by two Raspberry Pi units - credit card-sized, single-board computers that retail for US$35 each.

The Vuzix Star 1200 augmented reality glasses showing the translated text as subtitles (Image: Will Powell)

The resulting system appears quite responsive, although it may still be a little too slow for real-life applications. Most of the delay in the subtitles comes from the server time needed to process the speech; Powell says that caching the most common expressions has mitigated the problem somewhat, but not solved it.
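Powell doesn't describe the caching in detail, but the idea can be sketched simply: memoise translations of frequently heard phrases so repeat requests skip the server round trip entirely. The snippet below is an assumption about how such a cache might look, reusing the hypothetical translate() helper from the earlier sketch.

```python
from functools import lru_cache

@lru_cache(maxsize=1024)
def translate_cached(text: str, target_lang: str = "en") -> str:
    # Only the first occurrence of a phrase pays the network round trip;
    # repeats of common expressions ("hello", "thank you", ...) are served
    # from the in-memory cache. translate() is the helper from the sketch
    # above - an assumption, not Powell's actual implementation.
    return translate(text, target_lang)
```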

Nifty? Yes. Practical and accessible? Not quite. The augmented reality Vuzix Star 1200 glasses that Powell used for this project – which are completely transparent, like a pilot's head-up display – retail for US$4,999. Moreover, translation quality is likely to suffer unless the interlocutor also wears a headset (or a second pair of glasses) to provide a clean audio feed.

The video below is a short demonstration of the concept. The subtitles mirror the text appearing on the user's glasses.

Source: Will Powell

2 comments
Michael Crumpton
Or you could just have an app on your smartphone that does this. Both Apple and Android phones have done this for years.
ukrauskopf
Great idea - why not spoken translation delivered through the earbuds?