Jul 30, 2012
 

Inspired by Google’s Project Glass, computer programmer Will Powell has built a prototype real-time translation system that listens to speech, translates it into any of 37 languages, and then displays the resulting text as subtitles directly on the user’s glasses.

In a nutshell, here’s how it all works. A Bluetooth microphone picks up the audio and connects to a smartphone or tablet, which provides a clean, noise-cancelled audio feed. The signal is then sent to the Microsoft Translator service, which detects the foreign language and translates the speech into the target language of choice. Finally, the translated text is displayed on the lower half of the glasses – effectively providing real-time subtitles for a conversation in a foreign language.
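The capture–translate–display loop described above can be sketched as a small pipeline. This is an illustrative sketch only: the function names are hypothetical, and the translation step is stubbed out with a local phrase table, where Powell's system calls the Microsoft Translator service over the network.

```python
# Hypothetical sketch of the subtitle pipeline: captured utterances flow
# through a translation step and come out as subtitle lines.
# All names here are illustrative, not taken from Powell's code.

# Stand-in phrase table; a real system would query a remote service.
PHRASE_TABLE = {
    "hola": "hello",
    "buenos días": "good morning",
}

def capture_utterances(source):
    """Stand-in for the Bluetooth microphone feed: yields recognised utterances."""
    yield from source

def translate(utterance):
    """Stub for the remote translation call (here: a dictionary lookup)."""
    return PHRASE_TABLE.get(utterance, "[untranslated]")

def subtitle_stream(source):
    """Run each captured utterance through translation and emit subtitle lines."""
    for utterance in capture_utterances(source):
        yield translate(utterance)

if __name__ == "__main__":
    for line in subtitle_stream(["hola", "buenos días"]):
        print(line)
```

In the real device, each emitted line would be rendered on the lower half of the glasses rather than printed.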

The subtitle interface and the pictured TV display (a non-essential component, used in a demo to show the text of the entire conversation) are powered by two Raspberry Pi units – credit-card-sized, single-board computers that retail for US$35 each.

The resulting system appears quite responsive, although it may still be a little too slow for real-life use. Most of the delay in the subtitles comes from the server time needed to process the audio; caching the most common expressions has, Powell says, mitigated the problem somewhat, but not solved it.

Nifty? Yes. Practical and accessible? Not quite.

Read more . . .

via Gizmag – Dario Borghino