New Google glasses to let you speak any language with real-time AI translation

The language barrier could become a thing of the past when you go on holiday, thanks to new Google glasses which translate what you hear in real time.

During Google's I/O developer summit last night, CEO Sundar Pichai showed off a video demo of new augmented reality glasses which appear capable of live language translation for the wearer.

The video shows the glasses automatically generating captions in English as the wearer hears another language.

These appear on the lenses of the glasses so that you can see what other people are saying while they speak, an advance that could also help people who are deaf or hard of hearing.

At the event, Pichai said: "Language is just so fundamental to connecting with one another, yet understanding someone who speaks a different language or trying to follow a conversation if you're deaf or hard of hearing can be a real challenge."

Google engineer Eddie Chung added: "What we're working on is technology that enables us to break down language barriers – taking years of research in Google Translate and bringing that to glasses."

Google has not confirmed whether the prototypes will ever be released to the general public, but the preview was a good indication of where Google wants to take its augmented reality technology.


It would not be the first time Google has released augmented reality wearables.

In 2014, the tech giant launched Google Glass, which let wearers access augmented reality apps and Android functions through their glasses.

However, Google Glass didn't succeed commercially.

Other companies such as Meta (formerly Facebook) have been making inroads into the augmented reality market.

In September last year, Mark Zuckerberg's company announced its augmented reality Ray-Ban smart glasses.
