What you need to know
- Google teased a new pair of smart glasses that can translate spoken languages on the fly.
- These new glasses look just like a regular pair of glasses, without the obvious "smart" styling that smart glasses users have become accustomed to.
- No release date or price has been announced just yet.
It's been a long time since Google Glass officially launched, but it looks like the company is working on a brand new pair of smart glasses with Google Assistant baked right in. Google teased the tech in a "one more thing" moment at the end of its main Google I/O 2022 keynote address, showing off a pair of smart glasses that looks nothing like what folks have come to expect. Google shied away from using the name Google Glass in the teaser and, if anything, these look a lot more like Focals by North, which makes sense since Google acquired that company nearly two years ago.
While there are many types of smart glasses on the market, Google's latest development looks closer to what has come to be known as "smart assisted reality glasses" than to a pair of full-fledged AR glasses. The difference is that, at least based on the teaser, Google's new smart glasses appear to be used primarily for audio functions, including language translation via a built-in Google Assistant. Google also announced at I/O 2022 that Google Translate would support 24 additional languages, building excitement for what these glasses could do.
In the video, Google showed off the glasses' ability to translate several different languages. That included spoken languages like Chinese and Spanish, which could help translate conversations with customers or family members, or while on an overseas trip. The glasses even appeared to help folks who are deaf or hard of hearing better understand the people around them.
Not only did these glasses look more like a normal pair of glasses than a techy gadget, but they also let users make full eye contact during a conversation instead of looking at a screen. While phones like the Pixel 6 can perform live translation via the onboard Tensor processor, you need to be looking at the smartphone display to see and understand the language you're attempting to translate. A pair of smart glasses like the ones Google showed off could eliminate that unnatural barrier.