Griffin on Tech: Are we ready for Google Glass 2.0?
Google's I/O developer conference took place this week, its first in-person event in two years, with the usual slew of updates to its Pixel line of devices, which are still not available on the New Zealand market.
But buried away in the presentation was a tantalising video suggesting Google is gearing up to have another crack at a new format of technology: augmented reality (AR) glasses. Google's CEO Sundar Pichai teased a video of high-tech smart glasses that translate languages in real time, with a display overlaying the lens of the glasses to show the translated text. There's no camera visible on the glasses, but clearly a microphone is required to pick up a person's voice so it can be translated.
This is a visual version of the language translation feature Google previously built into its Pixel earbuds. The thick-rimmed glasses shown off this week are still in the prototype phase and Google hasn't announced when we might expect to see them on the market.
But the low-key video suggests Google is ready for a second attempt to socialise the idea of us wearing augmented reality glasses, ones that do more than just translate conversations, giving us a digital overlay on the world.
Google's language translation glasses
After all, most of Google's rivals are aggressively pursuing AR wearables. Apple is widely rumoured to have AR glasses in development and Meta, through its Oculus division, has a number of AR and virtual reality devices in the works as it attempts to lead us into the virtual world of the metaverse.
This week, CEO Mark Zuckerberg showed off Project Cambria, a prototype of a mixed reality headset that lets wearers interact with virtual objects overlaid onto their real-life environments.
Zuckerberg is going all-in on such technologies as the interfaces to his metaverse that go beyond the 2D screens of our phones and computers. Whether we are ready and willing for that, given Meta's track record with Facebook, is another question entirely.
Google has proceeded into the mixed reality world much more cautiously since its co-founder Sergey Brin unveiled Google Glass at I/O in 2012. It was a product ahead of its time, launched with a flamboyant stunt that saw skydivers wearing the glasses jump out of a plane over the conference venue, using the cameras embedded in Google Glass to record their view of the world.
It was very flashy and impressive. But then Google Glass crash-landed back to reality. The hardware was glitchy and had a narrow range of uses. At US$1,500 a pair, it was too expensive. But the real killer was the camera, which, it soon became obvious, could be used by the wearer to discreetly record everything in front of them. With a wink of an eye, you could take a photo. Even though a visible prism lit up when a Glass user was videoing or photographing, the feature led to a backlash.
Owners wearing the distinctive glasses were dubbed "Glassholes" and restaurants asked them to remove them. A device that appeared to be on the cusp of becoming a mainstream replacement for the mobile phone instead retreated to the sidelines as a novelty device for AR developers to tinker with.
Google has gone on to do some very clever stuff in the AR space with the Google Lens feature of smartphones. The reality is that the camera which caused such consternation about Google Glass is a crucial feature for interpreting the world around us and making a pair of AR glasses work. Imagine being able to just glance at the Embassy Theatre as you walk by and get an overlay of the movies currently showing.
I/O also saw the debut of a 'near me' multi-search query that would seem to be well-suited to augmented reality glasses. Using it, you can take a photo of an object with your phone camera and find out where you can get it locally. Eventually, that could be done just by looking at it through the lenses of a pair of AR glasses.
But Google is clearly taking a slow and steady approach to its Glass successor, fearing another wave of negative sentiment. The world has moved on a lot in the decade since Google Glass. We are more aware of the fact that we can be recorded wherever we are. More fundamentally, many of us are looking for a better experience than the limited form factor of the smartphone screen, which hasn't fundamentally evolved in a decade, apart from becoming foldable.
I can see AR glasses breaking through in niche areas on the way to mainstream adoption. My 89-year-old father-in-law is deaf, and no hearing aids are going to fix that. Not being able to participate in conversations is incredibly frustrating for him. If he could get subtitles appearing on his glasses, it would radically improve his sense of well-being.
So Google has a genuine opportunity to improve people's lives with this technology. But Google will need to proceed carefully to avoid the perception that AR glasses represent just a more immediate way to gather data points from us to fine-tune its digital advertising algorithms.
Interesting, such glasses. Especially for folk who don't need any. Can the gee-whiz stuff be projected onto existing glasses, or prescription lenses fitted to Google Glass(es) 2.0? Journalists should ask such questions too, methinks. As you said, the impaired may well have a need for augmented information.