
Google Lens: Everything you need to know

Google Lens was one of the major announcements of the Google I/O 2017 keynote, and an important part of Google's Pixel 2 phone plans. For Google, a company with a long history in visual search, Lens is the latest step in an ongoing journey around computer vision. This is an endeavor which can be traced back to Google Image Search years ago, and which is a close relative of the AI powering Google Photos' object and scene recognition.

For the moment, Google is only talking about a "preview" of Lens shipping on Pixel 2 phones. But as a part of Google Assistant, Google Lens has the potential to reach every Android phone or tablet on Marshmallow and up, letting these devices recognize objects, landmarks and other details visually (with a little help from your location data) and conjure up actionable information about them. For example, you might be able to identify a certain flower visually, then bring up info on it from Google's knowledge graph. Or Lens could scan a restaurant in the real world and bring up reviews and photos from Google Maps, identify a phone number on a flyer, or read an SSID and password off the back of a Wi-Fi router.

Whether it's through a camera interface in Google Assistant, or after the fact through Google Photos, the strength of Lens — if it works as advertised — will be the accurate identification and the ability to provide useful info based on that. It's absolutely natural, then, that Lens should come baked into the camera app (and Photos itself) on the new Pixel 2 and Pixel 2 XL smartphones.

Big, BIG data


Like all the best Google solutions, Google Lens is rooted in big data. It's ideally suited to Google, with its vast reserves of visual information and growing cloud AI infrastructure. Doing this instantly on a smartphone is a step beyond running similar recognition on an uploaded image via Google Image Search, but the principles are the same, and you can draw a straight line to Google Lens, starting with Image Search and going through the now-defunct Google Goggles.

Back in 2011, Google Goggles was futuristic and, in the right setting, genuinely impressive. In addition to increased speed, Google Lens goes a step further by not only identifying what it's looking at, but understanding it and connecting it to other things that Google knows about. It's easy to see how this might be extended over time, tying visible objects in photos to the information in your Google account.

This same intelligence lies at the heart of Google Clips, the new AI-equipped camera that knows when to take a photo based on composition, and what it's looking at — not unlike a human photographer. That all starts with understanding what you're looking at.


At a more advanced level, Google's VPS (visual positioning system) builds on the foundations of Google Lens on Tango devices to pinpoint specific objects in the device's field of vision, like items on a store shelf. As mainstream phone cameras improve and ARCore becomes more widely adopted, there's every chance VPS could eventually become a standard Lens feature, assuming your device hits a certain baseline for camera hardware.

What can Google Lens do on the Pixel 2?

Google is calling the version of Lens on Pixel 2 phones a "preview" for the time being, and it's obvious the company has ambitions for Lens far beyond its current implementation on these handsets. At the October 4, 2017 presentation, Google demonstrated identifying albums, movies and books based on their cover art, and pulling email addresses from a flyer advertisement.

Those are relatively simple tasks, but again, Google surely wants to start small, and avoid the pitfalls experienced by Samsung's Bixby service in its early days.

More: Google Pixel 2 preview

How is Google Lens different to Bixby Vision?


On the surface the two products might appear very similar — at least to begin with.

However, the potential for Google Lens is only going to grow as Google's capabilities in AI and computer vision become stronger. And the contrast with one of Samsung's most publicized features is pretty stark. The Korean firm is still a relative newcomer in AI, and that's reflected in the current weakness of Bixby Vision on the Galaxy S8 and Note 8.

Right now Bixby can help you identify wine (badly), flowers (sometimes) and products (to varying degrees of success) — through Vivino, Pinterest and Amazon, respectively. Samsung doesn't have its own mountain of data to fall back on, and so it has to rely on specific partnerships for various types of objects. (The service routinely warns you that it's "still learning" when you first set it up.)

What's more, while Samsung can (and apparently does plan to) bring Bixby to older phones via software updates, Google could conceivably flip the switch through Assistant and open the floodgates to everything running Android 6.0 and up. Both services are going to require some more work before that happens, though.

Nevertheless, anyone who's used Bixby Vision on a Galaxy phone can attest that it just doesn't work very well, and Google Lens seems like a much more elegant implementation. We don't yet know how well Lens will work in the real world, but if it's anywhere near as competent as Google Photos' image identification skills, it'll be something worth looking forward to.

We'll have more to say on Google Lens when we test it in more detail on the Pixel 2 phones.

Alex Dobie

Alex is global Executive Editor for Android Central, and is usually found in the UK. He has been blogging since before it was called that, and currently most of his time is spent leading video for AC, which involves pointing a camera at phones and speaking words at a microphone. He would just love to hear your thoughts at alex@androidcentral.com, or on the social things at @alexdobie.

27 Comments
  • I wonder if Google got this idea when they were helping Samsung with the development of Bixby.
  • Or when they released a similar product six years ago...
  • It's definitely an overhauled version of Goggles.
  • What makes you think Google helped Samsung develop Bixby?
  • I think you meant "I wonder if Samsung was disappointed to learn that Google didn't leave Goggles out to pasture when they thought they had time to perfect Bixby."
  • Lol
  • Is this going to be a separate app we can download or will it just be baked into assistant and photos
  • Baked in
  • Is this also going to be able to scan QR codes?
  • Bixby is a joke just like every other half baked software gimmick Samsung comes up with.
  • At least folks now have a reason to use the Bixby button after remapping.
  • Yep. Remapping makes that button super useful!
  • But slow.
  • If you rooted your GS8 you can remap the button on a system level... it's becomes faster than using the softbutton
  • You don't have to root to make it fast. I use bxActions in control mode and disable bixby home with adhell. Response is instant and I'm not rooted.
  • Not surprised that Google does better job at software and especially with this AI stuff. Samsung will always have a broken experience but probably doesn't matter for the regular people who buy Galaxy's.
  • Samsung is the king of hardware, and that's what people see. They'll never match Google software though.
  • So - so - true... Edit - gads - I just realized I replied to a post that is 5 months old... Sigh...
  • At least Bixby is actually USABLE in countries where the Galaxy S8. Knowing Google, this here AMAZING service will be "US only" at first, and then remain US only for MONTHS, and then it'll be released to some EU member nation, and then India in two year's time, and then stop rolling out completely.
  • *where the Galaxy S8 can be bought. I hate that the "Edit" button is not working on the mobile site.
  • Bixby is a joke. Does anyone actually use it?
  • Finally! I've been waiting years for a better, updated version of Goggles. :D
  • Given the degree of data promiscuity required, and the inability to opt out of its retention or to manage the access to it, this is another no sale. Even if you think you can trust Google not to do evil with the data collected, the reality is that they will disclose it to various government actors, many of whom are quite happy to do evil things.
  • Your tinfoil hat is on crooked...
  • Bixby Vision hasn't been that bad in my experience — honestly, Bixby as a whole hasn't been as bad as people make it out to be. It doesn't have the massive infrastructure in maps, its own search engine, etc... that Google does, but for navigating the phone via voice it beats Assistant in every way. Using a combination of the two is the best way to go about it imo. That being said, because Lens taps Google's search engine at a much deeper level than any other competitor, it'll probably function the best.
  • Interesting read. I was thinking about what was different between Lens and Google Goggles instead, but I guess Bixby Vision is a more current comparison.
  • Give a look to Image Analysis Toolset. An artificial neural network based computer vision app, less gimmicky than Google Lens but more practical and more oriented to real daily use.