How to use Google Lens in Google Photos

Google Lens on a Pixel 4a (Image credit: Alex Dobie / Android Central)

Google Lens has been an instrumental part of your phone's camera for half a decade, and it has been built into Google Photos for a few years as well. That means even when there's no time to use Lens in the moment — such as when some weird-looking snake gets into the garden — you can come back later and use its AI object identification, text transcription, and context recognition to figure things out, like whether the snake you killed was a harmless rat snake or a venomous copperhead.

Here's how to use Google Lens inside the Google Photos app on your phone.

How to use Google Lens in Google Photos

  1. Open the Google Photos app.
  2. Tap the photo you want to pull text or item info from.
  3. Tap the Lens icon in the bottom bar.
  4. Once Lens has analyzed the image, tap an object dot.

  5. Swipe up to expand the details or context of the identified object.
  6. To copy text, tap any highlighted text string.
  7. Drag the blue highlighter ends to the beginning and end of the text you want to copy.
  8. Tap Copy text or Copy to computer.
  9. When you're done with Google Lens, tap the X in the top left corner.

Google Lens can identify a wide variety of objects, places, landmarks, and even people. It can recognize millions of CD and vinyl albums, movie posters, plants, animals, insects, foods, and even which popular franchise that funny T-shirt is from. You can use it to transcribe text as well as translate it, and since it works with any photo you upload to Google Photos, that makes Google Photos an easy way to upload and transcribe your grandma's old 4x6 recipe cards.

The next time you see something you don't understand or recognize out in the real world, snap a pic and use Google Lens at your leisure rather than when you're standing right there blocking the entrance to that new Asian fusion restaurant. Of course, it always helps if your camera takes the best picture possible, so if your phone's camera isn't that great, you might want to invest in one of the best Android camera phones.

Ara Wagoner was a staff writer at Android Central. She themes phones and pokes YouTube Music with a stick. When she's not writing about cases, Chromebooks, or customization, she's wandering around Walt Disney World. If you see her without headphones, RUN. You can follow her on Twitter at @arawagco.

7 Comments
  • 9 out of 10 times I get "Hmm, not seeing this clearly". The one time it worked, it only recognized a work of art because the label text was also in the photo, which it selected instead.
    So, not very useful at the moment...
  • I used it a couple of weeks ago because a friend forgot what kind of plant she had planted. Took a picture of the leaf, hit the lens button and got the full details. Was similar to hydrangeas.
  • Hmmm. I'm still waiting for it on my Huawei Mate 9 Pro. 🤔
  • I took a photo of my Google Home Mini. Google Lens said it was a poppy seed bagel. 😒
  • Couldn't recognize an egg or a keyboard; it called both "office equipment." But I think it's because Google knows my location is the office.
  • Lens could be better utilized as a dedicated app (or an optional lens mode within the Google search app) instead of living inside the Photos app. For example, Google Translate gives you live results through the camera without having to take a pic/snapshot of what you're looking at. The processing power to translate text live from the camera, without snapping a photo and running OCR on it later, is already there; it's already being done. Open Google Translate, select your From and To languages, select the camera, and everything your camera sees is translated live in real time without taking a pic. The same could be done with image searches: point your camera at an object and get live links to what you're looking at without taking a pic or snapshot. Dunno why this is treated as something new and incredible when processing images for live language translation should take more processing power than identifying basic objects.
  • This is an article on how you can use Lens from Google Photos. I guess you're not aware that there is a separate Lens app that works the way you're describing.