Like a pair of sneakers someone’s wearing? Or maybe a dress? There are quite a few apps and services — like Amazon’s Firefly or Samsung’s Bixby Vision — that let you point your smartphone camera at an object and search for it, or for similar styles. Google is adding the same capability to Google Lens, and it has the potential to reach far more people.

Google Lens is currently built into the Google Assistant on Android phones, as well as Google Photos. It lets you point the smartphone camera at objects to identify them, learn more about landmarks, scan QR codes, pull contact information from business cards, and more. At its annual Google I/O developer conference, the search giant announced four new improvements to Lens, and we got to try them out.

Google Lens is now built into the camera app on phones from 10 manufacturers: LG, Motorola, Xiaomi, Sony, Nokia, Transsion, TCL, OnePlus, BQ, and Asus. That count doesn’t include Google’s own Pixel 2. You can still access Lens through Google Assistant on all Android phones.

We got a chance to try it out on the recently announced LG G7 ThinQ, and the new option sits right next to the phone’s Portrait Mode.

The biggest addition to Lens in this I/O announcement is Style Match. Like Bixby Vision or Amazon Firefly, it lets you point the smartphone camera at certain objects to find similar items. We pointed it at a few dresses and shoes, and it surfaced similar-looking items, if not the exact same product. Once you find what you’re looking for, you can purchase it directly through Google Shopping, if it’s available there.

It’s relatively quick, and an easy way to find things you can’t quite describe in the Google Search bar.

Perhaps even more useful is Smart Text Selection. Point Google Lens at text — say, in a book or on a menu — and it can single out the text from everything else. You can then tap on the text to copy or translate it. When we tried it, Lens managed to grab three entire paragraphs of text, though we’d have to do more testing to see how well it picks up handwriting.

Google Lens now works in real time, so you don’t need to pause and take a photo for it to understand the subject. That means you can point it at several things and you’ll see colored dots appear on the objects it’s gathering information about. Google said Lens can identify billions of words, phrases, and things in a split second, all thanks to “state-of-the-art machine learning, on-device intelligence, and cloud TPUs.”

Google said it will be rolling out all of these features toward the end of May.
