Google Assistant is getting smarter. While the digital assistant has traditionally only used the microphone to hear, now it’ll also use the phone’s camera to see. That’s thanks to Google Lens, which, after some testing, is now rolling out to all users of Google Pixel phones.

Google announced the news in a blog post, and while the rollout was expected, it is exciting nonetheless. Google Lens promises to apply Google’s machine learning expertise to whatever the phone can see through its camera. Lens was first announced at Google I/O in May.

“Looking at a landmark and not sure what it is? Interested in learning more about a movie as you stroll by the poster? With Google Lens and your Google Assistant, you now have a helpful sidekick to tell you more about what’s around you, right on your Pixel,” said Google in its blog post.

That will manifest in a number of different ways. Previously, Google Lens was available through Google Photos, but users had to take a photo, switch apps, and hit the Lens button. Lens on Google Assistant promises to be not only more intuitive, but also smarter. According to Google, the feature will let users save information from a business card, follow links, and recognize objects. You can also point Lens at a movie poster for information about the movie, or at a landmark like the Eiffel Tower to learn more about it and its history. Last but not least, Assistant can look up products through barcodes.

Of course, we’ll have to wait and see how it all works once it’s rolled out, but the good thing about Google Lens is that it doesn’t really rely on a great camera — it depends more on software, so it can be updated and improved over time.

Google Lens is currently rolling out to Pixel phones in the U.S., U.K., Australia, Canada, India, and Singapore. Google says it will roll out “over coming weeks.” When it is finally available on your phone, you’ll see the Google Lens logo at the bottom right-hand corner of your screen after you activate Google Assistant.
