Google Lens, a machine-learning-powered image analyzer, was first announced at I/O 2017. It currently exists in both Google Photos (for scanning existing photos) and Assistant (for scanning in real time), but both versions have so far required a Pixel phone. Alongside some ARCore announcements, Google revealed that Lens will roll out to more Android devices and make an appearance on iOS.
Lens in Google Photos will soon be available to all English-language users on both Android and iOS. You’ll be able to scan your photos for landmarks and objects, no matter what platform you use. In addition, Lens in Assistant will start rolling out to “compatible flagship devices” over the coming weeks. The company says it will add support for more devices as time goes on.
Google Lens in Assistant
Beyond the expanded compatibility, Lens is receiving some functionality improvements. Google says Lens will soon be able to recognize common animals and plants, such as specific dog breeds and flower species. You can find the original announcement from Google at the source link below.
The official Twitter account for Google Photos says Lens should show up for everyone within the next few days, provided you have the latest version of the app:
Rolling out today, Android users can try Google Lens to do things like create a contact from a business card or get more info about a famous landmark. To start, make sure you have the latest version of the Google Photos app for Android: https://t.co/KCChxQG6Qm
Coming soon to iOS pic.twitter.com/FmX1ipvN62
— Google Photos (@googlephotos) March 5, 2018