Google on Tuesday announced that the Google Lens feature, which was introduced at I/O 2017, will be making its way to Google Assistant on Pixel phones over the coming weeks. The rollout covers six countries, namely India, the US, the UK, Australia, Canada, and Singapore. The feature was first spotted rolling out earlier this month.
Google Lens will be available to users within the Assistant and can be used on Pixel smartphones set to the English language. The feature was previously available only in Google Photos and, after Assistant, is expected to make its way into the camera and other apps.
Users can start using Google Lens in Google Assistant by tapping the Lens icon in the bottom right corner. Essentially, Google Lens is a set of vision-based computing capabilities that can understand what a user is looking at and take actions based on that information. Google has already claimed that the feature will work with places like restaurants, where pointing your phone at one will allow Google Lens to bring up relevant information, including ratings and reviews.
Google explains that users can use Google Lens alongside Assistant to save text information from a business card, follow URLs, call phone numbers, and navigate to addresses. It can also help users find and recognise landmarks while exploring a new city with the help of Assistant. Google Lens can also give information about a piece of art, a book, or a movie when users simply point the camera at it using the Lens feature. With Google Lens, users can also look up products by barcode or scan QR codes. For those unaware, Google Lens is also available in Google Photos, which means users get contextual information about what's in a photo.