At I/O 2019 earlier this month, Google announced that it would bring new “filters” to Google Lens. The filters are designed to make it easier to point at a restaurant menu and get recommendations on the most popular dishes. These features are now making their way to some users, alongside a revamp of the user interface. If you have received the update, the redesigned Google Lens can be accessed via Google Assistant or Google Photos.
The redesigned Google Lens comes with a total of five modes – Auto, Translate, Text, Shopping and Dining. Users can scroll through them to refine detection and look for specific search results. The first option, according to AndroidPolice, works like the previous version, where Lens automatically recognizes and detects objects in the camera view. The Translate and Text options are self-explanatory: they let you translate or copy text on the screen.
The last two options are new additions, and they will come in handy at shopping malls or restaurants. In Shopping mode, users can scan a barcode, and Lens will automatically recognize items like clothing or furniture so they can be looked up online. The Dining mode will go through a menu and provide recommendations, and it can also be used to split the bill when you are at a restaurant.
These new modes appear in the form of a carousel at the bottom of the screen, with a shutter button to capture an image or start automatic image recognition. There is also an icon at the top-right corner that lets users pick an existing image from their smartphone’s photo gallery. Another change is the option to specify which portion of the screen or image you want Lens to analyze. The revamped Lens interface is rolling out as part of Google app version 9.91 beta and is already being pushed to Google Pixel and Samsung devices.