Announced last year at I/O 2017, Google Lens is becoming a potent visual search tool in its own right. At Google I/O 2018, the search giant announced new features coming to Google Lens, including a new look and integration with the native camera application. In a nutshell, Google is bringing three new features to its visual search application.
With the update, Google Lens adopts the rounded Material Theme look already used by many of Google’s other applications and services. Lens will be integrated into the native camera app on the Pixel, LG G7 ThinQ and other smartphones. It also gets a persistent bar at the bottom that slides up whenever results are loaded, with a microphone icon to the right. The three main features coming to Google Lens this summer are Smart Text Selection, Style Match and real-time results.
Smart Text Selection
With Google Lens, users can not only look at things in real time but also initiate actions on them. For example, Google Lens can now scan a page containing words and let you copy that text and paste it into another application. While scanning, Lens will also surface relevant information and photos. “Say you’re at a restaurant and see the name of a dish you don’t recognize, Lens will show you a picture to give you a better idea,” Google explains.
The second feature is called Style Match, which lets Lens find and offer similar, matching objects rather than just the exact item being viewed. It will come in handy for home decor or for picking your next outfit.
Last but not least among the key features is real-time results, which means users no longer need to specifically select an item and wait for results to load. In addition, the real-time results are anchored to the object you are looking at, even as you move the camera’s viewfinder.
Google says it uses both on-device machine learning and Cloud TPUs, with billions of words, phrases and places identified in a “split second.” The Google Lens update will roll out in the coming months and will be integrated into the native camera on Google Pixel and devices from LGE, Motorola, Xiaomi, Sony Mobile, HMD Global, Transsion, TCL, OnePlus, BQ, and Asus.