Google Lens is moving into the camera app and gaining new features

Google introduced a new version of Google Lens in beta, the company announced today at its annual developer conference. The new version will be built directly into the camera app, instead of living only inside Google Photos, and will roll out in the coming weeks.

As part of the latest update, Google Lens will be integrated into the native camera apps of ten different Android devices, including the Google Pixel 2 and the LG G7 ThinQ, so there is no need to open a separate application. There will also be real-time search that analyzes what your camera sees even before you take a picture. Point your camera at a poster of a musician, for example, and Lens can start playing one of their music videos.

Open the camera app and Google Lens will tell you what's in the frame. The image recognition tool can give users more information about things like books, buildings and works of art. It works by running the pixels of the photo you take through machine-learning models, which return more details about the subject along with relevant search tags.
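Google has not published Lens's internals, but the general "pixels in, labels out" flow it describes is similar to what Google's public Cloud Vision API exposes. The sketch below is an assumption for illustration only (it uses Cloud Vision label detection, not Lens itself, and the file name poster.jpg is hypothetical):

```python
# Minimal sketch: send an image to a label-detection model and print the
# returned tags, roughly the "search tags" behavior the article describes.
# Requires the google-cloud-vision package and API credentials.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Hypothetical local photo standing in for a frame captured by the camera.
with open("poster.jpg", "rb") as f:
    content = f.read()

image = vision.Image(content=content)
response = client.label_detection(image=image)

# Each annotation carries a human-readable label and a confidence score.
for label in response.label_annotations:
    print(f"{label.description}: {label.score:.2f}")
```

On-device Lens presumably runs much lighter models in real time rather than a round trip to a cloud endpoint, but the input/output shape (an image in, ranked labels out) is the same idea.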

Google is also leaning further into retail with this Lens update: instead of simply identifying clothes, it will sometimes provide shopping links if it recognizes the brand or style. Lens can now recognize words as well, so you can copy text from the real world and paste it on your phone, much like Google Translate's image mode.
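The copy-and-paste feature is essentially optical character recognition applied to the camera frame. Again as an illustrative assumption rather than Lens's actual pipeline, the same Cloud Vision client offers a text-detection call that returns the recognized text ready to place on the clipboard (the file name sign.jpg is hypothetical):

```python
# Minimal sketch: OCR a photo and print the full recognized text,
# analogous to Lens's "copy text from the real world" feature.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("sign.jpg", "rb") as f:
    content = f.read()

image = vision.Image(content=content)
response = client.text_detection(image=image)

# The first annotation aggregates all detected text in the image.
if response.text_annotations:
    print(response.text_annotations[0].description)
```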

There is also a crossover between Google Lens and Google Maps this time, which adds an AR layer on top of Street View imagery and helps you navigate in real time.
