Google will shortly be expanding the reach of Google Lens, its visual search interface. In a blog post, the company announced Lens would be integrated into the Google Assistant in the coming weeks. The feature is still exclusive to Pixel phones, but now it should be a lot easier to access.
Google Lens came out in beta on the Google Pixel 2, which launched last month. The service is fundamentally a revamp of Google Goggles: you take a picture of something, run it by Google's machine vision algorithms, and Google will try to tell you what's in the picture. Google says Lens can identify text, landmarks, and media covers, but those were all things Goggles could do years ago. We tried Lens on the Pixel 2 at launch, and while it was clearly a beta with a lot of problems, it occasionally did something impressive, like noticing not just that a picture contained a dog, but also nailing the dog breed.
Google says Assistant integration will allow you to get "quick help with what you see." This sounds like a big improvement over the current beta of Google Lens, which is only integrated into Google Photos. Doing any kind of recognition through the Photos app is really slow, since you have to open the camera app, aim it at something, take a picture, open the picture, and then run it by Lens. The new home for Lens will be a lot easier: you just open the Assistant and tap on the Lens icon in the bottom right corner.
Google says the Lens-in-Assistant integration will be coming to "Pixel phones set to English in the US, UK, Australia, Canada, India and Singapore over the coming weeks."