On May 17 at Shoreline Amphitheatre, CA, Sundar Pichai announced a mind-blowing feature at the keynote, one that gave goosebumps and reminded the old souls of Google Goggles from way back in 2010. Let me put it this way: if you want to see where Google is headed, look through their new Google Lens. Well, yeah!
Untying the knot: Lens is not a hardware upgrade; rather, it's a behind-the-scenes software feature that embeds into your pre-existing Android device. In a single sentence, it can recognise text and objects from a picture or the camera, analyse and contextualise what it sees in real time, and share that info with you quickly.
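Under the hood, that kind of recognition boils down to turning pixels into features and comparing them against labelled examples. Here's a minimal, purely illustrative Python sketch; the feature function, the reference set and the labels are all invented for this post, nothing like Google's actual pipeline. It collapses an image into a tiny average-colour feature vector and labels a new picture by its nearest labelled neighbour:

```python
from math import dist  # Euclidean distance between two points (Python 3.8+)

def features(image):
    """Collapse an image (rows of (r, g, b) pixels) into one average-colour vector."""
    pixels = [p for row in image for p in row]
    n = len(pixels)
    return tuple(sum(px[i] for px in pixels) / n for i in range(3))

# Hypothetical labelled references standing in for a trained model's knowledge.
REFERENCES = {
    "banana": (220.0, 200.0, 40.0),
    "sky":    (120.0, 180.0, 240.0),
    "grass":  (60.0, 170.0, 70.0),
}

def recognise(image):
    """Return the label whose reference features sit closest to the image's."""
    vec = features(image)
    return min(REFERENCES, key=lambda label: dist(vec, REFERENCES[label]))

# A tiny 2x2 "photo" that is mostly yellow should land on "banana".
photo = [[(230, 205, 30), (210, 195, 50)],
         [(225, 200, 45), (215, 198, 35)]]
print(recognise(photo))  # → banana
```

Real systems swap the average-colour vector for features learned by a deep neural network, but the nearest-neighbour idea is the same.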
It sounds pretty dry on paper, but Google brings in a terrific trio to pull it off. And no, it's not Iron Man, Hulk and Captain America. Lens supercharges machine learning, vision-based computing and artificial intelligence to describe what it sees to the fullest!
The feature is triggered by a simple long-press on the home button, which starts Google Assistant. From there it's a walk in the park: let Google do the tough work of capturing the image, matching it against all the images they already have, and finally telling you the answer. It's that simple.
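That capture-then-match flow can be sketched in a few lines. This is a toy under loud assumptions: the `KNOWN_IMAGES` store and every function name here are invented for illustration, and it matches by exact fingerprint, whereas the real service matches fuzzily with machine learning. Still, the shape of the workflow is the same: fingerprint the capture, look it up, report the answer.

```python
import hashlib

# Invented stand-in for Google's enormous labelled-image store:
# maps an image fingerprint to a human-readable answer.
KNOWN_IMAGES = {}

def fingerprint(image_bytes: bytes) -> str:
    """Reduce raw image bytes to a compact, comparable fingerprint."""
    return hashlib.sha256(image_bytes).hexdigest()

def index(image_bytes: bytes, answer: str) -> None:
    """Pretend-ingest a labelled image into the store."""
    KNOWN_IMAGES[fingerprint(image_bytes)] = answer

def lens_query(image_bytes: bytes) -> str:
    """The walk-in-the-park part: match the capture and return the answer."""
    return KNOWN_IMAGES.get(fingerprint(image_bytes), "No match found")

# Simulate the flow: the store already knows one "photo"...
index(b"\x89PNG...landmark", "Eiffel Tower, Paris")
# ...and the user long-presses home and points the camera at the same scene.
print(lens_query(b"\x89PNG...landmark"))  # → Eiffel Tower, Paris
```

Swap the exact hash for a learned visual similarity score and you have the gist of what happens between your long-press and the answer on screen.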
Isn’t that a great feature? I know, right? For more info, check the Google Developers website. Thanks for sticking with us!
Stick with us a little longer for more updates, and subscribe and allow notifications to get updated right from the comfort of your couch! Cheers.