Now You Can Google In Real Life!

During the recently concluded Google I/O 2017 conference, we got a glimpse of the future of search, and it is as exciting as anyone could have hoped.

To get an idea of what has everyone so hyped, try to answer this easy question: What do you know about the retail store directly across the street from you?

Google Lens, which was unveiled on-stage during the conference, promises to change the way anyone would set about answering that question.

Here are a few things that this neat new feature can do:

  • Provide reviews and other information about retail stores and restaurants when you simply point your camera at the storefront.
  • Identify the species of a flower in front of you when you view it through your phone's camera.
  • Read a Wi-Fi network's password through your camera and automatically log you into that network.

Your smartphone is about to get a lot more powerful.

But even more interesting is what this new feature tells us about Google as an “AI first” company. "All of Google was built because we started understanding text and web pages. So the fact that computers can understand images and videos has profound implications for our core mission," explained CEO Sundar Pichai during his introduction of Lens.

The company plans to integrate Lens with Google Photos and its AI software, Assistant.

Through the former, Lens is set to use machine learning to analyze your photos better than ever before and perform a series of personalized tasks: recognizing the people in them, suggesting edits, and prompting you to send those photos to the right people. It can also invite others to send their pictures of you back to you.

Lens effectively brings Google search to the physical world through your phone's camera. And if the company's plan to rethink all of its products around the power and depth of machine learning is anything to go by, it is safe to imagine a future in which googling something has very little to do with typing terms into a search bar.

Connected everywhere, literally.

Much more than another lifeless gimmick thrown into the battle for supremacy forever raging among tech giants, this new feature will actually be connected to you, its user, across your different devices. That way it can learn about you and your habits and customize its results accordingly.

The marriage of AI and machine learning is not exactly new. If you are like the 99.9% of the human population that goes online, you have likely already experienced the results of that marriage in the digital space. Among other things, it is the reason you only see adverts related to your most recent online searches.

Usage in the Real World.

However, when thrust into the real world, this combination of AI and machine learning becomes even more powerful, because in everyday life two people looking at the same object may be after two very different things.

For example, imagine an event planner and a botany student at a nearby university both looking at the same flower through their phone cameras. While the former may receive results on how to use the flower to great effect in a bouquet or at an event, the botany major may receive the most relevant scientific information, such as that flower's scientific name.
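
To make the idea concrete, here is a minimal, purely illustrative sketch of how one recognized object could be ranked differently for different users. The labels, candidate results, and ranking rule below are hypothetical assumptions made for the sake of the example, not Google's actual Lens or Search machinery.

```python
# Illustrative sketch only: the candidate results, topic tags, and ranking
# rule are hypothetical, not Google's actual Lens pipeline.

# Candidate results a Lens-style search might surface for a recognized flower.
CANDIDATE_RESULTS = {
    "peony": [
        {"title": "Peony bouquet arrangement ideas", "topics": {"events", "design"}},
        {"title": "Paeonia lactiflora: taxonomy and morphology", "topics": {"botany", "science"}},
        {"title": "Where to buy peonies in bulk", "topics": {"events", "shopping"}},
        {"title": "Peony growing conditions and soil pH", "topics": {"botany", "gardening"}},
    ]
}

def personalized_results(label, user_interests, top_k=2):
    """Rank candidate results for a recognized object by overlap with the user's interests."""
    candidates = CANDIDATE_RESULTS.get(label, [])
    ranked = sorted(
        candidates,
        key=lambda result: len(result["topics"] & user_interests),
        reverse=True,
    )
    return [result["title"] for result in ranked[:top_k]]

if __name__ == "__main__":
    # The same flower, two different users, two very different answers.
    print(personalized_results("peony", {"events", "design"}))    # event planner
    print(personalized_results("peony", {"botany", "science"}))   # botany student
```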

One object, many possible results. This is the future of search, and it promises to be an exciting one.