Evolution of How We Use Google Search
September 29, 2022

The evolution of Google Search speaks volumes about how efficiently we can now search the internet for answers to our questions. Googling started out as a text search within a basic browser window, but the advent of machine learning and AI algorithms has made things more convenient and remarkably accurate.
Beyond making Google’s search engine feel more human, these advances let users search with the camera or their voice instead of the keyboard. The company is trying to create search experiences that are more like our minds and as multidimensional as we are.
Ideally, the company wants you to be able to find what you want using a combination of images, sounds, text and speech. Users should be able to ask questions with fewer words, or none at all, and Search should still find what they want.
In 2017, Google introduced Lens, a tool that taps into the company’s vast search knowledge by analyzing images captured with a smartphone camera. It can be extremely helpful when looking for a particular object, like a shirt, online. Apparently, about 8 billion questions are answered this way every month.
Expanding on this, Google introduced multisearch, which combines images and text context to find what you are looking for. For example, you could take a picture of a blue shirt and ask Search to find something similar but in green, much the way a human would ask another person a question and expect a concise, coherent answer.
Multisearch is still available only as a beta in the U.S., and there’s no word yet on its expansion to other countries. But like most such features, it will likely become available worldwide once most of the kinks have been worked out.
Google Translate is one of the most helpful tools for travelers, as well as for people accustomed to purchasing foreign goods. Through the use of AI, the tool can translate text into a language the user understands – but this is old news.
An evolution of this function is the system’s ability to recognize foreign words through a smartphone camera, then translate them and overlay the result on the original. This lets users understand storefronts, menus, signs and more.
As time goes by, Google Search will continue to evolve to a point where we can interact with it as we would with another person. Admittedly, we are already halfway there, and continued advances in AI should allow us to reach that goal fairly quickly.