Named for its accuracy and speed, Google's latest algorithm, "Hummingbird," redefines organic search on the internet. Its two key features, conversational search and semantic search, differentiate it from its predecessors, making it the search engine of choice for the hands-free mobile devices of the present and future.
Although reintroduced in its new avatar, conversational search (asking Google a question as you would in a conversation, as opposed to typing your search phrase into the search box) really began in 2011 as voice search. Widely adopted on mobile devices for its obvious convenience, it led to Google adopting semantic search as Hummingbird's search logic in 2013.
Semantic search is the ability to search by understanding language: the meaning of spoken words based on context and the searcher's intent. In semantic search, words alone matter less; they are symbols for what they convey and derive significance from their meaning. In choosing conversational search, Google somewhat pre-committed to semantic search.
Hummingbird searches differently from its predecessors, which used keyword matching to find results regardless of their meaning or context. When we ask Google a question, Hummingbird first tries to figure out what we mean and what we might be looking for, rather than simply "chase the phrase". It then finds web pages whose content conceptually matches the meaning of our question, which increases the relevance of its search results.
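To make the difference concrete, here is a minimal, purely illustrative sketch; this is not Google's algorithm, and the synonym table and page texts are invented. Keyword matching looks only for the literal query terms, while the toy semantic matcher expands the query with related terms before matching.

```python
# Purely illustrative; the synonym table and pages are invented.
SYNONYMS = {
    "place": {"shop", "store", "location"},
    "dessert": {"ice", "cream", "gelato"},
}

def keyword_match(query, pages):
    # Literal term overlap: a page matches only if it shares
    # an exact word with the query.
    terms = set(query.lower().split())
    return [p for p in pages if terms & set(p.lower().split())]

def semantic_match(query, pages):
    # Expand each query term with known related terms, then match.
    terms = set(query.lower().split())
    expanded = set(terms)
    for t in terms:
        expanded |= SYNONYMS.get(t, set())
    return [p for p in pages if expanded & set(p.lower().split())]

pages = ["best ice cream shop downtown", "history of aviation"]
```

For the query "place for dessert", keyword matching finds nothing, because no page contains those literal words, while the expanded match surfaces the ice cream shop page.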
To help Hummingbird figure out meanings, Google has compiled a proprietary database called the "knowledge graph". The knowledge graph contains "entities": keywords, synonyms, and their variations based on 15 years of Google's own search history, drawn from various external sources such as the CIA World Factbook, Freebase, and Wikipedia. It maps the relationships between millions of entities. It is a dynamic tool that evolves and improves as it learns from the web, drawing on the web's "collective consciousness", so to speak.
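As a loose analogy (the knowledge graph's real structure and contents are not public), one can picture it as entities connected by named relationships, something like a dictionary of facts per entity. The entities and relations below are invented for illustration.

```python
# A toy entity graph; the entities and relations are invented
# and are not drawn from Google's actual data.
ENTITY_GRAPH = {
    "Ben and Jerry's": {"is_a": "business", "sells": "ice cream"},
    "ice cream": {"is_a": "food"},
    "Vermont": {"is_a": "US state"},
}

def facts_about(entity):
    # Return the known (relation, value) pairs for an entity
    # as readable statements.
    return [f"{entity} {rel} {val}"
            for rel, val in ENTITY_GRAPH.get(entity, {}).items()]
```

Given such a structure, a query mentioning "Ben and Jerry's" can be linked to the fact that it is a business selling ice cream, rather than treated as three unrelated words.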
Hummingbird accesses the knowledge graph to understand what our queries mean and to find conceptually and contextually matching results. If you were to ask Google a question such as "Where can I find a Ben and Jerry's?", Hummingbird would be able to figure out that "Ben and Jerry's" is the name of a business that sells ice cream. It would then speak back to you, providing you with a list of all Ben and Jerry's locations near you. This is a huge improvement over Google's earlier search, which would have shown you matches for "Ben", "Jerry", and "Ben and Jerry's".
Now, if you were to ask Google “When are they open?” Hummingbird would understand based on your previous question that “they” refers to Ben and Jerry’s. It would show you the business hours for the locations it had pulled up earlier. Pretty cool, right? Hummingbird makes search easy by making a contextual connection between your first question and the next. It is Google’s best attempt yet to design an intelligent search that flows like a conversation.
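That follow-up behavior can be sketched as a tiny stateful loop that remembers the last entity mentioned and substitutes it for a pronoun. This is a naive stand-in invented for illustration; real coreference resolution is far more involved.

```python
class Conversation:
    """Remembers the most recently mentioned entity across turns.

    A deliberately naive sketch, not Google's implementation.
    """

    def __init__(self):
        self.last_entity = None

    def ask(self, question, entity=None):
        # If the question names an entity, remember it for later turns.
        if entity is not None:
            self.last_entity = entity
        # Naively resolve "they" to the remembered entity.
        if self.last_entity and "they" in question.split():
            question = question.replace("they", self.last_entity)
        return question

convo = Conversation()
convo.ask("Where can I find a Ben and Jerry's?", entity="Ben and Jerry's")
```

After the first question, asking "When are they open?" resolves to "When are Ben and Jerry's open?", so the second query can reuse the context of the first.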
Search engines as a rule want us to spend as little time searching on them as possible. To make search easier and quicker, Google sometimes displays short answers containing just the bare facts. These answers, or "knowledge cards," are based on the information contained in Google's knowledge graph. They are displayed right at the top of the search results page, often eliminating the need to scroll down the page or click on any of the links displayed further down.
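In effect, the results page short-circuits: when a direct fact is already on hand, a card is shown instead of (or above) the usual list of links. A hedged sketch of that idea, with an invented fact table and made-up hours:

```python
# The fact table and hours below are made up for illustration.
FACTS = {("Ben and Jerry's", "hours"): "10am to 10pm"}

def search(entity, attribute, links):
    # Return a knowledge card when a direct fact is known;
    # otherwise fall back to the ordinary list of links.
    fact = FACTS.get((entity, attribute))
    if fact is not None:
        return {"card": f"{entity} {attribute}: {fact}"}
    return {"links": links}
```

A query the fact table covers yields a card; anything else falls through to the familiar list of links.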
If knowledge cards were to become the norm in the future, it might help Google improve search, but it would reduce organic traffic to websites and click-through rates (CTR) for paid search.
Could knowledge cards become the nemesis of SEO in the future? Would Google be willing to sacrifice a huge chunk of the revenue it currently derives from clicks on AdWords in order to improve search? What are your thoughts? Let's hear from you now.