As the world’s most popular search engine celebrated its 15th anniversary in September 2013, it revealed a new search algorithm named Hummingbird. According to Google’s head of search, Amit Singhal, Hummingbird represents the most dramatic change to Google search in over a decade. Google has been reluctant to disclose specifics of the changes, but provided the following statement:
Hummingbird pays more attention to each word in the query, ensuring the whole query is taken into account – so if a resulting page is a bit less strong in general, but it’s the most relevant to your search terms, that’s the result you’ll get … And if there are plenty of relevant matches to your search terms, Hummingbird does a better job picking the strongest page for you.
Understanding search algorithms
With 634 million websites (as of December 2012), any search engine has its work cut out. How does it decide on the most relevant pages from the endless possibilities for any particular search query? Each search engine has its own method, combining various ranking techniques; it is this complex method which is known as an algorithm. Google – which continues to dominate the search engine market, attracting 70 per cent of worldwide searches (during September 2013) – is constantly tinkering with its algorithm and occasionally launches minor and major updates, which are tracked by webmasters and search engine optimisation (SEO) professionals. Although the company does occasionally publicise new algorithm updates and explain their purpose, it rarely reveals much about exactly how the algorithm works.
At this point it’s important to make the distinction between paid, direct, referral and organic traffic to your website. Paid traffic is made up of visitors who arrive at your website due to paid-for adverts (such as Google Adwords). Direct traffic is, as its name suggests, made up of visitors who go directly to your site, possibly by typing the URL into the browser’s address bar or clicking a bookmark. Referrals are visits which occur when someone clicks on a link from another website to your site. Organic traffic is generated by web users typing in their query into a search engine and clicking on one of the results which leads them to your site; this is where algorithms come into play.
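To make the distinction concrete, here is a minimal sketch of how an analytics tool might bucket an individual visit into these four categories. The function name, the search-engine list and the `has_paid_tag` flag (standing in for a paid-ad tracking parameter such as those AdWords appends) are all illustrative assumptions, not any real analytics API:

```python
from urllib.parse import urlparse

# Illustrative list only; a real tool would recognise many more engines.
SEARCH_ENGINES = {"google.com", "www.google.com", "bing.com", "www.bing.com"}

def classify_visit(referrer, has_paid_tag=False):
    """Bucket a visit as paid, direct, organic or referral.

    `referrer` is the HTTP Referer URL (or None if absent);
    `has_paid_tag` marks visits arriving via a paid-ad tracking parameter.
    """
    if has_paid_tag:
        return "paid"
    if not referrer:
        return "direct"  # typed URL or bookmark: no referrer is sent
    host = urlparse(referrer).netloc.lower()
    if host in SEARCH_ENGINES:
        return "organic"
    return "referral"

print(classify_visit(None))                                    # direct
print(classify_visit("https://www.google.com/search?q=seo"))   # organic
print(classify_visit("https://example.com/blog"))              # referral
```

Real-world classification is messier (secure search strips referrers, for instance), but the basic bucketing logic is along these lines.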
So, depending on the proportion of visits which you receive from the various sources (organic, paid, direct and referral), organic traffic could be crucial to your business. Understanding some of the basics of search algorithms is important if you want your website to show up in the search results and consequently capture some of this organic traffic.
Trying to find the exact formula for Google’s search algorithm is next to impossible. However, the company does release some guidelines on writing website content and there is a powerful community of search engine experts who are constantly using trial and error techniques to find patterns which may explain some of the hidden elements of the algorithm.
Here are a few of the basics:
Content is king
The key focus comes down to page content. Ultimately, the search results should be relevant to the query. Although keywords are important, Google is very sophisticated and can tell if a website is over-using keywords in an effort to capture more traffic (a practice known as keyword stuffing); it penalises websites which are obviously trying to manipulate results via content.
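One crude signal a search engine could use to spot keyword stuffing is keyword density – the fraction of words on a page that are the same keyword. The sketch below is a toy illustration of that idea, not Google’s actual method (which weighs many signals):

```python
import re

def keyword_density(text, keyword):
    """Fraction of words in `text` matching `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

stuffed = "cheap shoes cheap shoes buy cheap shoes online cheap shoes"
natural = "we sell a wide range of comfortable footwear at fair prices"

print(round(keyword_density(stuffed, "cheap"), 2))   # 0.4 - suspicious
print(round(keyword_density(natural, "cheap"), 2))   # 0.0 - natural prose
```

A page where a single keyword accounts for 40 per cent of the text reads unnaturally to humans and machines alike, which is exactly what stuffing-detection aims to catch.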
Reputation and links
Inbound links from reputable websites can help to establish a website’s authority and improve its rankings. However, the opposite effect can result from links on poor-quality sites, such as the now defunct “link farms” whose sole purpose was to provide links, usually for a fee. Social media activity now also contributes to a site’s reputation score.
User experience and site performance
A website may have decent content, but if it’s slow or difficult to navigate, users will simply go elsewhere, increasing the bounce rate. On average, people will wait only about three seconds for a web page to load before abandoning their attempt to enter the site. Websites which consistently suffer from poor performance or a high bounce rate will be marked down.
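Bounce rate itself is a simple metric: the share of sessions that viewed exactly one page before leaving. A minimal sketch, assuming sessions are represented as page-view counts (the function name is my own, not any analytics package’s):

```python
def bounce_rate(sessions):
    """Share of sessions that viewed exactly one page before leaving.

    `sessions` is a list of page-view counts, one entry per session.
    """
    if not sessions:
        return 0.0
    return sum(1 for pages in sessions if pages == 1) / len(sessions)

# Six sessions, three of which bounced after a single page view:
print(bounce_rate([1, 3, 1, 5, 1, 2]))  # 0.5
```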
When it launched in 1997, the main distinguishing feature of Google search (compared with other search engines of its time, such as AltaVista and Lycos) was the PageRank algorithm, which calculated a page’s reputation according to its inbound links. This unique function was perhaps responsible for Google’s initial success and surge in popularity. Since then, there have been countless changes.
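The core idea of PageRank – a page is important if important pages link to it – can be captured in a few lines of iterative code. This is a textbook simplification of the published algorithm, not Google’s production implementation:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute PageRank scores for a small link graph.

    `links` maps each page to the list of pages it links to.
    `damping` models the chance a surfer follows a link rather
    than jumping to a random page.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal scores
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                     # dangling page: spread evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:                                # share rank across outlinks
                for target in outlinks:
                    new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
# page "c", with two inbound links, ends up with the highest score
```

Note how the scores depend only on the link structure, not the page content – which is precisely why link farms later became an effective (and, for Google, troublesome) manipulation technique.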
It’s estimated that Google updates its algorithms 500–600 times a year, with the vast majority of changes never announced to the public. Perhaps the best-known algorithm updates have been Panda and Penguin (with several versions of each).
First released in February 2011, Panda aimed to demote the ranking of “low quality” sites (rated on many factors) such as those with badly written content or excessive adverts.
Penguin launched in April 2012 and aimed to penalise sites which engage in link schemes as a way of artificially increasing their rank. Although Penguin is now over a year old, its effects are still being felt, and some webmasters are still trying to remove all kinds of inbound links to avoid the possibility of being penalised (although this could actually end up having a detrimental effect on rankings).
Why do they keep changing things?
Ultimately Google needs to perform search well. This is its core product and, if users can’t find what they are looking for by Googling it, the company will fail. The top search results for any particular phrase need to be accurate and provide relevant quality web pages.
Google vs SEO
Search engines have always faced the problem of unfair manipulation. Examples include keyword stuffing (where keywords are repeated many times within a web page to improve its content score) and link farms (websites which will link to any site that pays them, in order to boost its reputation score). Over the years, many rogue SEO companies have employed these and other methods in an effort to improve rankings for their clients, and Google has fought back by changing its algorithm to recognise these techniques and by blacklisting offending sites.
It’s only business
Google ultimately exists to make profits for its shareholders. Most of its revenue derives from AdWords (which determines the adverts shown at the top, bottom and right-hand side of search results). So there is a business incentive in making it difficult for companies to rely solely on the organic search results. Shaking up the rankings by changing its algorithm will undoubtedly force many businesses which rely on internet customers to pay for AdWords listings when they lose their organic rank.
What does Hummingbird do?
Hummingbird was rolled out at some point in August 2013, though it was only announced a month later.
Google has revealed little about exactly what Hummingbird has changed. One of the primary known features is its shift towards semantic search and away from pure keyword matching. If someone asks Google a question, Hummingbird is designed to determine the intent of the question rather than just the keywords. For example, in the search query “where can I buy a Samsung Galaxy phone”, rather than listing the best results relating to this brand of phone, it will take into account the “where can I buy” part of the question before determining which results to show first.
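The difference between keyword and intent matching can be illustrated with a toy query parser. The patterns and intent labels below are entirely hypothetical – Google’s actual semantic analysis is far more sophisticated – but they show the basic idea of separating the “what the user wants to do” part of a query from its topic:

```python
import re

# Hypothetical intent patterns: each maps a query shape to an intent label.
INTENT_PATTERNS = [
    (re.compile(r"^where can i buy (?P<topic>.+)", re.I), "purchase_location"),
    (re.compile(r"^how do i (?P<topic>.+)", re.I), "how_to"),
    (re.compile(r"^what is (?P<topic>.+)", re.I), "definition"),
]

def parse_query(query):
    """Split a natural-language query into an intent and a topic."""
    for pattern, intent in INTENT_PATTERNS:
        match = pattern.match(query.strip())
        if match:
            return intent, match.group("topic")
    return "keyword", query.strip()   # fall back to plain keyword search

print(parse_query("where can I buy a Samsung Galaxy phone"))
# ('purchase_location', 'a Samsung Galaxy phone')
```

A purely keyword-based engine would treat “where”, “can” and “buy” as noise words; an intent-aware one can use them to prefer, say, retailer listings over product reviews.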
A reason for the shift towards semantic search is that people are increasingly asking Google questions in natural language – particularly with the rise of voice search on smartphones. Conversational search – where you can follow up a question with further related questions – is another feature recently rolled out by Google (as part of its Chrome browser) which gets us closer than ever to a form of artificial intelligence. This AI aspect is what perhaps really defines Hummingbird, moving us to the next stage of information retrieval.
The challenge for SEO
The methods of improving organic website ranking are constantly changing to adapt to the latest algorithm updates. Whereas once having a substantial number of inbound links targeted on relevant keywords was enough to ensure a decent rank and bring visitors to your website, you now have to make sure inbound links come from “quality” sites and keywords are not repeated too many times. Social media have also become more important.
There are still too many questions surrounding the way Hummingbird actually works to provide firm guidance on how SEO techniques should be adapted. Webmasters still need to take account of Panda and Penguin, but Hummingbird’s focus on semantic search means that the questions behind keywords – the how, what, why, where and when – are more important than ever. So ensuring that your web pages provide answers to specific questions could be the way forward. Alternatively, perhaps all this chopping and changing is just another way for Google to persuade you to pay for AdWords!
Alex Heshmaty is a legal technology specialist, web marketing consultant and freelance writer. He runs Legal Techie, a niche company based in Bristol, providing copywriting, SEO and web marketing services for the legal publishing and technology sectors.