Google Updates Status 2016

Google’s search algorithms are constantly being updated, and Moz has published a good summary of these changes.

There are often small refinements to the way that Google handles searches and decides how to rank the results. Major updates have changed the rankings significantly, affecting up to 3–4% of searches. Here are some of Google’s recent major updates:

Panda

The purpose was to remove poor-quality web sites from the listings, along with false or duplicated content created by people trying to cheat the system (so-called ‘spam’). Google is continually refining these algorithms to drive spam levels even lower.

In the past, large numbers of writers would work in a ‘content farm’ to produce large amounts of written material. Pieces from these content farms may contain duplicated paragraphs, which Panda would spot. It is therefore important to avoid accusations of duplicated content, so check your web diagnostics for this.

We have noticed that duplicated pages can be reported if the web addresses are not set up properly, so-called ‘canonicalization’ problems. For instance, if http://www.mysite.com appears to be different from http://mysite.com, the pages will appear to be duplicated content, but this issue can easily be fixed.
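
As a quick check for this, a short script can fetch both versions of a page and compare any rel="canonical" tags they declare. This is a minimal sketch in Python: the URLs are the hypothetical ones from the example above, and the simple pattern match assumes the rel attribute appears before href inside the tag.

    import re
    import urllib.request

    def canonical_of(url):
        # Fetch the page and pull out the href of its rel="canonical" tag, if any.
        html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
        match = re.search(r'<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']+)', html)
        return match.group(1) if match else None

    # Both hostnames should declare the same canonical URL; if they differ
    # (or one is missing), you may be reporting duplicated content.
    for url in ("http://www.mysite.com", "http://mysite.com"):
        print(url, "->", canonical_of(url))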

Google’s Panda update programme first rolled out in early 2011 and was continually refined and developed. The last significant change was Panda 4.2 in July 2015. Google has now announced that Panda is part of the core algorithm.

Penguin

While Panda evaluates content, Penguin looks at links. Google uses inbound links to gauge the popularity of web sites. Links from ‘important’ sites, such as government bodies, national institutions or other highly rated sites, will make your content appear more valuable. Conversely, Penguin penalises sites with poor-quality or unnatural links. In the past, people used automated systems such as ‘link farms’ and directories to generate many backlinks of dubious quality, and these will be penalised.

There are tools to help you analyse your back-link profile carefully. Make sure that you do not have a large number of sites linking with an exact match for your key target phrase, and that you do not have one site generating lots of links to yours. A competitor may even have added bad links to your site (so-called ‘negative SEO’), which makes it doubly important to validate your inbound links. You can use the ‘disavow links’ tool in Google Webmaster Tools; alternatively, you can contact the link source to have poor links removed, or, if a whole linking domain is poor, disavow the entire domain, as sketched below.

Penguin version 1.0 was released in April 2012, and version 3.0 took effect in October 2014. Penguin now receives continuous updates.
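
As an illustration of that kind of audit, here is a minimal sketch. It assumes a hypothetical backlinks.csv export with source_url and anchor_text columns and a hypothetical threshold; the disavow file format itself (one entry per line, ‘domain:’ for whole domains, ‘#’ for comments) is the one Google’s tool accepts.

    import csv
    from collections import Counter
    from urllib.parse import urlparse

    TARGET_PHRASE = "my key phrase"    # hypothetical exact-match target phrase
    MAX_LINKS_PER_DOMAIN = 50          # hypothetical threshold

    domains = Counter()
    exact_match_anchors = 0
    with open("backlinks.csv", newline="") as f:
        for row in csv.DictReader(f):
            domains[urlparse(row["source_url"]).netloc] += 1
            if row["anchor_text"].strip().lower() == TARGET_PHRASE:
                exact_match_anchors += 1

    print(f"{exact_match_anchors} links use the exact target phrase as anchor text")

    # Write domains that exceed the threshold in the format the disavow
    # tool accepts: one entry per line, 'domain:' for whole domains,
    # lines starting with '#' as comments.
    with open("disavow.txt", "w") as out:
        out.write("# Candidate domains for review - generated automatically\n")
        for domain, count in domains.items():
            if count > MAX_LINKS_PER_DOMAIN:
                out.write(f"domain:{domain}\n")

The threshold and column names here are assumptions; in practice, review each flagged domain by hand before disavowing it.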

Hummingbird

Google announced this update in late September 2013. Hummingbird was reportedly the biggest change to Google’s search algorithm in twelve years, and it incorporated Panda and Penguin. It will have a long-term impact on web-based searches.

Hummingbird pays attention to the meaning and context of each search. Rather than just relying on a list of keywords, it tries to respond to the intent of the query, much as we would in person-to-person conversation.

The implication is better interpretation of longer questions, which may include more obscure key phrases. These are known as ‘long-tail’ key phrases, as statistically they are searched for less frequently; collectively, however, they can bring a lot of traffic. Webpronews says: “The goal is that pages matching the meaning do better, rather than pages matching just a few words”.

What is Google up to?

Google’s declared focus is on returning high-quality, relevant content in response to searches, and on an enhanced user experience. Its ongoing strategy is to rely less on keywords. Google is restricting statistical information about key phrases in Google Analytics, so it is more difficult to work out which keywords are most productive. All this means that new SEO methods are required: we will rely more on quality and content, and less on processes that promote keywords.

RankBrain

RankBrain is now used for ambiguous or unique searches that have not been submitted previously. According to Google, brand-new queries make up 15% of all searches each day; as Search Engine Land pointed out, Google processes 3bn searches per day, which means that 450m queries per day are unique.

RankBrain uses machine learning to cope with this sheer volume. Google’s search engineers were asked to look at various pages and predict which would be ranked at the top of the Google results. The humans guessed correctly 70% of the time; RankBrain guessed correctly 80% of the time.
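
As a toy illustration of that kind of prediction task (emphatically not Google’s actual system, whose internals are unpublished), the sketch below trains a basic classifier on made-up page features to guess which pages will rank at the top.

    from sklearn.linear_model import LogisticRegression

    # Made-up features per page: [content depth, inbound-link quality]
    pages = [[0.9, 0.8], [0.2, 0.1], [0.7, 0.9], [0.3, 0.4]]
    ranked_top = [1, 0, 1, 0]  # 1 = the page ranked at the top, 0 = it did not

    model = LogisticRegression().fit(pages, ranked_top)
    print(model.predict([[0.8, 0.7]]))  # prediction for an unseen page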

Webpronews says it is clear that keywords are becoming less and less important to search-engine ranking success, as Google gets smarter at working out what things mean on both the query side and the webpage side.

A strategy to optimise performance with regard to Google’s updates

  • Concentrate on high-quality, high-depth content which will have lots of related ‘long-tail’ phrases.
  • Don’t base your strategy on a limited set of key phrases; this approach will become progressively less effective.
  • Keep your XML site map up to date (see the sketch after this list), and make sure that Google (and other search engines) can find your content.
  • Also, have a clear site structure with a strong home page, with static links to your valuable pages. Ideally, these should be within 2 clicks of your home page.
  • Promote your content so that valuable and relevant web sites link back to yours. Audit your backlinks, and remove or ‘disavow’ any poor or negative links.
  • Promote your site on social media, such as Google+ (which pays dividends with Google), Facebook, Twitter and LinkedIn. This generates good backlinks and also gives an alternative platform for searches and user interaction.
  • Also consider visual media on YouTube, Pinterest and Instagram.
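
For the site-map point above, here is a minimal sketch that writes a basic sitemap in the standard sitemaps.org XML format that Google reads; the URLs and dates are hypothetical.

    from xml.etree.ElementTree import Element, SubElement, ElementTree

    pages = [  # hypothetical pages and last-modified dates
        ("http://www.mysite.com/", "2016-01-15"),
        ("http://www.mysite.com/services", "2016-01-10"),
    ]

    urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in pages:
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = loc
        SubElement(url, "lastmod").text = lastmod

    ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)

Submit the resulting file through Google Webmaster Tools so that new and updated pages are picked up promptly.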

In conclusion, here is a quote from Searchmetrics.com: “High quality, long-form content pieces that cover a topic in-depth are the winners in many areas. But the sheer amount of content is not decisive for rankings, rather the question of whether the content is relevant and fulfils the user intention.”