What are we expecting and how can you prepare?

There is a strange paradox in the Search Engine Optimisation world. Each and every year we know Google and the other search engines are going to update their algorithms, but we have no idea how. Google keep their updates close to their chest.

That said, while it is impossible to be certain, there are certain things we can estimate with a little forward-looking research. To do this, though, we need to look at what Google have updated before and what they are most likely to update next.

General Search Updates

It is a well-known fact that Google update their Search algorithm almost daily. Every week there are five or six changes and tweaks to how Google Search finds web pages. These are usually unannounced, but they also rarely pose a threat to search rankings. Why? The answer comes down to differing SEO techniques.

Black Hat SEO is a massive problem in the industry. Black Hat practitioners use tricks that help a website achieve rankings in the short term, only for it to be massively penalised later. In 2015 alone I have had to undo the work of Black Hat SEO numerous times, and on several occasions I have even had to expose it to clients.

The only way around this is to be completely transparent when it comes to SEO, and the best approach is actually to forget that most of the algorithms exist. This means content needs to be good. It needs to be very good. The trick is to keep asking one question: is this the best I can make it?

Why do minor search updates rarely change the SERP (Search Engine Results Page) rankings all that much? Because the pages that are ranking tend to have great content already, and they do not employ any Black Hat techniques. Let's take an example:

Google Minor Search Updates

Take "Bonobo Monkey" as an example search term. Very little in the results could be considered thin content, and nothing has been written without information at its heart. Each piece of content sets out to educate the audience about what the bonobo is and how it behaves.

This makes the odds of a minor update messing with the search rankings for the bonobo incredibly low. So much so that Borel's Law might even consider it impossible. The quality of each article is simply too good.

As a law of SEO: good content means good results. Bad content means bad results. Good SEO means good results. Bad SEO means bad results. Black Hat SEO means good results followed by horrifically bad results.

Major Search Updates

Now comes the interesting part of this article (no offence to the bonobo monkey): What do we predict the major updates will be next year?

To make an accurate prediction we need to look at current trends in SEO and the Internet as a whole, as well as past progression and what has been updated before.

According to the SEO masterminds at Moz, Google triggered five major updates this year, most of which we reported on in Underhood: an unnamed and unconfirmed update in February, Mobilegeddon in April, the Quality Update in May, Panda 4.2 in July, and RankBrain in October.

The most prominent of these was probably RankBrain. It didn't directly affect Search, so why does it matter? The answer is indirect: RankBrain could be one of the most important things Google have developed in a long time.

"RankBrain uses artificial intelligence to embed vast amounts of written language into mathematical entities — called vectors — that the computer can understand. If RankBrain sees a word or phrase it isn’t familiar with, the machine can make a guess as to what words or phrases might have a similar meaning and filter the result accordingly, making it more effective at handling never-before-seen search queries."

RankBrain is a form of AI (a very rudimentary form) used to interpret search queries in a new way. The website Bloomberg has an interview with Googler Greg Corrado, one which explores RankBrain in some detail; the quote above, also picked out by Search Engine Land, comes from that coverage.
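To make the "vectors" idea in that quote a little more concrete, here is a toy sketch in Python. The tiny hand-made word vectors and the example words below are purely illustrative assumptions, not anything Google uses; real systems learn embeddings with hundreds of dimensions from vast text corpora. The principle, though, is the same: an unfamiliar word or phrase can be matched to the known word whose vector points in the most similar direction.

```python
import math

# Hypothetical, hand-made word vectors for illustration only.
# Similar meanings point in similar directions.
EMBEDDINGS = {
    "bonobo": [0.9, 0.8, 0.1],
    "chimpanzee": [0.85, 0.75, 0.15],
    "monkey": [0.7, 0.9, 0.2],
    "laptop": [0.1, 0.05, 0.95],
}

def cosine_similarity(a, b):
    """Standard cosine similarity: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def closest_known_word(unseen_vector):
    """Guess which known word an unfamiliar query is most similar to."""
    return max(EMBEDDINGS, key=lambda w: cosine_similarity(EMBEDDINGS[w], unseen_vector))

# A never-before-seen query whose vector lands near the primate cluster
# gets interpreted as primate-related rather than technology-related.
print(closest_known_word([0.88, 0.78, 0.12]))
```

The point of the sketch is the filtering step the quote describes: even a query the system has never seen before lands somewhere in the vector space, and its nearest neighbours tell the engine roughly what it means.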

What this means is that Google are handing over a large percentage of their Search to an AI entity, and this raises some interesting questions. Hummingbird, for instance, is a semantic search algorithm; does this mean it will be made redundant? Panda looks at content quality; is that now lying in the gutter too? In my estimation: no and no.

RankBrain appears to be a new way of delivering some aspects of semantic search, which Hummingbird is known for, and of sensing the context surrounding a topic. Hummingbird will still form the basis on which RankBrain works. RankBrain is not a simple replacement but an add-on, like DLC or an expansion pack.

It is always important to remember that a new addition does not necessarily eradicate the old; it simply builds on it and improves it. There is a reason RankBrain has not simply been rebranded as a Hummingbird update.

This means that we can almost certainly expect a RankBrain update and a Panda update at some point in the new year. We may even see a Hummingbird one. How these will change the SERP is unknown.

How can a website be prepared? Well, much as 2015 was the year of mobile and mobile optimisation, I am predicting 2016 will be the year of context. Keep everything relevant. Develop content plans and structures, and balance out blogs. Keywords will become less important over time, but structure will remain a must: not only the structure of the physical on-page content but the Information Architecture as well.

Think through your content before you post. If it is relevant to your audience then go ahead and send it live. If not then maybe hold back on the post button for now.

The other thing we may see this year is an update to Pigeon, Google's local Search algorithm. Over the past year we have seen Pigeon get better and better at finding local results. It does, however, still have some holes, and it is easy to see these being patched in the new year. Google have already made plenty of minor amends, but I can't shake the feeling that a major update is imminent. If you have not already, you will want to ensure your Schema markup and structured data are in place and validated in Webmaster Tools, so that Pigeon can find your site.
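For a site that wants to be found by local Search, the Schema markup mentioned above usually takes the form of a JSON-LD block in the page's `<head>`. The sketch below is a minimal, hypothetical example using schema.org's LocalBusiness type; every name, address, and URL in it is a placeholder to be swapped for your own details.

```html
<!-- Hypothetical example: JSON-LD LocalBusiness markup for local Search.
     All values below are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Agency",
  "url": "https://www.example.com",
  "telephone": "+44 121 000 0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Birmingham",
    "addressCountry": "GB"
  }
}
</script>
```

Once a block like this is live, the structured data testing tools in Google's webmaster suite will confirm whether it is being read correctly.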

Are Metatags Dead?

Earlier this year I was asked whether Metatags were dead. The answer is not a simple yes or no; it is more of a grey area. Are Metatags dead? Kind of. Maybe.

Metatags are the data placed behind articles and posts that emphasise the subject matter. In the old days, Metatags were used to emphasise keywords; however, as mentioned before, keywords are falling out of favour, and Metatags are falling out of use with them.

What I would suggest is not to stop using them yet (although many SEOs would disagree with me), for the very reason that they are useful for your own onsite search. There is no doubt that your website's search is worse than Google's. Don't take offence at that: Google have the best Search algorithm in the world. The majority of Search bars use Metatags as a marker for what a page is about, especially on older sites, and Metatags can improve product searches on ecommerce sites in particular.
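To make concrete what is being discussed, here is a sketch of typical Metatags in a page's `<head>`. The page title and content values are invented for illustration; the general pattern, though, is what most onsite search tools read.

```html
<!-- Illustrative only: common metatags in a page's <head>. -->
<head>
  <title>Bonobo Facts and Behaviour</title>
  <!-- Still used by Google to build the SERP snippet -->
  <meta name="description" content="An introduction to the bonobo: habitat, diet and social behaviour.">
  <!-- Largely ignored by Google, but many onsite search bars still read it -->
  <meta name="keywords" content="bonobo, primate, great ape">
</head>
```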

So, no. Don't stop using Metatags yet. 

Facebook Search Updates

"What? Facebook? Search?"

Yes. Facebook Search. It is a thing and it is getting better all of the time.

Facebook have always had a Search bar; however, only recently have they started taking their place at the top of the Alexa rankings seriously. Facebook is among the most visited websites in the world, rivalling even Google, and so they have recently put a lot of development time into their Search facility.

A few years ago Facebook was considered only a Social Networking tool and, because of that, the Search function only needed to cover finding people. Then, when Groups were introduced, it needed to find those as well. Slowly but surely it has developed a wider remit. Now it can search through the entire history of social posts on the site, and it is getting better at doing so all the time.

Now, don't get me wrong: as a Search and Content specialist I am not losing sleep over Facebook Search just yet. The system still appears to have flaws in its accuracy. Searching "CAB Studios", for instance, does not bring up CAB Studios itself but a few posts CAB was mentioned in back in 2014. Granted, that is a long way off the mark, but it still found something, and rumour has it Facebook are putting more time and effort into their Search function every day.

One important thing to note, though, is that Facebook only crawl their own site for Search results; they will not find external websites. So what does this mean in terms of Search and sites? It means Facebook should be integrated into sites to allow sharing of appropriate pages. Although Facebook won't go out and read a website, they will read posts that include links to it. Content therefore needs to be shareable across channels: get it onto Facebook and then it can be found by Facebook Search. As the old Napoleon Hill saying goes: "You must get involved to have impact. No one is improved by the won-lost record of the referee."
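Making pages share well is mostly a matter of Open Graph tags, the markup Facebook reads when a link to your page is posted. The sketch below is a hypothetical example; every title, URL, and image path is a placeholder for your own page's details.

```html
<!-- Hypothetical example: Open Graph tags so a shared link displays
     properly on Facebook. All values are placeholders. -->
<meta property="og:title" content="What To Expect From Search In 2016">
<meta property="og:type" content="article">
<meta property="og:url" content="https://www.example.com/search-in-2016">
<meta property="og:image" content="https://www.example.com/images/search-2016.jpg">
<meta property="og:description" content="Our predictions for next year's search updates.">
```

With tags like these in place, the post that carries your link has a proper title, image, and description, which is exactly the material Facebook Search has to work with.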


2016 promises to be an interesting year with all kinds of opportunities ahead.

Always remember though, you need to remain true to SEO law. Don't be tempted to go Black Hat for even a moment as it will hurt in the long run. Stay honest and true and the rest will take care of itself.

For more information on SEO and how you can improve your website please click on the button below.

Start Your Journey