Why You Should Be Optimising Your SEO for 2027 and Not 2017

Treat SEO As An Investment For Your Business
When running a business, you can’t be short-sighted and aim only for short-term gains. Your plans and strategies should be built to withstand an unforeseeable future, and that includes SEO. It isn’t something that blooms and shows results immediately, so you need to be patient.


It seems that the moment June or July hits, it becomes a metaphorical arms race to be the first person to publish an article entitled ‘optimising your SEO for 2017’ (or whatever the next year is).

There appears to be a bit of a trend every year in how early some content writers decide it is time to stop thinking about this year and start planning for the next.

It’s almost as if they aim to impress you by showing how early they start planning, as if being the earliest planner is something awe-inspiring.

I suppose in that vein, a title such as ‘plan for 2027’ could very well be much the same, only worse…

For the Foreseeable Future Our Industry Will Be Dominated by Google

The real question here becomes ‘just how much of the future can we even see?’

One of my favourite bloggers ever is Tim Urban, and his article on the AI Revolution literally struck a chord with me, ok… figuratively.

The thing is, if you were to look back at the state of the Internet and SEO in 2006, you would struggle to recognise it next to the advances that have since been made in this space.

The technological and computing power behind an update as advanced as Penguin 4.0 would have been a mere pipe dream to Google engineers 10 years ago, yet here we are. And that’s before we even speak about RankBrain and the implications that come with it.

Even if the future will eventually play out to some form of ‘balance within the search engine force’ and another player rising that can rival Google, we have to assume one thing…

Any Google Competitor Will Be Finding Ways to Optimise the Same Things: Only Better

Google built their fortune and prominence on simply being the best at delivering the content that people wanted or needed. It’s a pretty simple premise.

In exchange for being the best method for people to get the answers they seek, people have been more than willing to hand over huge stockpiles of their personal data to Google.

The trade-off seems fair to many people, insignificant to others, and there are plenty who don’t actually realise that this is what is happening.

It would seem the average person doesn’t care that much about their information being harvested and then used for profit, just as long as they are getting the most relevant and useful responses to their search queries.

The emphasis here is on relevant and useful.

If another company can find a better way to deliver the most useful and relevant content to users, and can effectively reach a wider audience, they may just surpass Google. Who can honestly know what the next 10 years hold?

One thing however can be safely assumed…

Most of the Ways Google Filters and Ranks Results Make Sense

Consider something as basic as not ranking a site with thousands of automated tier-one forum comment links, and it makes perfect sense.

Why would a user find content that has been promoted with simple automation more useful than content that people have actually read, reflected on and then decided to share?

Algorithm updates like Panda and Penguin can be dissected and analysed endlessly. People can run tests, experiments and all sorts of assessments to try and find the exact mechanics and breakdown of what has changed.

If you take a step back and look at the larger picture, though, it becomes pretty clear that the main point is this: rankings should not be awarded to someone simply because they know how to game the system better than someone else.

The quality of someone’s content has very little to do with the methodology that the content writer then used to promote it. So the algorithms are designed to try and provide an equal footing for content that may be superior, but not as aggressively optimised.

Don’t get me wrong, it’s easy to judge Google and their constant algorithm updates as being simply about moving the cup frequently enough that SEO can never remain as stable as just using AdWords, but I for one don’t think this is their primary motivation.

Moreover, if we grasp the core of Google’s mission, we understand that…

There is a Whole World of Possibilities for How Search Engines May Rank Content in the Future

I was recently discussing dwell time with a new partner as we look to give one of their underperforming sites a makeover.

Dwell time is one of those metrics that many people talk about, yet it has never been fully confirmed as an actual ranking metric.

As Joshua Hardwick points out in his recent Ahrefs post on dwell time, there is no official statement from Google on whether this is in fact a ranking signal. However, there is good reason to believe that it either is, or will be in the future.

A forward-thinking SEO will often find themselves in this situation.

At the moment, we don’t actually know if this is or isn’t a ranking signal, but we can see quite easily how it could be a good indicator of the quality of a piece of content.
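If you want to keep an eye on it yourself, here is a minimal sketch of how you might record a time-on-page proxy from the browser. The /log-dwell endpoint and the payload shape are hypothetical, and true dwell time (the time before a searcher returns to the results page) is something only the search engine can see, so treat this strictly as an approximation.

```typescript
// Minimal sketch: log a time-on-page proxy from the browser.
// The /log-dwell endpoint and payload are assumptions for illustration,
// not an official metric or API of any search engine.
const arrivedAt = Date.now();

function reportDwell(): void {
  const secondsOnPage = Math.round((Date.now() - arrivedAt) / 1000);
  // navigator.sendBeacon survives page unload, unlike a normal fetch.
  navigator.sendBeacon(
    "/log-dwell",
    JSON.stringify({ path: location.pathname, secondsOnPage })
  );
}

// Fire when the tab is hidden or the user navigates away.
document.addEventListener("visibilitychange", () => {
  if (document.visibilityState === "hidden") reportDwell();
});
```

Pages that consistently log only a few seconds before visitors leave are worth a second look, whether or not Google ever confirms the signal.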

The question becomes, should we optimise for dwell time now even though we don’t actually know if it is a signal, or should we stick to what can be confirmed?

My thoughts have always been…

Regardless of Confirmation, If We Know It Could Be Used as a Quality Signal, We Should Optimise for it

If you told an SEO ten years ago that Google would eventually introduce an update that can cross check the semantic relevance of keywords and whether they are related to the queried keyword, they may have laughed at you there and then.

The update in question is of course Hummingbird, and the way SEOs adapted to it was by using LSI keywords.

Hummingbird wasn’t designed to become just another game of cat and mouse between SEOs and the algorithm; it was designed to reward rich, relevant content with the rankings it deserves.

If you have written an article on something, and there are certain words that would generally be needed in the exploration of that topic, then it makes sense that Google would want to use the existence of these peripheral words as a way to spot check for quality.
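As a rough illustration of the idea, here is a minimal sketch that spot-checks how many topically related terms a piece of content actually covers. The term list is a hand-picked assumption for the example; this is not how Hummingbird works internally, just the same intuition expressed in code.

```typescript
// Minimal sketch: check what share of a hand-maintained list of related
// terms appears in an article. The terms below are illustrative only.
const relatedTerms = ["crankset", "derailleur", "chainring", "cassette"];

function termCoverage(article: string, terms: string[]): number {
  const text = article.toLowerCase();
  const found = terms.filter((term) => text.includes(term.toLowerCase()));
  return found.length / terms.length; // share of related terms present
}

// A thorough guide to bike gearing should score close to 1;
// a thin, keyword-stuffed page usually will not.
console.log(termCoverage("Adjust the derailleur before fitting the cassette.", relatedTerms));
```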

People who were already providing rich and useful content on a topic were rewarded for having been useful, whilst webmasters who had previously been keyword stuffing to their hearts’ content were taken down a peg or two.

In one scenario the writer was optimising for quality and human consumption, and in the other scenario the writer was optimising for how they knew the algorithm was programmed.

No Matter How the Algorithm Changes, The Signals that Indicate Authority and Trustworthiness Will Not Change

If you just remember one simple rule, you will always be fine.

That simple rule is that the algorithm is changing and adapting in an ongoing attempt to understand human beings better.

The more you think about how you can show a human being that you are trustworthy or authoritative, the more you will end up showing a search engine in 2027 that you know what you are talking about and can be trusted.

Do you need assistance with optimising your website?