
Past, Present and Future of SEO and Google Search

What tips can we take from the early days of Google, what are the keys to SEO success today, and how is the future of search going to change?

Past – the history of SEO and the early days of mainstream search

Search engines actually existed years before Google was even a thought in the minds of founders Larry Page and Sergey Brin.

However, they were intrinsically very simple, and usually ranked websites by how many times the search term appeared on any given page. If your page featured the search term the most times, it was likely to be number one or thereabouts.

Whilst this seems simple to the point of inviting chaos (with websites simply trying to out-do each other on keyword density), it’s worth remembering that e-commerce was still in its infancy during the early-to-mid 90s, so the competition to grab the top spot for key search terms was nowhere near as ferocious as it is today. For example, Amazon.com, the world’s largest online retailer, was only launched in 1994.

In short, the industry of SEO as we know it today was yet to be born. Search engines like Excite and Lycos were around, but no one had yet realised just how much of an important role search engines would play in the future of internet wealth and fortune.

Back to Google

Stanford University PhD students Larry and Sergey, upon assessing the clear flaws in the models of early search engines, started to think about an improved system which incorporated other factors to create a better search experience. After all, good search engine protocol was all about users being shown the most relevant sites for their given search terms, but with existing rankings being far too easy to manipulate, the Google founders tried to devise a system which was more effective in its prime objective and also less easy to abuse.

The result was less of a focus on page content and more of a focus on the relationship between websites on the internet. Firstly, Larry and Sergey theorised that the most useful webpages would be more discussed, shared and therefore “linked to” on other sites.

Their system could then analyse the number of links to assess the webpage’s usefulness (they theorised that the more links there are to it, the more useful it must be), and also analyse the content of the websites which have linked to the webpage to assess the relevancy and importance of those sites.

So it wasn’t just about the quantity of links, but also the quality of those links. A link was scored more highly if it came from a website which in turn had plenty of links pointing to it and relevant content of its own.

This became known as PageRank, and was a key technology in the early stages of the Google search engine. It is still widely considered to carry some influence today, with many SEO services being based on generating “backlinks”. However, with many other factors now also included in the calculation of a search result, the PageRank score alone rarely correlates with search result positions (the highest PR score doesn’t always mean first position in a search result).
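To make the idea concrete, here’s a heavily simplified sketch of the kind of calculation PageRank performs, written in Python. The link graph, damping factor and iteration count are invented purely for the illustration; Google’s real implementation is vastly more sophisticated.

```python
# A heavily simplified, illustrative PageRank calculation (not Google's actual code).
# Each page shares its score out across the pages it links to, so a link from a
# well-linked-to page passes on more value than a link from an obscure one.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    rank = {page: 1.0 / len(pages) for page in pages}  # start with equal scores

    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / len(pages) for page in pages}
        for page, outgoing in links.items():
            if not outgoing:
                continue  # ignore pages with no outgoing links in this simple sketch
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share  # pass a share of this page's score along
        rank = new_rank
    return rank

# Hypothetical three-page web: A and C both link to B, so B ends up with the highest score.
print(pagerank({"A": ["B"], "B": ["A"], "C": ["B"]}))
```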

Getting a backlink to your website from a low-quality directory won’t help your PageRank much, but having a PR agency get a great article published on a relevant or high-authority site will.

PageRank – what this meant for SEO

The launch of Google and its PageRank technology coincided quite nicely with the start of the internet boom, also known as the “dot-com bubble”. The meteoric rise in internet businesses meant that budding entrepreneurs could make thousands, whilst an explosion in internet sector stock market trading meant that many shareholders of internet start-ups became millionaires overnight.

By this point, many online entrepreneurs had started to realise the importance of search engines, so the race was on to clamber to the top of the rankings - often by any means necessary. This is generally considered to be around the time when SEO as an industry was born, with the first documented use of the term “search engine optimisation” dating back to August 1997.

So what was SEO like back in the early days?

Primitive - although many of the key factors are still used today. Content was key to showing on-page relevance, with meta and title tags included, whilst building backlinks from authoritative sites was also important. Alongside this, there was a strong emphasis on “keyword density” (i.e. the number of occurrences of a search term relative to the total number of words on a page). Whilst modern SEO still analyses the use of a keyword within the content of a page, it’s now far more complex than merely shooting for the highest possible keyword density.
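As a rough illustration of what that old measure looked like, here’s a minimal Python sketch of a keyword density calculation. The page copy and keyword are made up for the example, and it’s shown only to explain the metric, not as a recommended practice.

```python
# A simple illustration of the old "keyword density" idea: occurrences of a
# search term divided by the total number of words on the page.

def keyword_density(page_text, keyword):
    words = page_text.lower().split()
    occurrences = words.count(keyword.lower())
    return occurrences / len(words) if words else 0.0

# Hypothetical snippet of page copy.
copy = "Cheap pizza delivery. Order pizza online for fast pizza delivery."
print(f"{keyword_density(copy, 'pizza'):.0%}")  # 30% for this ten-word snippet
```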

The simple algorithms used by Google and other search engines also allowed more abusive forms of search engine manipulation to become more common. This became known as “black hat SEO”. Such techniques relied on tricking the automated search engine bots which land on websites and analyse their content. Whilst even modern SEO involves an element of manipulation to some degree, early black hat techniques were far more brazen. A simple bot would be fooled and give the webpage strong rankings, but a more in-depth analysis by Google would almost certainly result in the page being penalised in the search engine ranking results eventually.

Present – modern-day best SEO practices

It took a surprisingly long time for major search engines like Google to really start clamping down on simple manipulation techniques and black hat SEO. In fact, many techniques were successfully used for over a decade, and some techniques can still work to some degree today (although they’re far more likely to be caught sooner rather than later).

However, you can’t blame Google. Without human judgement, search engine providers need to write algorithms - mere calculations - to root out bad SEO practices whilst making sure that the most relevant and authoritative pages are correctly ranked without being accidentally penalised. It’s a tall order!

Now, though, search engine algorithms, especially from the likes of Google, are more sophisticated than ever. However, it’s interesting to note that many of the key factors of on-site and off-site SEO share striking similarities to those used in the early days of search engines in the 90s. They’re simply assessed in a far more intelligent way. There are also a few brand new factors which have only come to fruition in the past few years.

What are the main features and best practices of modern SEO?

1). On-site content – making sure your website itself is optimised

Title tags, header tags and relevant content are still key factors in on-site SEO. After all, search engines are primarily about matching visitors who want certain content with the right types of content. Relevancy. That’s why Google still pays a lot of attention to your website’s content.

However, it now goes far beyond the simple days of keyword density. Google and other search engines have advanced algorithms which can assess the raw quality of your content and how useful a visitor is likely to find it. There are literally hundreds of subtle cues which search engines now use to build up an overall ranking score for your site. Not only that, but Google also analyses factors such as domain age, page loading times, functionality, usability, URL relevancy and more.

The solution? Make sure the basics are in place (keywords in the title tag and URLs, keywords throughout the copy etc), but also give Google exactly what they want: informative, interesting, unique and valuable content which visitors who search for specific search terms will want to see. That in itself goes a long way.
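As a rough, hypothetical illustration of checking those basics, the Python sketch below parses a page with the standard library’s html.parser and looks for a target keyword in the title tag, the main header and the copy. The example HTML and keyword are invented, and a real on-page audit looks at far more than this.

```python
# A rough sketch of checking the on-page basics mentioned above: the title tag,
# the main header tag and keyword use in the copy. Standard library only; the
# example HTML and target keyword are hypothetical.
from html.parser import HTMLParser

class OnPageChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.current_tag = None
        self.title = ""
        self.h1 = ""
        self.body_text = []

    def handle_starttag(self, tag, attrs):
        self.current_tag = tag

    def handle_endtag(self, tag):
        self.current_tag = None

    def handle_data(self, data):
        if self.current_tag == "title":
            self.title += data
        elif self.current_tag == "h1":
            self.h1 += data
        else:
            self.body_text.append(data)

page = """<html><head><title>Affordable SEO Services in Barnsley</title></head>
<body><h1>Affordable SEO</h1><p>We provide affordable SEO for local businesses.</p></body></html>"""

keyword = "affordable seo"
checker = OnPageChecker()
checker.feed(page)

print("Keyword in title:", keyword in checker.title.lower())
print("Keyword in h1:   ", keyword in checker.h1.lower())
print("Keyword in copy: ", keyword in " ".join(checker.body_text).lower())
```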

 

2). Links – creating authoritative backlinks throughout the web

The simple PageRank score is now fairly obsolete in the grand scheme of things (a score of 10 out of 10 doesn’t mean that a website is going to beat every other site to the top spot). However, backlinks are still an integral part of assessing a webpage’s importance and relevancy.

A key difference compared to the early days is how these backlinks are generated. Years back, people used to spend weeks submitting their website to high-PR directories and any other sites where live URLs could be placed. Website owners would take anything they could. Did this work? Certainly. Does it work today? Yes, but only up to a point.

Many SEO experts have stopped trying to manipulate backlink building in such a direct way and have instead worked with Google to build natural backlinks through the use of content. This involves creating valuable and informative content which is more likely to be shared on other websites and social networking platforms. If you create a killer article which becomes so popular that it gets shared thousands of times, you’ve suddenly generated thousands of backlinks for your website as well. This long-term approach to backlink building now tends to be the most common.

3). Social – factoring in Facebook and Twitter for SEO

Social networking is now intertwined with almost every website we use, and Google knows this. Although Matt Cutts, a senior figure at Google, stated that Facebook and Twitter signals aren’t factored into search results, it’s quite possible that Google and other search engines still analyse the data to pick up subtle cues which they can use. However, this is unlikely to be as simple as the number of Facebook “likes” or the number of Twitter followers.

Despite this, social networking can still play an important role in other ways. For example, incorporating social networking functionality in your website makes it much easier for visitors to instantly share your content (and thus build more exposure and possibly more backlinks).

Another key factor is personalised search, which Google now appears to have introduced to all Google users. This means that search results are now personalised based on a number of factors, including previous searches and information drawn from Google+ social networking accounts.

This creates a big headache for SEO specialists, as it means that search engine ranking results for a particular keyword are now not necessarily the same for everyone who searches for it. However, it also gives search engine optimisers another tool which they could harness to improve rankings.

4). Local SEO

Localised search results are now more popular than ever. Google understands that people don’t just use Google to find webpages of information, but also services and amenities local to them. That’s why, if you now search for a term like “pizza”, Google is likely to assess your IP address, work out your rough location and then display a list of pizza restaurants and takeaways. These are in turn ranked by things like location, reviews and other factors.

For many companies, getting good rankings in localised search results can now be far more lucrative than traditional rankings, especially when local search results are often displayed above other organic listings.

We ourselves target “SEO Barnsley”, which is a popular search term, as well as trying to rank for more national keywords such as “affordable SEO”.

Future SEO – what will the future of SEO be like?

It’s tricky to tell, but changes by Google and other search engines give us a glimpse into the direction that the giants of search may be taking in the years ahead.

1). Moving away from tactical SEO

Google has recently stopped website owners being able to analyse the keyword data of visitors arriving at their site via Google search. The PageRank scoring system is also being updated less and less frequently. What this could tell us is that Google is now trying to move away from the traditional approach to SEO, so we may find common SEO tactics (i.e. traditional on-site optimisation techniques and backlink building) becoming less effective over time.

2). Hummingbird

Google Hummingbird was a major change to Google’s search technology which was introduced in late 2013. Information about it is scarce, but one thing which has been noticed is that Google now has far more intelligence when it comes to understanding search queries instead of simply analysing each word and then finding the websites most relevant to those words. This could show that search engines will focus more heavily on understanding what searches actually mean in an attempt to present more relevant results.

3). Social media and personalised search

The direction Google and other search engines are taking suggests that social networking and personalised searches are going to become more intelligent as they grow in popularity. Many envisage a world where Google holds a considerable amount of data about your internet usage, social media habits and even your lifestyle to personalise searches more than ever before.

4). Semantic markup

This is in existence now (especially with the development of schema.org), but it’s likely to become far more important in future. In a nutshell, semantic markup provides a shared vocabulary which allows almost every device to understand content and data. It’s likely that search engines, websites, apps, social networking platforms, mobiles and many other devices will become intertwined, with data being freely shared between them, so correct semantic markup is key to ensuring that all of that data and content can be understood by the devices involved.
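As a rough illustration, the sketch below builds the kind of JSON-LD snippet a local business page might embed so that search engines can read it as structured data. The property names follow the schema.org LocalBusiness vocabulary; the business details themselves are invented for the example.

```python
# An illustrative schema.org "LocalBusiness" snippet built as JSON-LD, the format
# most commonly embedded in a page's HTML for search engines to read.
import json

local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Pizza Takeaway",        # hypothetical business
    "address": {
        "@type": "PostalAddress",
        "addressLocality": "Barnsley",
        "addressCountry": "GB",
    },
    "telephone": "+44 1226 000000",          # placeholder number
    "openingHours": "Mo-Su 17:00-23:00",
}

# The finished block would be embedded in the page as a
# <script type="application/ld+json"> element.
print('<script type="application/ld+json">')
print(json.dumps(local_business, indent=2))
print("</script>")
```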

5). Presenting information instead of web results

Google wants users to get the most relevant content, and sometimes it can achieve this by presenting that content itself instead of relying on other websites. That’s why Google now offers many contextual search results which go beyond traditional organic listings: for example, local results that surface nearby businesses, and even direct answers to questions shown right at the top of the screen, based on information Google has found on the web. Type in “when was Albert Einstein born?” and you get his date of birth displayed at the top of the page before any organic results are even shown. This is likely to become more and more common in future, to the point where many search queries can be satisfied without the user ever leaving the Google platform.

So what does this mean for me and my SEO?

It’s tricky to say for certain, but it’s important to keep in mind two key facts about the future of SEO:

1). Google is only going to become more intelligent, so it’s better to work with it instead of trying to manipulate it.

2). Google will change the way searches happen, perhaps drastically. Organic web-based results will always have their place, but the very nature of effective SEO (i.e. SEO which is going to give you the strongest return on investment in terms of sales) may become unrecognisable compared with what it is today. This isn’t just down to Google’s algorithms, but also to the way information is being created, shared and accessed across the globe.
