You might remember the plethora of different search engines which floated around the internet in the mid-nineties.
These search engines were pretty primitive by today’s standards. Since the web was a much smaller place back then, many ‘search engines’ were just Yellow Pages-style directories compiled by humans. Crawlers (aka spiders) arrived in 1993 – these would index the content of your site and follow any links you had to other sites so they could index those too.
It soon became clear to webmasters that ranking highly in search engines was very helpful for letting users find your site – and therefore very lucrative if you were offering products or services. Tricking those early crawlers wasn’t hard – there was a time when you could just repeat the keyword you were targeting on your homepage, over and over, and you’d end up in the number 1 slot.
A Wild Google Appeared
That all changed with the launch of Google in the late 90s. Google’s search engine technology was leaps and bounds ahead of the competition, thanks to several clever innovations in the way results were collated and ranked.
Google’s ‘PageRank’ algorithm worked on the logic that links demonstrated value. Generally speaking, people link to sites and pages which they find interesting or important – so a site with lots of inbound links from several different sources was likely to be more useful to internet users than a site with fewer or no inbound links.
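To make that logic concrete, here is a minimal toy sketch of the idea behind PageRank – not Google’s actual implementation, and the page names and damping value are illustrative assumptions: each page’s score is fed by the scores of the pages linking to it, so a page with more (and better-ranked) inbound links ends up with a higher score.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank. links: dict mapping each page to the pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}  # start with an even split
    for _ in range(iterations):
        # every page keeps a small base score regardless of links
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue  # simplification: dangling pages pass nothing on
            # a page shares its current score equally among the pages it links to
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical three-page web: A and C both link to B, so B ranks highest.
web = {"A": ["B"], "B": ["C"], "C": ["B"]}
ranks = pagerank(web)
```

In this tiny example, B attracts links from two sources and comes out on top, C (linked only by the well-ranked B) comes second, and A, with no inbound links at all, sits last – exactly the intuition described above.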
Soon, ‘link-building’ became synonymous with SEO. Webmasters began trying to place links on any sites with a high PageRank. Email inboxes would be swamped with link requests.
Some sites teamed up to swap links, while others would offer link placements for a fee. Paid link networks sprang up, producing thousands of sites which were of no use to anyone except the link farm owners.
The Algorithm Fights Back
And manipulative linking wasn’t the only tactic that so-called ‘blackhat’ SEO practitioners were using to push their low-relevance sites to the top of search results. Keyword stuffing and misleading meta tags were just two of the underhand methods which were freely exploited in the early noughties.
In late 2003, Google implemented a major update to its search algorithm. Referred to as ‘Florida’, it was one of the first in a long line of major updates aimed at thwarting spammers. Pretty much overnight, thousands of spammy keyword-stuffed sites disappeared from search results – but unfortunately, Florida ended up removing a lot of genuine pages too.
Site owners were furious. Many suspected the algorithm update was just a stunt to make users turn to AdWords, Google’s paid search platform which had been relaunched the previous year.
Nevertheless, it was a huge wake-up call to business owners and marketers, and a giant indicator of how big SEO had become.
Even Today, SEO Is An Uneven Landscape
A lot has changed for Google search since then. PageRank was folded into the search giant’s list of over 200 different ranking criteria, while the meta keywords tag lost its influence altogether. Google Analytics allowed webmasters to track the performance and optimisation of their sites more intuitively. Universal search arrived, giving users results for webpages, images, video, shopping, places and more on one page. Bing joined Yahoo as Google’s closest competitors.
However, one thing seems to remain constant for Google – the cycle of controversy. SEO ‘experts’ ignore Google’s guidance to create great content in order to get ranked, resort to trying to manipulate its algorithms instead, and then unleash their fury on Google when the latest update puts a stop to their plans. You only have to look at the vitriol surrounding the infamous Panda and Penguin updates to see how this trend has continued into the present day.
If you want to win the SEO game, it seems you have to concentrate on being the result that users want to see at the top of their list.
Need help with your SEO? Get in touch today for a free SEO review for your website.