In all the years I’ve been promoting websites (17 so far and counting), one of the things I’m asked most often is to get a particular site to “the top of Google”. Now, over the years there have been various ideas as to what the “top” of Google actually is – with Top Ten, Top Five and Top Three often being regarded as a very good runner-up to that coveted Number 1 slot.
One of the fundamental tenets of running a successful PPC campaign is to try and improve Click Through Rate (CTR) – that is, if a higher CTR is one of your Key Performance Indicators (KPIs), of course.
So it was interesting to review a split test pair of ads recently where there was a clear winner in terms of higher CTR – 7.1% versus 4.3% – which would ordinarily have resulted in the ad with the lower CTR being Paused and replaced with a new one, in a process of “beat the control”.
However, this particular pair of ads had been set up purely to measure the success of different landing pages, and both featured exactly the same text, capitalisation, Display URL etc.
So what was it about one of the ads that attracted so different a CTR from the other? Interestingly, I don’t actually know.
But it certainly highlights an issue that is often overlooked when analysing PPC data – sometimes things just happen, and no specific explanation covers them adequately. I’d certainly suggest that these random anomalies are few and far between, but they do occur, and they can skew results in specific aspects of a PPC campaign without ever being satisfactorily identified or corrected.
What needs to happen, of course, is that a longer-term view is taken. Which is exactly what happened with the pair of ads referred to above: over time, their respective CTRs got closer together and have stayed pretty similar since.
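For the statistically minded, there’s a quick way to sanity-check whether a CTR gap like 7.1% versus 4.3% could plausibly be random noise: a two-proportion z-test. The sketch below is illustrative only – the post doesn’t give the actual click and impression figures, so the numbers here are invented to show how sample size changes the picture.

```python
import math

def two_proportion_z(clicks_a, imps_a, clicks_b, imps_b):
    """Two-proportion z-test for a CTR difference.

    Returns the z statistic; |z| > 1.96 is roughly significant
    at the 95% confidence level.
    """
    p_a = clicks_a / imps_a
    p_b = clicks_b / imps_b
    # Pooled click-through rate under the "no real difference" hypothesis.
    p_pool = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / imps_a + 1 / imps_b))
    return (p_a - p_b) / se

# Hypothetical figures: 7.1% vs 4.3% CTR on only ~200 impressions each.
z_small = two_proportion_z(14, 197, 9, 209)
# The same CTRs observed on 5,000 impressions each.
z_large = two_proportion_z(355, 5000, 215, 5000)

print(round(z_small, 2))  # well under 1.96: could easily be noise
print(round(z_large, 2))  # well over 1.96: a genuine difference
```

In other words, the same percentage gap that looks dramatic on a small sample may be nothing more than chance – which is exactly why the longer-term view matters.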
It shows the advantage of a broader perspective, rather than jumping on every minute detail, when it comes to focusing on what’s really important – achieving your business goals, rather than simply improving irrelevant stats.
I was chatting with a long-term client the other day and we went through my own history with promoting websites. He remarked that my own involvement must somewhat mirror the overall history of the discipline, making “my history” almost equivalent to “the history” of search engine marketing!
Whilst I wouldn’t necessarily go as far as that, I do think he had a point. There aren’t many people in the UK who’ve concentrated on the same field for as long as I have; and come to think of it, there can’t be that many people in the whole world! A fairly sobering thought in one way, but also quite a pleasing one in another.
So I’ve decided to run through what I’ve done in the search marketing area since I started all those years ago, by way of outlining how things have changed and yet in some respects remained the same.
1998 – Internet Evangelist
When I first started out, the majority of my time was spent in “evangelical” mode, trying to actually convince business owners that the internet was a worthwhile thing for them to be involved with. Seems ridiculous now, but I remember being told on many occasions that people were going to wait and see what happened before they committed to getting involved with the world wide web.
My own focus in the early days was on building websites – a natural consequence of businesses not yet having embraced the medium as being a useful sales and marketing tool – with several local businesses “benefitting” from my design expertise. I’ve put the word “benefitting” in speech marks, as my design skills left a little to be desired.
I actually started off using Microsoft FrontPage to build sites – a product which has now been discontinued for almost 10 years! My recollection of FrontPage 98 is that it was an extremely unwieldy piece of software that necessitated having two different screens open in order to generate anything resembling a web page. There was the FrontPage Editor element, which allowed you to generate code in what they laughably described as a WYSIWYG (What You See Is What You Get) manner; this had to be linked up to the separate FrontPage Explorer, which was some kind of file management system. I never quite got my head around why there had to be this separation, but I guess it was still fairly early days for web site development.
There was also a free version of FrontPage that was available with limited functionality. I did use this for the odd alteration to a website due to it being easier to use – as I remember, it didn’t have the clumsy double interface system of the full software.
Suffice to say, though, that my early experiments with web design were based on my having a sales and marketing background, rather than one in design. Whilst I stand by this even today – ie that websites are a sales tool first and foremost and should be built for this purpose, rather than simply to “look nice” – the element that was sadly lacking in my earliest designs was any sense that they had been put together by someone with a “good eye”.
I had, however, determined that any business that had a web presence would need some means of their potential customers finding them. To that end, each of the sites I originally built had been designed with search engines in mind. I certainly wasn’t using the term “search engine optimisation” in those days, having decided that “web promotion” was a conveniently descriptive term for what later became known as SEO.
Interestingly, a little while after I embarked on my internet marketing career, two PhD students at Stanford University incorporated a business that they decided to call “Google”. Stick around for the remainder of this post and you just may find I discuss that particular organisation again at some point…
AltaVista – King of the Search Engines
You may be familiar with the famous poem, Ozymandias, by Shelley? Essentially, the theme of the poem (written in the style of a sonnet) is of the way history treats great rulers and the empires they oversee.
The most famous lines from the poem, often quoted, are:
‘My name is Ozymandias, king of kings:
Look on my works, ye Mighty, and despair!’
In the poem, these words are inscribed on a statue to the great emperor – contrasting how he believed himself to be extraordinarily important in the world, yet the modern day traveller who comes across the statue has never even heard of him.
It may well be that AltaVista didn’t consider itself to be the “king of kings” when it came to search – but my recollection is that it was certainly the main one we were all trying to get our sites listed with in the late 90s.
Actually, when I first looked into how to get my sites listed in the main search engines for relevant terms, there were loads of them all trying to compete in the space. Nobody had quite worked out how to capitalise on the idea of search in terms of making big money, but they all knew they wanted to get a bigger piece of the pie for whenever the inevitable monetization light bulb went off.
There were plenty of search engines I was dealing with at the time that you may not be familiar with – or, if you did know them, have almost certainly forgotten about.
But the main one in those days was undoubtedly AltaVista.
Due to the number of different search engines that were around – and the possibility that any one of them could eventually come out on top as being the “go to” search engine of choice for web surfers* – my job was complicated by the fact that there appeared to be different rules for each one when it came to achieving good rankings.
* surfing the net is a phrase that has fallen out of fashion, but was all the rage in those days!
This was when I first learned about a phrase that people with even a passing interest in SEO have come across – “metadata”, a literal translation for which is “data about data”. If you imagine a library (which is essentially what a search engine is, though its contents are websites rather than physical books), there will need to be a system for identifying where the books are located in order that people can find them easily.
Books will, therefore, have descriptions on their back covers, which outline what’s contained within the pages. A table of contents and index section further assist with locating specific items of information within the book itself. These elements – the description, table of contents and index – are examples of metadata, as they are literally data that is about other data. (In this case, for example, the table of contents providing data – information – about where to find the rest of the data – information – in the book’s pages).
The basic elements of metadata from a web page perspective – Title tag, Description tag, Keywords tag – have actually remained the same in terms of what their function is. My job was to try and work out which search engine required more emphasis on which of these elements in order to try and gain a higher ranking in that particular engine’s listings.
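To make the three tags concrete, here’s a small sketch that parses the metadata a search engine of that era would have read from a page’s head. The site name and tag text are invented for the example; the parsing uses Python’s standard library.

```python
from html.parser import HTMLParser

# A minimal <head> illustrating the three metadata elements discussed
# above (the business and wording are made up for illustration).
PAGE = """
<html><head>
<title>Acme Widgets - Hand-made Widgets in Yorkshire</title>
<meta name="description" content="Family-run maker of hand-made widgets.">
<meta name="keywords" content="widgets, yorkshire, hand-made widgets">
</head><body>...</body></html>
"""

class MetaExtractor(HTMLParser):
    """Collects the title and named meta tags from a page's head."""

    def __init__(self):
        super().__init__()
        self.meta = {}
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and "name" in attrs:
            self.meta[attrs["name"]] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.meta["title"] = data.strip()

parser = MetaExtractor()
parser.feed(PAGE)
print(parser.meta)
```

A late-90s engine would have weighted each of those three fields differently – which is precisely the guessing game described above.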
Obviously, as time has gone on, the emphasis on the importance of metadata has shifted. The Keywords tag, for example, is almost entirely worthless nowadays – having started out as being the “secret ingredient” that could help a site rise to the top.
The Title tag is still an important factor – though not as much as it used to be – and the Description tag has been in and out of favour throughout the last 17 years, but now has found a pretty stable value for itself as a kind of default snippet of information that can be used in both search results and social media references.
And there are other elements that have had an influence on search positioning over the years, including some of the more recognised ones such as heading tags (eg <h1>, <h2>), and alt tags (the information included to describe the content of an image for those who either can’t see it or are browsing with images turned off).
But ultimately, the fundamental issue that has dominated search engine optimisation since even the days of AltaVista is that success is determined by the visible content of a web page. (I’ll be returning to the complementary issue of link building in a future post).
Monetizing Search with Paid Listings
The history of Paid Search is one I’m not going to go into in great depth, save that I was familiar with the original Paid Search engine, GoTo.com, when it launched in 1998. It subsequently morphed into Overture.com, which was eventually bought by Yahoo.
While we’re on the subject, Yahoo was a slightly different beast from the other search engines, in that it started out as a Directory. There was an editorial review procedure before your site could become listed – something that site owners eventually had to pay for, which was Yahoo’s original attempt at generating revenue from its service.
(Certain free competitor directories attempted to knock Yahoo off its Number 1 Directory perch – eg DMOZ, a service provided by the Open Directory Project, which was taken into the ownership of AOL after it had bought Netscape).
By far the biggest game changer, though, was the introduction of Google AdWords (told you I’d probably get round to mentioning them again) – which went on to become the massive moneyspinner that allows Google to enjoy the kind of multi billion dollar revenues it has today.
The earliest iteration of AdWords that I used was on a pay-per-impression model, one where you could pay to be in a particular slot on the page – with everyone obviously trying to be at the top – and stay there for a month for a specific sum. This worked out to be a bit of a bargain for some people – especially in the early days of the “online Viagra” markets, as paying for that slot was a lot cheaper than having to pay each time someone clicked an ad. (I should point out that, whilst I was involved in promoting plenty of these types of site, the law eventually changed and the online pharmacy industry became properly regulated around 10 years ago).
The big game changing factor for Google was its adoption of Overture’s basic concept of Pay Per Click. In the Overture model, anyone who wished to bid higher than the next highest bidder would end up at the top of the listings – a pure auction system.
What Google did, though, was to introduce the Ad Rank formula, which essentially rewarded the better performing ads with higher positions on the page, without those advertisers having to pay more than they wished to. (ie if your ad got more clicks, you would rise up the page and still not have to pay more than your competitors who had a lower Click Through Rate).
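The difference between the two auction models can be shown in a few lines. This is a deliberately simplified sketch – the early Ad Rank formula is reduced here to bid × CTR, and the advertisers, bids and CTRs are invented for illustration.

```python
# Hypothetical advertisers: (name, max CPC bid in pence, observed CTR).
ads = [
    ("Advertiser A", 120, 0.015),
    ("Advertiser B", 80, 0.040),
    ("Advertiser C", 100, 0.020),
]

# Overture's pure auction: the highest bidder takes the top slot.
by_bid = sorted(ads, key=lambda ad: ad[1], reverse=True)

# Google's Ad Rank (simplified here as bid x CTR) rewards the
# better-performing ad, even when its bid is the lowest.
by_ad_rank = sorted(ads, key=lambda ad: ad[1] * ad[2], reverse=True)

print([name for name, bid, ctr in by_bid])      # A tops the pure auction
print([name for name, bid, ctr in by_ad_rank])  # B tops on Ad Rank
```

Advertiser B, with the lowest bid but by far the best CTR, ends up at the top of the page – which is exactly the incentive that made writing better ads worthwhile.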
So as well as SEO, PPC then became a very big part of the search marketing world from around 2003 onwards. Indeed, nowadays, with approximately 50% of all searches being performed on mobile phones, AdWords is increasingly important, as the organic (non-paid) listings are unlikely to even be seen on a mobile phone search results page, with the top listings all being AdWords.
I’ve been involved with the spending of hundreds of thousands of pounds on paid search over the years – perhaps most notably, when I ran the UK AdWords campaign for the launch of Microsoft Office 2007 – and have seen it develop from a relatively simple system into the enormously complex and sophisticated range of different services on offer in 2015. (Facebook and Twitter, for instance – still the new kids on the block in internet terms – both offer a PPC advertising service, with Facebook in particular providing a very targetable range of demographic settings).
Throughout that time, I’ve come to respect Pay Per Click advertising as the most sophisticated and controllable form of advertising yet devised. (A far cry from the days when advertisers would sit in a car near their billboard poster, counting the number of people who went past in order to determine how many “eyeballs” their advert was receiving!)
Red Herrings and Black Hats
Of course, with such a long history in search positioning, I’ve come across my fair share of “black hat” methods of trying to fool the search engines. (These methods supposedly get their name from the old TV westerns, where the “baddie” would usually be the one in the black hat, the good guy wearing a white hat).
And there have been plenty of “fool’s gold” magic bullets on offer over the years, whether they be doorway pages, random content generators, run-of-site links, article links, keyword stuffing and so on.
But despite all the secret techniques, the solid foundation of what I was doing for SEO back in 1998 – solid metadata with quality visible content – is pretty much what Google still wants to see, which is why my clients are still being rewarded with good rankings and search traffic.