I’ve been promoting websites since 1998 – before we even used the term Search Engine Optimisation. Back then it was far from clear which of the many competing search engines (Lycos, Infoseek, AltaVista etc etc) would rise to the kind of dominance that the mighty Google enjoys today. …
A great article and interview appeared in The Times on Saturday 24th October 2015 with Miles Young – the Chief Executive of Ogilvy & Mather Worldwide.
In all the years I’ve been promoting websites (17 so far and counting), one of the things I get asked most often is to get a particular site to “the top of Google”. Now, over the years there have been various ideas as to what the “top” of Google actually is – with Top Ten, Top Five and Top Three often being regarded as very good runners-up to that coveted Number 1 slot. …
One of the fundamental tenets of running a successful PPC campaign is to try and improve Click Through Rate (CTR) – that is, if a higher CTR is one of your Key Performance Indicators (KPIs), of course.
So it was interesting to review a split test pair of ads recently where there was a clear winner in terms of higher CTR – 7.1% versus 4.3% – which would ordinarily have resulted in the ad with the lower CTR being Paused and replaced with a new one, in a process of “beat the control”.
However, this particular pair of ads had been set up purely to measure the success of different landing pages, and both featured exactly the same text, capitalisation, Display URL etc.
So what was it about one of the ads that was attracting such a different CTR from the other? Interestingly, I don’t actually know.
But it certainly highlights an issue that is often overlooked when analysing PPC data – sometimes, things just happen without any explanation covering them adequately. I’d certainly suggest that these random anomalies are few and far between, but they do exist, and they can skew results in specific aspects of a PPC campaign without ever being satisfactorily identified and corrected.
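One sanity check before pausing an apparent loser is whether the CTR gap is even statistically significant – on small impression counts, a large-looking gap can be pure chance. Here’s a quick sketch of a two-proportion z-test in Python (the impression counts are invented for illustration, not taken from the campaign above):

```python
from math import erf, sqrt

def ctr_significance(clicks_a, imps_a, clicks_b, imps_b):
    """Two-proportion z-test for a CTR split test.
    Returns the z-score and a two-tailed p-value."""
    p_a = clicks_a / imps_a
    p_b = clicks_b / imps_b
    pooled = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = sqrt(pooled * (1 - pooled) * (1 / imps_a + 1 / imps_b))
    z = (p_a - p_b) / se
    # two-tailed p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Invented impression counts to go with the 7.1% vs 4.3% CTRs above
z, p = ctr_significance(71, 1000, 43, 1000)
```

With these made-up numbers the gap does look significant – but, as the ads above went on to demonstrate, even a “significant” early result can drift back towards the mean.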
What needs to happen, of course, is for a longer-term view to be taken. Which is exactly what happened with the pair of ads referred to above: over time, their respective CTRs got closer together and have stayed pretty similar since.
It shows the advantage of a broader perspective, rather than jumping on every minute detail, when it comes to focusing on what’s really important – achieving your business goals, rather than simply improving irrelevant stats.
There are many things which can go into a successful SEO project – even in these days of content marketing and quality links. But overall, there are a few essential elements that I recommend for all sites in order for them to be search engine friendly.
1) Metadata
The forgotten relation in SEO. Whilst description tags and the like are not as important as they were when I first started optimising sites – way back in the “dark ages” of the late 1990s – there are still benefits to be had from getting your metadata right. What this requires is:
– Title Tag
Still one of the most important elements in “on page” SEO – your title tag should reflect the keywords that are likely to be used to search for the content of the page. These words will be used as the heading for your listing in the Google search results pages.
– Description Tag
Not used for ranking purposes, but this copy should appear underneath your listing’s heading in Google (which, as mentioned above, is determined by the Title Tag). The copy you use here can be very useful for convincing people to click through to your site, especially if it not only matches the keywords they might use, but also provides them with a good reason to visit your site. For example, you can promote your USPs here to give people a compelling reason to click the listing.
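Putting those two tags together, a page’s metadata might look something like this – an illustrative example only (the business name and copy are invented, and the character counts are just common rules of thumb):

```html
<head>
  <!-- Title: keyword-led, roughly 50-60 characters before Google truncates it -->
  <title>Shark Fishing Trips in Cornwall | Example Charters</title>
  <!-- Description: not a ranking factor, but it acts as your "ad copy" in the
       results page - aim for around 155 characters and give a reason to click -->
  <meta name="description" content="Family-run shark fishing charters from Cornwall. Small groups, all tackle provided and friendly expert skippers. Book your trip today.">
</head>
```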
2) Responsive Design
Google is very keen to see that your website will work well on any device (see this post on the Google Mobile Algorithm Update for more info). It may not be the case yet that your site will be penalised for not being responsive, but my bet is that this will start to happen before too long.
So you need to ensure that your site’s layout is adaptable to the device it is being viewed on – and not just from the point of view of it changing shape. You also need to ensure the site is navigable and user friendly, too, in order to keep people on the site.
3) Stickiness
A term that is not so much in favour nowadays compared to 10 years or so ago, stickiness refers to the capability of your site to keep people engaged whilst they’re visiting it. My recommendation is that you ensure each page has something of value on it that will keep people reading (or watching a video).
Google certainly doesn’t want to be sending its users to a site that only holds a visitor’s attention for a few seconds before they click back to try and find another, more suitable match for their search query. So you should populate your site with quality, useful content and keep those visitors (and Google) happy.
Whilst there are certainly lots of other things you can do – as previously mentioned – if you concentrate your efforts on the 3 issues listed above, you’ll already be doing pretty well in terms of SEO effectiveness.
Back in the day, people used to share links on the internet because they actually thought the site being linked to might provide some value to people who read it. (Crazy concept, eh?). People would go out of their way to build up Links pages and Resource pages that included all sorts of sites the site owner had found, that they believed to provide useful information.
Obviously, the fact that someone was visiting their site in the first place made the webmaster believe that person was probably interested in the subject matter of their site. For example, if a site was about shark fishing, the person who managed the site would be fairly likely to assume that anyone visiting was going to be interested in fishing for sharks. So they’d populate their Links page with sites that were likely to also prove of interest to anyone interested in this particular topic.
OK, so far so what?
Well, when the mighty Google came along (Google being one who must always be obeyed, when it comes to SEO), and decided that inbound links to a site would make up a substantial part of its fabled “algorithm”, the Links landscape changed forever – sort of…
Providing site links was no longer the preserve of friendly website owners trying to be helpful to their visitors; it now attracted a whole new breed of “link builders” whose sole purpose was to fool Google into ranking a particular site higher simply because of the volume of links it had attracted from other sites.
A Quick Explanation of why Inbound Links Matter to Google
The 2 students who set up Google in the first place decided they’d incorporate one of the central tenets of academic credibility into their search engine’s algorithm. That element being citations from recognised authorities. (Academic papers always include references to other papers from which the main arguments have been taken – which gives the paper credibility and authority, as it shows the content is based on other people’s work and so gives it an air of reliability and veracity).
So Google incorporated a large element of measuring inbound links to a particular web page in order to determine whether it was deemed to be a “good” page about a particular topic. The theory being that these inbound links would serve as “citations” from the original page to the one being linked to – thus providing evidence of the quality of this page.
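To make the “links as citations” idea concrete, here’s a toy sketch of the sort of calculation involved. This is the textbook power-iteration approach, not Google’s actual production algorithm, and the three-page web is obviously invented:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank via power iteration. `links` maps each page to the
    pages it links out to. An illustration only, not Google's algorithm."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1 / n for page in pages}
    for _ in range(iterations):
        new = {page: (1 - damping) / n for page in pages}
        for page, outlinks in links.items():
            for target in outlinks:
                # each page shares its rank equally among its outbound links
                new[target] += damping * rank[page] / len(outlinks)
        rank = new
    return rank

# Page C is "cited" by both A and B, so it ends up with the highest rank
ranks = pagerank({"A": ["C"], "B": ["C"], "C": []})
```

The page that attracts the most “citations” accumulates the most authority – which is exactly the behaviour link builders set out to exploit.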
The basic principle enshrined in the algorithm in the early days of Google is the main reason why SEO experts focused so much of their time on link building. And it’s still something that you can capitalise on today – so long as you go about it the right way.
Here’s a rundown of the 3 best types of links you can generate for your site – links that are Google-safe and actually beneficial for traffic and sales:
Resource Page Links
Utilised in the very earliest days of the internet to help people find sites that are likely to be of interest to them, Resource page links are still a very valuable link to attain. When a site has been put together that focuses on a particular topic, it makes sense for their Resource page links to be related to that topic in some way. These are exactly the type of links that Google likes to see – contextual, relevant links that are given for the purpose of helping a site’s visitors, rather than simply to aid with another site’s rankings. (Though, of course, this rankings boost is a happy side effect of having this type of link to your site).
So where can you find these Resource pages?
A good start is to use Google itself. Searches such as:
“topic keyword” + “resource page”
“topic keyword” + “resources”
“topic keyword” + “links page”
“topic keyword” + “links”
will provide plenty of sites that feature the kind of links page we’re after – each of which will be related to the “topic keyword”. (So obviously you should ensure the “topic keyword” you search for is relevant to the content of your site – the one you want to obtain a link for).
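If you’re researching several topics, those query patterns are trivial to generate programmatically – a quick Python sketch:

```python
def resource_page_queries(topic_keyword):
    """Build the four Google search queries listed above
    for a given topic keyword."""
    patterns = ["resource page", "resources", "links page", "links"]
    return [f'"{topic_keyword}" + "{pattern}"' for pattern in patterns]

queries = resource_page_queries("shark fishing")
# first entry: "shark fishing" + "resource page"
```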
You should make a list of sites that appear suitable, checking each one to see whether you think it is of decent enough quality – something you should be able to determine simply from your gut instinct. ie if you find yourself thinking “this site is a bit rubbish”, it’s probably not one that you should try to get a link from.
Contact each of the sites on your list with an email that compliments the site owner on their useful resource and introduces them to the fact that your own site is probably something their visitors could benefit from seeing – so could they please update their Resources page with a link to your site? The best way to succeed in generating multiple links is to have developed some great content – my suggestion being a collection of great blog articles, at least one of which you can draw the site owner’s attention to – as this is far more likely to find favour when it comes to being given a link.
Resource Page Broken Links
An even more successful method for generating links is to point out to the site owner where a link they have on their Resources page is broken – ie because the site no longer exists or the URL has changed.
Informing them that they have a broken link which needs to be removed / updated, plus providing a new site they can link to, is generally going to find favour with most site owners, as it helps them out and keeps their site looking current.
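Spotting those broken links can be automated. This is only a minimal sketch using Python’s standard library – a real checker would also follow redirects properly, retry, fall back to GET where HEAD is refused, and rate-limit its requests:

```python
import urllib.error
import urllib.request

def is_broken(url, timeout=5):
    """Return True if a URL looks dead: malformed, unreachable,
    or responding with a 4xx/5xx status."""
    try:
        req = urllib.request.Request(
            url, method="HEAD", headers={"User-Agent": "link-checker/0.1"}
        )
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status >= 400
    except (urllib.error.URLError, ValueError):
        return True
```

Run each outbound link on a Resources page through a function like this and you have a ready-made list of broken links to point out to the site owner.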
Natural Links
The ultimate in Google-safe links is to attract a link to your site simply because your content is so good that people genuinely want to link to it of their own accord. Anyone who’s tried any link building in the past may be sceptical of this approach, but it actually does work.
Site owners are keen to provide good content for their visitors, so being able to introduce them to something of real value – your quality content in the form of a blog post etc – is something they will be keen to do.
My experience here suggests that “topic keyword” bloggers are the most likely to want to link to your content, particularly if it’s complementary to something they have recently written about.
One of the inevitable consequences of the rise of the smart phone is the increasing frequency of local searches. It stands to reason that, if you’re looking for something particular by performing a search on your mobile phone, it’s quite likely to be because you want the thing you’re looking for to be nearby (at least, nearby to where you are currently).
As well as this aspect, it’s a well recognised phenomenon that the number of searches based on a local area has been increasing over the last 5 or so years, even amongst people using desktop or laptop machines to perform their search.
So, if your business can be seen to have a local element – ie the majority of your customers are likely to come from the local area – you really need to be optimising your site to take advantage. Here’s how I recommend you go about it:
1) Name Address Phone Number (NAP)
You need to ensure your company name (or the name you trade under if you’re eg self employed, which could, of course, be your actual name), physical address and the phone number of your organisation are displayed on your site.
Some people suggest you should have each of these details on every page of your site – for instance in a footer – and I would certainly go along with that, especially as it can also help in terms of enquiries if people have read to the end of a page and then see your contact phone number whilst they already may have the thought in their mind to contact you.
You can simply have the NAP info on your home page and Contact page if you wish, but I would recommend featuring it on every page if it fits in with your site design.
2) Google My Business
Google has an irritating habit of changing the name of its local business listings service on an ongoing basis (eg Google Places being a previous incarnation), but for now it seems to have settled on Google My Business as being the name it’s happy to use.
You should ensure that you have a Google My Business listing, and equally ensure that the Name Address and Phone number details recorded in it are identical to the ones you feature on your site.
Further populate this listing with info such as Opening Times and accepted methods of payment, as well as ensuring you are listed in a relevant category, to give Google as much info as you can that could help with your local search listing.
3) Don’t try to Fool Google
If your business doesn’t actually have a physical presence in a particular location (ie you don’t actually have a building / office / shop in the area you claim to), you are more than likely to be penalised for pretending you do. My advice is to only include NAP information for locations where you do have an actual physical presence. (Google is also wise to the fact that you may be listing yourself in eg a virtual office, when actually all that happens is mail gets redirected from there. Again, my advice would be to steer clear of this sort of thing).
4) Local Directory Listings
There are many internet directories that claim to have an influence on search engine rankings. However, only a small proportion of these will actually have any real bearing on where you might be listed in the results for a relevant local search.
I’ve found there are approximately 80-90 UK-based directories that can help your rankings in Google. You should ensure that you try to gain a listing in each of them, with some of the more well-known ones being:
As well as some lesser known sites, such as:
As with your Google My Business Listing, you need to make sure the Name Address and Phone number information is identical in each (that is, identical to the NAP info on your site).
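Checking that “identical” claim by hand is error-prone – stray punctuation and spacing differences creep into directory listings all too easily. Here’s a small sketch of the kind of normalisation you might apply before comparing listings (the business details are invented):

```python
import re

def normalise_nap(name, address, phone):
    """Normalise NAP details so listings can be compared like-for-like:
    lowercase, collapse whitespace, keep only digits in the phone number."""
    def clean(s):
        return re.sub(r"\s+", " ", s).strip().lower()
    return (clean(name), clean(address), re.sub(r"\D", "", phone))

# Invented example details
site = normalise_nap("Acme Ltd", "1 High Street,  Leeds", "0113 496 0000")
directory = normalise_nap("ACME LTD", "1 High Street, Leeds", "(0113) 4960000")
consistent = site == directory
```

Anything that still doesn’t match after normalising is a genuine inconsistency worth fixing in the directory listing.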
5) Inbound Links
Not surprisingly, considering we’re talking about search engine optimisation, inbound links to your site play an important factor. As well as the local directory listings as mentioned above, you should try to generate multiple quality inbound links from other sources.
In order to ensure local relevance, I recommend looking at things such as your local Chamber of Commerce or other business groups, for example the BNI or similar networking organisations. An inbound link from one of these sites will help associate your business with the local area you’re targeting.
You can also look to get links from local clients – eg from their blog by asking them to mention they’ve just had some work done or bought something from you, with the blog post including a link to your site.
And you mustn’t forget that, just because we’re focusing on Local SEO, it is still SEO we’re talking about, so quality links are always going to be an important factor.
6) On Page Optimisation
With SEO in mind, obviously it makes sense to ensure your “on page” factors are optimised as well as possible. These will include such things as the metadata on the pages (Title tag, Description tag), as well as the visible text content that site visitors can see.
Incorporating the geographical areas within the text of these elements will help from a local optimisation perspective, especially if you include specific local information, such as referencing surrounding town names and landmarks that are specific to the local vicinity.
7) Reviews
You should try to get as many favourable reviews as possible on the major review sites, such as Trustpilot or Reviews.co.uk. You can encourage satisfied customers to write reviews with a follow-up email once they’ve completed their purchase.
The rule of thumb for 3rd party review sites is – the more good reviews the better, so long as they are genuine and from a site that Google is likely to recognise as an authority, rather than one simply set up to assist with a site’s rankings.
8) Social Media
Similar to getting good reviews, encouraging your customers to talk about you on social media sites such as Facebook and Twitter will help with your optimisation efforts. Especially useful is if the people discussing your business are located nearby to your premises, as that again helps associate you with the local area.
I was chatting with a long term client the other day and we went through my own history with promoting websites. He remarked that my own involvement must somewhat mirror the overall history of the discipline, making “my history” almost equivalent to “the history” of search engine marketing!
Whilst I wouldn’t necessarily go as far as that, I do think he had a point. There aren’t many people in the UK who’ve concentrated on the same field for as long as I have; and while I think about it, there really won’t be that many people in the whole world! A fairly sobering thought in one way, but also quite a pleasing one in another.
So I’ve decided to run through what I’ve done in the search marketing area since I started all those years ago, by way of outlining how things have changed and yet in some respects remained the same.
1998 – Internet Evangelist
When I first started out, the majority of my time was spent in “evangelical” mode, trying to actually convince business owners that the internet was a worthwhile thing for them to be involved with. Seems ridiculous now, but I remember being told on many occasions that people were going to wait and see what happened before they committed to getting involved with the world wide web.
My own focus in the early days was on building websites – a natural consequence of businesses not yet having embraced the medium as being a useful sales and marketing tool – with several local businesses “benefitting” from my design expertise. I’ve put the word “benefitting” in speech marks, as my design skills left a little to be desired.
I actually started off using Microsoft FrontPage to build sites – a product which has been discontinued for almost 10 years now! My recollection of FrontPage 98 is that it was an extremely unwieldy piece of software that necessitated having 2 different screens open in order to generate anything resembling a web page. There was the FrontPage Editor element, which allowed you to generate code in what they laughably described as a WYSIWYG manner (What You See Is What You Get); which had to be linked up to the separate FrontPage Explorer – which was some kind of file management system. I never quite got my head around the reason why there had to be this separation, but I guess it was still the fairly early days when it came to web site development.
There was also a free version of FrontPage that was available with limited functionality. I did use this for the odd alteration to a website due to it being easier to use – as I remember, it didn’t have the clumsy double interface system of the full software.
Suffice to say, though, that my early experiments with web design were based on my having a sales and marketing background, rather than one in design. Whilst I stand by this even today – ie that websites are a sales tool first and foremost and should be built for this purpose, rather than simply to “look nice” – the element that was sadly lacking in my earliest designs was any sense that they had been put together by someone with a “good eye”.
I had, however, determined that any business that had a web presence would need some means of their potential customers finding them. To that end, each of the sites I originally built had been designed with search engines in mind. I certainly wasn’t using the term “search engine optimisation” in those days, having decided that “web promotion” was a conveniently descriptive term for what later became known as SEO.
Interestingly, a little while after I embarked on my internet marketing career, two PhD students at Stanford University incorporated a business that they decided to call “Google”. Stick around for the remainder of this post and you just may find I discuss that particular organisation again at some point…
AltaVista – King of the Search Engines
You may be familiar with the famous poem, Ozymandias, by Shelley? Essentially, the theme of the poem (written in the style of a sonnet) is of the way history treats great rulers and the empires they oversee.
The most famous lines from the poem, often quoted, are:
‘My name is Ozymandias, king of kings:
Look on my works, ye Mighty, and despair!’
In the poem, these words are inscribed on a statue of the great emperor – contrasting his belief in his own extraordinary importance with the fact that the modern-day traveller who comes across the statue has never even heard of him.
It may well be that AltaVista didn’t consider itself to be the “king of kings” when it came to search – but my recollection is that it was certainly the main one we were all trying to get our sites listed with in the late 90s.
Actually, when I first looked into how to get my sites listed in the main search engines for relevant terms, there were loads of them all trying to compete in the space. Nobody had quite worked out how to capitalise on the idea of search in terms of making big money, but they all knew they wanted to get a bigger piece of the pie for whenever the inevitable monetization light bulb went off.
Some of the search engines I was dealing with at the time that you may not be familiar with, or if you did know them have almost certainly forgotten about, were:
But the main one in those days was undoubtedly AltaVista.
Due to the number of different search engines that were around – and the possibility that any one of them could eventually come out on top as being the “go to” search engine of choice for web surfers* – my job was complicated by the fact that there appeared to be different rules for each one when it came to achieving good rankings.
* surfing the net is a phrase that has fallen out of fashion, but was all the rage in those days!
This was when I first learned about a phrase that people with even a passing interest in SEO have come across – “metadata”, a literal translation for which is “data about data”. If you imagine a library (which is essentially what a search engine is, though its contents are websites rather than physical books), there will need to be a system for identifying where the books are located in order that people can find them easily.
Books will, therefore, have descriptions on their back covers, which outline what’s contained within the pages. A table of contents and index section further assist with locating specific items of information within the book itself. These elements – the description, table of contents and index – are examples of metadata, as they are literally data that is about other data. (In this case, for example, the table of contents providing data – information – about where to find the rest of the data – information – in the book’s pages).
The basic elements of metadata from a web page perspective – Title tag, Description tag, Keywords tag – have actually remained the same in terms of what their function is. My job was to try and work out which search engine required more emphasis on which of these elements in order to try and gain a higher ranking in that particular engine’s listings.
Obviously, as time has gone on, the emphasis on the importance of metadata has shifted. The Keywords tag, for example, is almost entirely worthless nowadays – having started out as being the “secret ingredient” that could help a site rise to the top.
The Title tag is still an important factor – though not as much as it used to be – and the Description tag has been in and out of favour throughout the last 17 years, but now has found a pretty stable value for itself as a kind of default snippet of information that can be used in both search results and social media references.
And there are other elements that have had an influence on search positioning over the years, including some of the more recognised ones such as heading tags (eg <h1>, <h2>), and alt tags (the information included to describe the content of an image for those who either can’t see it or are browsing with images turned off).
But ultimately, the fundamental issue that has dominated search engine optimisation since even the days of AltaVista is that success is determined by the visible content of a web page. (I’ll be returning to the complementary issue of link building in a future post).
Monetizing Search with Paid Listings
The history of Paid Search is one I’m not going to go into in great depth, save that I was familiar with the original Paid Search engine, GoTo.com, when it launched in 1998. It subsequently morphed into Overture.com, which was eventually bought by Yahoo.
While we’re on the subject, Yahoo was a slightly different beast from the other search engines, in that it started out as a Directory. There was an editorial review procedure before your site could become listed – something that site owners eventually had to pay for, which was Yahoo’s original attempt at generating revenue from its service.
(Certain free competitor directories attempted to knock Yahoo off its Number 1 Directory perch – eg DMOZ, a service provided by the Open Directory Project, which was taken into the ownership of AOL after it had bought Netscape).
By far the biggest game changer, though, was the introduction of Google AdWords (told you I’d probably get round to mentioning them again) – which went on to become the massive moneyspinner that allows Google to enjoy the kind of multi billion dollar revenues it has today.
The earliest iteration of AdWords that I used was on a pay per impression model and one where you could pay to be in a particular slot on the page – obviously everyone trying to be at the top – and stay there for a month for a specific sum. This worked out to be a bit of a bargain for some people – especially in the early days of the “online Viagra” markets, as paying for that slot was a lot cheaper than having to pay each time someone clicked an ad. (I should point out that, whilst I was involved in plenty of this type of site for promotional activities, the law eventually changed and the online pharmacy industry became properly regulated around 10 years ago).
The big game changing factor for Google was its adoption of Overture’s basic concept of Pay Per Click. In the Overture model, anyone who wished to bid higher than the next highest bidder would end up at the top of the listings – a pure auction system.
What Google did, though, was to introduce the Ad Rank formula, which essentially rewarded the better performing ads with higher positions on the page, without those advertisers having to pay more than they wished to. (ie if your ad got more clicks, you would rise up the page and still not have to pay more than your competitors who had a lower Click Through Rate).
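In simplified form, that auction works something like this – note the real Ad Rank formula has more inputs (quality score itself bundles expected CTR, relevance and landing page experience), and the figures here are invented:

```python
def auction_order(ads):
    """Order ads by a simplified Ad Rank: max CPC bid * quality score."""
    return sorted(
        ads,
        key=lambda ad: ad["max_cpc"] * ad["quality_score"],
        reverse=True,
    )

ads = [
    {"name": "high bid, poor ad", "max_cpc": 2.00, "quality_score": 3},
    {"name": "lower bid, great ad", "max_cpc": 1.00, "quality_score": 8},
]
# The better-performing ad wins the top slot despite the lower bid
winner = auction_order(ads)[0]["name"]
```

This is the key departure from Overture’s pure auction: a well-crafted ad can outrank a deeper-pocketed competitor.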
So as well as SEO, PPC then became a very big part of the search marketing world from around 2003 onwards. Indeed, nowadays, with approximately 50% of all searches being performed on mobile phones, AdWords is increasingly important, as the organic (non-paid) listings are unlikely to even be seen on a mobile phone search results page, with the top listings all being AdWords.
I’ve been involved with the spending of hundreds of thousands of pounds on paid search over the years – perhaps most notably, when I ran the UK AdWords campaign for the launch of Microsoft Office 2007 – and have seen it develop from a relatively simple system into the enormously complex and sophisticated range of different services on offer in 2015. (Facebook and Twitter, for instance – still the new kids on the block in internet terms – both offer a PPC advertising service, with Facebook in particular providing a very targetable range of demographic settings).
Throughout that time, I’ve come to respect Pay Per Click advertising as the most sophisticated and controllable form of advertising yet devised. (A far cry from the days when advertisers would sit in a car near their billboard poster, counting the number of people who went past in order to determine how many “eyeballs” their advert was receiving!)
Red Herrings and Black Hats
Of course, with such a long history in search positioning, I’ve come across my fair share of “black hat” methods of trying to fool the search engines. (These methods supposedly get their name from the old TV westerns, where the “baddie” would usually be the one in the black hat, the good guy wearing a white hat).
And there have been plenty of “fool’s gold” magic bullets on offer over the years, whether they be doorway pages, random content generators, run of site links, article links, keyword stuffing etc etc.
But despite all the secret techniques, the solid foundation of what I was doing for SEO back in 1998 – solid metadata with quality visible content – is pretty much what Google still wants to see, which is why my clients are still being rewarded with good rankings and search traffic.