
Internet Marketing Blog


Throughout 2016, I was working on a project which prevented me from blogging here on the Intergration site, due to a potential conflict of interest.

I’ve also recently evolved my own service offering, such that most of the blog posts that would previously have appeared here will now be posted on my consultancy services site. You can see the first one there – AdWords Copywriter – about why your AdWords text should be written by a copywriter rather than a statistics specialist.

I anticipate posting on this blog again, but probably in a different format from before, with the majority of my future service-related posts going on the consultancy site.




The “SEO is Dead” Myth


I’ve been promoting websites since 1998 – before we even used the term Search Engine Optimisation. Back then it was far from clear which of the many search engines competing for your searching time – Lycos, Infoseek, AltaVista and others – would rise to enjoy the kind of dominance the mighty Google now has.

Read More

Why Should You be Top of Google?


In all the years I’ve been promoting websites (17 so far and counting), one of the things I get asked most often is to get a particular site to “the top of Google”. Now, over the years there have been various ideas as to what the “top” of Google actually is – with Top Ten, Top Five and Top Three often being regarded as a very good runner-up to that coveted Number 1 slot.

Read More

Some Essentials for SEO


There are many things which can go into a successful SEO project – even in these days of content marketing and quality links. But overall, there are a few essential elements that I recommend for all sites in order for them to be search engine friendly.

1) Metadata

The forgotten relation in SEO. Whilst description tags and the like are not as important as they were when I first started optimising sites – way back in the “dark ages” of the late 1990s – there are still benefits to be had from getting your metadata correct in terms of best practices for SEO. What this requires is:

– Title Tag

Still one of the most important elements in “on page” SEO – your title tag should reflect the keywords that are likely to be used to search for the content of the page. These words will be used as the heading for your listing in the Google search results pages.

– Description Tag

Not used for ranking purposes, but its text should appear underneath the heading in Google (the heading itself being determined by the Title tag mentioned above). The copy you use here can be very useful for convincing people to click through to your site, especially if it not only matches the keywords they might use but also gives them a good reason to visit. For example, you can promote your USPs here to give people a compelling reason to click the listing.
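As a minimal sketch of those two tags in a page’s markup – the page title and description text here are purely illustrative, not taken from any real site:

```html
<head>
  <!-- Title tag: used as the heading of your Google listing,
       so lead with the keywords people are likely to search for -->
  <title>Wedding Photographer in Leeds | Example Studio</title>

  <!-- Description tag: not a ranking factor, but typically shown as
       the snippet under that heading - write it to earn the click -->
  <meta name="description" content="Wedding photography across Yorkshire.
    Fixed prices, fast turnaround - see the portfolio and book a free consultation.">
</head>
```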

2) Responsive Design

Google is very keen to see that your website will work well on any device (see this post on the Google Mobile Algorithm Update for more info). It may not be the case yet that your site will be penalised for not being responsive, but my bet is that this will start to happen before too long.

So you need to ensure that your site’s layout is adaptable to the device it is being viewed on – and not just from the point of view of it changing shape. You also need to ensure the site is navigable and user friendly, too, in order to keep people on the site.
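At its simplest, a responsive layout combines a viewport declaration with CSS media queries. A bare-bones illustration – the breakpoint and class name are hypothetical, and a real site would need rather more than this:

```html
<head>
  <!-- Tell mobile browsers to render at the device width
       rather than a zoomed-out desktop width -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    .sidebar { float: right; width: 30%; }

    /* On narrow screens, stack the sidebar under the main
       content instead of floating it alongside */
    @media (max-width: 600px) {
      .sidebar { float: none; width: 100%; }
    }
  </style>
</head>
```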

3) Stickiness

A term that is not so much in favour nowadays compared to 10 years or so ago, stickiness refers to the capability of your site to keep people engaged whilst they’re visiting it. My recommendation is that you ensure each page has something of value on it that will keep people reading (or watching a video).

Google certainly doesn’t want to be sending its users to a site that only holds a visitor’s attention for a few seconds before they click back to try and find another, more suitable match for their search query. So you should populate your site with quality, useful content and keep those visitors (and Google) happy.

Whilst there are certainly lots of other things you can do – as previously mentioned – if you concentrate your efforts on the 3 issues listed above, you’ll already be doing pretty well in terms of SEO effectiveness.

My History of Search Engine Marketing


I was chatting with a long term client the other day and we went through my own history with promoting websites. He remarked that my own involvement must somewhat mirror the overall history of the discipline, making “my history” almost equivalent to “the history” of search engine marketing!

Whilst I wouldn’t necessarily go as far as that, I do think he had a point. There aren’t many people in the UK who’ve concentrated on the same field for as long as I have; and while I think about it, there really won’t be that many people in the whole world! A fairly sobering thought in one way, but also quite a pleasing one in another.

So I’ve decided to run through what I’ve done in the search marketing area since I started all those years ago, by way of outlining how things have changed and yet in some respects remained the same.

1998 – Internet Evangelist

When I first started out, the majority of my time was spent in “evangelical” mode, trying to actually convince business owners that the internet was a worthwhile thing for them to be involved with. Seems ridiculous now, but I remember being told on many occasions that people were going to wait and see what happened before they committed to getting involved with the world wide web.

My own focus in the early days was on building websites – a natural consequence of businesses not yet having embraced the medium as being a useful sales and marketing tool – with several local businesses “benefitting” from my design expertise. I’ve put the word “benefitting” in speech marks, as my design skills left a little to be desired.

I actually started off using Microsoft FrontPage to build sites – a product which has been discontinued for almost 10 years now! My recollection of FrontPage 98 is that it was an extremely unwieldy piece of software that necessitated having 2 different screens open in order to generate anything resembling a web page. There was the FrontPage Editor element, which allowed you to generate code in what they laughably described as a WYSIWYG manner (What You See Is What You Get); this had to be linked up to the separate FrontPage Explorer, which was some kind of file management system. I never quite got my head around why there had to be this separation, but I guess those were still fairly early days for website development.

There was also a free version of FrontPage that was available with limited functionality. I did use this for the odd alteration to a website due to it being easier to use – as I remember, it didn’t have the clumsy double interface system of the full software.

Suffice to say, though, that my early experiments with web design were based on my having a sales and marketing background, rather than one in design. Whilst I stand by this even today – ie that websites are a sales tool first and foremost and should be built for this purpose, rather than simply to “look nice” – the element that was sadly lacking in my earliest designs was any sense that they had been put together by someone with a “good eye”.

I had, however, determined that any business that had a web presence would need some means of their potential customers finding them. To that end, each of the sites I originally built had been designed with search engines in mind. I certainly wasn’t using the term “search engine optimisation” in those days, having decided that “web promotion” was a conveniently descriptive term for what later became known as SEO.

Interestingly, a little while after I embarked on my internet marketing career, two PhD students at Stanford University incorporated a business that they decided to call “Google”. Stick around for the remainder of this post and you just may find I discuss that particular organisation again at some point…

AltaVista – King of the Search Engines

You may be familiar with Shelley’s famous poem, Ozymandias. Essentially, the theme of the poem (written as a sonnet) is the way history treats great rulers and the empires they oversee.

The most famous lines from the poem, often quoted, are:

‘My name is Ozymandias, king of kings:
Look on my works, ye Mighty, and despair!’

In the poem, these words are inscribed on a statue to the great emperor – contrasting how he believed himself to be extraordinarily important in the world, yet the modern day traveller who comes across the statue has never even heard of him.

It may well be that AltaVista didn’t consider itself to be the “king of kings” when it came to search – but my recollection is that it was certainly the main one we were all trying to get our sites listed with in the late 90s.

Actually, when I first looked into how to get my sites listed in the main search engines for relevant terms, there were loads of them all trying to compete in the space. Nobody had quite worked out how to capitalise on the idea of search in terms of making big money, but they all knew they wanted to get a bigger piece of the pie for whenever the inevitable monetization light bulb went off.

Some of the search engines I was dealing with at the time that you may not be familiar with – or, if you did know them, have almost certainly forgotten – were:

Northern Light
UK Search
AOL Search

But the main one in those days was undoubtedly AltaVista.

Due to the number of different search engines that were around – and the possibility that any one of them could eventually come out on top as being the “go to” search engine of choice for web surfers* – my job was complicated by the fact that there appeared to be different rules for each one when it came to achieving good rankings.

* surfing the net is a phrase that has fallen out of fashion, but was all the rage in those days!

This was when I first learned about a phrase that people with even a passing interest in SEO have come across – “metadata”, a literal translation for which is “data about data”. If you imagine a library (which is essentially what a search engine is, though its contents are websites rather than physical books), there will need to be a system for identifying where the books are located in order that people can find them easily.

Books will, therefore, have descriptions on their back covers, which outline what’s contained within the pages. A table of contents and index section further assist with locating specific items of information within the book itself. These elements – the description, table of contents and index – are examples of metadata, as they are literally data that is about other data. (In this case, for example, the table of contents providing data – information – about where to find the rest of the data – information – in the book’s pages).

The basic elements of metadata from a web page perspective – Title tag, Description tag, Keywords tag – have actually remained the same in terms of what their function is. My job was to try and work out which search engine required more emphasis on which of these elements in order to try and gain a higher ranking in that particular engine’s listings.

Obviously, as time has gone on, the emphasis on the importance of metadata has shifted. The Keywords tag, for example, is almost entirely worthless nowadays – having started out as being the “secret ingredient” that could help a site rise to the top.

The Title tag is still an important factor – though not as much as it used to be – and the Description tag has been in and out of favour throughout the last 17 years, but now has found a pretty stable value for itself as a kind of default snippet of information that can be used in both search results and social media references.

And there are other elements that have had an influence on search positioning over the years, including some of the more recognised ones such as heading tags (eg <h1>, <h2>), and alt tags (the information included to describe the content of an image for those who either can’t see it or are browsing with images turned off).
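To make those elements concrete, here is an illustrative fragment showing where each one lives in a page’s markup. All the values are invented for the example, and the Keywords tag is included purely for historical interest:

```html
<head>
  <title>Handmade Oak Furniture | Example Workshop</title>
  <meta name="description" content="Bespoke oak tables and chairs, handmade in the UK.">
  <!-- Largely worthless nowadays, but once touted as
       the "secret ingredient" of late-90s optimisation -->
  <meta name="keywords" content="oak furniture, handmade tables">
</head>
<body>
  <!-- Heading tags signal the page's topic structure -->
  <h1>Handmade Oak Furniture</h1>
  <h2>Tables</h2>
  <!-- Alt text describes the image for those who can't see it
       or are browsing with images turned off -->
  <img src="oak-table.jpg" alt="Six-seat handmade oak dining table">
</body>
```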

But ultimately, the fundamental issue that has dominated search engine optimisation since even the days of AltaVista is that success is determined by the visible content of a web page. (I’ll be returning to the complementary issue of link building in a future post).

Monetizing Search with Paid Listings

The history of Paid Search is one I’m not going to go into in great depth, save to say that I was familiar with the original paid search engine when it launched in 1998. It subsequently morphed into Overture, which was eventually bought by Yahoo.

While we’re on the subject, Yahoo was a slightly different beast from the other search engines, in that it started out as a Directory. There was an editorial review procedure before your site could become listed – something that site owners eventually had to pay for, which was Yahoo’s original attempt at generating revenue from its service.

(Certain free competitor directories attempted to knock Yahoo off its Number 1 Directory perch – eg DMOZ, a service provided by the Open Directory Project, which was taken into the ownership of AOL after it had bought Netscape).

By far the biggest game changer, though, was the introduction of Google AdWords (told you I’d probably get round to mentioning them again) – which went on to become the massive moneyspinner that allows Google to enjoy the kind of multi billion dollar revenues it has today.

The earliest iteration of AdWords that I used was on a pay per impression model, one where you could pay to occupy a particular slot on the page – obviously everyone was trying to be at the top – and stay there for a month for a specific sum. This worked out to be a bit of a bargain for some people – especially in the early days of the “online Viagra” markets, as paying for that slot was a lot cheaper than having to pay each time someone clicked an ad. (I should point out that, whilst I was involved in plenty of these types of site for promotional activities, the law eventually changed and the online pharmacy industry became properly regulated around 10 years ago).

The big game changing factor for Google was its adoption of Overture’s basic concept of Pay Per Click. In the Overture model, anyone who wished to bid higher than the next highest bidder would end up at the top of the listings – a pure auction system.

What Google did, though, was to introduce the Ad Rank formula, which essentially rewarded the better performing ads with higher positions on the page, without those advertisers having to pay more than they wished to. (ie if your ad got more clicks, you would rise up the page and still not have to pay more than your competitors who had a lower Click Through Rate).
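A simplified illustration of the idea – this is the textbook “bid × quality” form rather than Google’s full modern formula, and the numbers are entirely invented:

```text
Advertiser A: max bid £1.00, strong ad (quality score 8) -> Ad Rank 8.0
Advertiser B: max bid £2.00, weak ad   (quality score 3) -> Ad Rank 6.0

A outranks B despite bidding half as much, because A's ad earns
more clicks - quality is rewarded, not just the size of the budget.
```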

So as well as SEO, PPC then became a very big part of the search marketing world from around 2003 onwards. Indeed, nowadays, with approximately 50% of all searches being performed on mobile phones, AdWords is increasingly important, as the organic (non-paid) listings are unlikely to even be seen on a mobile phone search results page, with the top listings all being AdWords.

I’ve been involved with the spending of hundreds of thousands of pounds on paid search over the years – perhaps most notably, when I ran the UK AdWords campaign for the launch of Microsoft Office 2007 – and have seen it develop from a relatively simple system into the enormously complex and sophisticated range of different services on offer in 2015. (Facebook and Twitter, for instance – still the new kids on the block in internet terms – both offer a PPC advertising service, with Facebook in particular providing a very targetable range of demographic settings).

Throughout that time, I’ve come to respect Pay Per Click advertising as the most sophisticated and controllable form of advertising yet devised. (A far cry from the days when advertisers would sit in a car near their billboard poster, counting the number of people who went past in order to determine how many “eyeballs” their advert was receiving!)

Red Herrings and Black Hats

Of course, with such a long history in search positioning, I’ve come across my fair share of “black hat” methods of trying to fool the search engines. (These methods supposedly get their name from the old TV westerns, where the “baddie” would usually be the one in the black hat, the good guy wearing a white hat).

And there have been plenty of “fool’s gold” magic bullets on offer over the years – doorway pages, random content generators, run-of-site links, article links, keyword stuffing and the like.

But despite all the secret techniques, the solid foundation of what I was doing for SEO back in 1998 – solid metadata with quality visible content – is pretty much what Google still wants to see, which is why my clients are still being rewarded with good rankings and search traffic.

The Benefit of Longer Blog Posts


Having worked in internet marketing for many years, with most of that time spent studying and practising the art of search engine optimisation, I’m very familiar with the phrase “content is king”. This was the mantra of every SEO practitioner way back in the early 2000s, before everyone became obsessed with link building as a result of the dominance of Google. (One of Google’s main criteria for judging a web page relevant to a particular search query is the quantity and quality of the inbound hyperlinks that the page has attracted).

In recent years, the focus on content has made something of a comeback – especially following the Armageddon-style devaluing of particular types of links that occurred with the alterations to Google’s algorithm known as the “Penguin” updates. So we now see a vast array of “content marketing experts” who recommend the development of quality content that can then be promoted around the internet. (The primary purpose being, of course, to attract more inbound links in order to satisfy Google in this regard).

Indeed, web pages that have attracted a large number of inbound links but don’t appear to have much going on in the way of text content are certainly well represented when it comes to Google rankings. A good example of this is the site – which always appears well ranked for relevant phrases such as “currency exchange” and “currency converter” – though there are fewer than 100 words of text on its home page.

So you wouldn’t be off the mark in assuming that links are still “where it’s at” when it comes to search engine optimisation for good rankings. However, it is also the case that quality text content is a factor in determining a page’s ranking position, as Google is keen to reward a searcher with something of value when they click on one of its search results. Given that searches are performed using words, it is only natural that the pages most likely to be recommended by Google are those whose content includes and expands upon the words in the search phrase being used.

In the olden days (late 90s, early 2000s), keyword stuffing of web pages was a prevalent method of achieving good ranking results, with many search engine specialists recommending a particular “keyword density” in a page’s text content in order to convince Google that your web page was a good match for the search phrase or phrases being targeted.

And certainly it is still true that featuring a specific set of words in a particular order is more likely to see your page being featured in the results for a search that uses those same words in the same order. But nowadays there is much more to it than that, with keyword density and keyword stuffing essentially being consigned to the past.

Which all adds up to a very good reason to make your blog posts longer than what appears to be the de facto standard of around 300 words. Writing about a particular subject in an expanded fashion will definitely assist with your search engine placement efforts, so it should certainly be something you consider when putting posts together.

And as an SEO expert myself, I have no problem advising people to write longer blog posts for that reason alone.

There is another very good reason to write longer posts (that just so happens to also assist with good rankings), in that longer posts are much more likely to be read for a longer time period than shorter posts. Not exactly rocket science, this one, as clearly it takes longer to read something of 1000 words than it does to read something of a couple of hundred words. The length of time someone stays on your page having arrived at it via a Google search is one of the known elements for helping a page to climb the rankings, so encouraging people to stay on the page for longer is another beneficial result from an SEO perspective.

But it’s not just good for rankings to encourage people to stay on your site for longer, it’s also good for conversions, as you can get your message across multiple times in the same post. This should help to persuade the people who are reading the post that your message is a worthwhile one, which is obviously the whole point of saying something in a blog post in the first place.

My recommendation is to aim for posts of 800+ words. So a post of 1000 words – such as this one – is not only a good bet, but it also appeals to my liking for rounded figures. (1000 words thus being a more pleasing length than 800!).

Even better still is a post of greater than 1000 words. (The entry level for my ongoing search engine optimisation service actually includes an 800-2000 word blog post each month, alongside Links Outreach, which is outside the scope of this particular article, but is no doubt something I’ll be returning to in the future).

There have been several studies of Google’s top ten rankings (though I’ll be returning to the idea of whether there even is such a thing as a “top ten ranking” at a later stage!) that indicate blog posts of 800-2000 words dominate the results pages. And my own research does back up these findings a little. I wouldn’t go so far as to say it is essential to write posts of this length, though, as I have also been responsible for creating many blog posts of 1000 words – and sometimes less – that enjoy similar rankings success.

To sum up, my suggestion is to write 800-2000 word blog posts wherever possible – both for SEO purposes and to drive your message home more forcefully than a shorter post could. But if you can only think of enough to say to fill a 300-400 word post, that’s fine too – there’s nothing more likely to make your site visitors bounce straight away from your page than padding and waffle. (A sure way to get Google to downgrade your page in its listings, too).

So to revise the well-known phrase “content is King”, I’d add in an extra word, which is essentially what I’m promoting here – “quality content is King”. If you can write 800-2000 words of high quality content, your blog posts will always achieve their goals.

Google Mobile Search Algorithm Update


Google recently announced a change to its mobile search algorithm that affects ALL websites, not just mobile sites:

Essentially, what it means is that, from April 21st this year, Google will be downgrading sites in the search results shown on mobile devices (ie to anyone searching Google on a mobile phone) if they are not “mobile friendly” – and, consequently, upgrading sites that are.

So if your site isn’t suitable – in Google’s opinion – to be viewed on a mobile device, it will have no chance of appearing in Google’s search results on mobile devices. Whilst you may think this isn’t so important for you, as most of your traffic comes from conventional desktop / laptop machines, you should bear in mind that the definition of a “mobile device” includes tablets – something that more and more people are using for their general access to the internet. Plus, mobile phones account for an increasing share of all internet traffic, so you will be leaving a large proportion of internet users out of your potential market if you don’t have a mobile friendly site.

How do I Know if my Site is Mobile Friendly?

Google have provided an online tool for checking your site to determine how mobile friendly it is:

They have also produced a set of guidelines that you should understand in order to make sure your site passes their test:

As well as providing for different screen sizes, it should be noted that their guidelines include such issues as ensuring the clickable links are not too close together.

Mobile Version of Site or Responsive Design?

There are currently 2 options for ensuring your site is mobile friendly:

1) Set up a separate mobile version of the site, with device recognition capability so the correct version of the site is sent to the correct device. (ie people viewing on a desktop PC will see the “normal” version of the site, people viewing on a tablet or a mobile phone will see the “mobile” version of the site).

2) Make the site “responsive”, such that it changes design and layout based on the device being used to view it. (The page you’re currently reading now is “responsive”, as it adjusts itself to fit the screen size you’re using, rather than sending you to a separate mobile version).
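For option 1, Google’s documented approach for separate mobile URLs is to annotate the two versions so it understands they are the same page. A sketch – the example URLs are hypothetical:

```html
<!-- On the desktop page (www.example.com/page):
     point mobile browsers at the mobile equivalent -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the mobile page (m.example.com/page):
     declare the desktop page as the canonical version -->
<link rel="canonical" href="https://www.example.com/page">
```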

My Recommendation

Whilst Google is currently saying it is fine to have 2 versions of your site aimed at different devices, my recommendation is that you aim for your site being responsive, rather than having different versions.

My reasoning is based on years of experience with Google and the fact that they are generally opposed to having different versions of the same site in their index.

Certainly, if you don’t wish to make your site fully responsive just yet – for reasons of eg budget, future redesign schedule – then you should be OK in the short term to simply have a mobile version of the site that works alongside the main site.

However, for long term peace of mind, I’d definitely recommend making your site responsive, in order to pre-empt the negative effect on all rankings (mobile and non-mobile) that I predict non-responsive sites will suffer in the future.

One thing to bear in mind here is that, not only will your site need to adjust its layout to fit the smaller device screens, it should also make sense from a marketing design point of view – ie you should ensure your name, contact details, main sales message etc are visible on the home screen for anyone viewing your site, rather than being shunted away as a result of the responsive design parameters.

Get in touch to see how I can help optimise your website in line with the updated algorithm – Contact

Google Update Chatter


There’s often a lot of talk amongst SEO professionals about Google updates and potential updates, with many of them seemingly spending their whole working day analysing the tiniest variations in the data they see to determine whether “the big G” has made an update or not.

Indeed, when Google first became the dominant force in the world of SEO, there were multiple websites and forums dedicated to trying to detect what used to be known as the “Google Dance” – a regular revising of the Search Engine Results Pages that saw a site rise or fall in the rankings.

Things have definitely changed in some respects – the Google Dance being a thing of the past due to ongoing algorithm changes and non-universal SERP rankings – but not in others, as there are still lots of people determined to be first to spot a Google algorithm change and post about it.

Each to their own, but with Google making almost continual changes, I prefer to stick to the tried and tested principle that has worked throughout the history of search engines:

Content is King

– a statement that will attract derision from those who are obsessed with minute data variations, but will see the most successful internet marketers nodding their heads in agreement.

Internet Marketing hits the Mainstream


With the announcement of this year’s winner of the TV reality show The Apprentice, it seems that internet marketing has become somewhat more mainstream than it used to be. I received no fewer than 4 text messages from people asking me if Mark Wright, the newly-crowned winner, was planning to set up a business operating in the same field as me. And yes, it seems that Lord Alan Sugar’s new business partner is going to be trying to develop a business that offers internet marketing assistance to small and medium sized companies.

[Photo of Mark Wright – credit: BBC web page about The Apprentice]

So is it possible to develop a truly scaleable internet marketing business in the manner Mark Wright wishes to? I’d have to say “possibly” is the only answer I can provide here. My own experience suggests that the personal touch is a key element in any relationship between business owners and their internet marketing consultants. Sub contractors have their place and can be useful, but the main point of contact needs to be up to speed with all the latest developments in SEO (to help get sites “up the pecking order” as Lord Sugar calls it) and other digital marketing elements.

I certainly wish Mark well and hope he can justify the investment he’s secured, but I just wonder if the model will have to change somewhat in order to be successful. In the same way that I have to invest a lot of my own time in ensuring my clients are well served by my efforts, I feel that Mark may have to surround himself with a team of equally experienced experts – a team that could ultimately prove too expensive for anything other than a boutique digital agency targeting higher value clients, rather than the small business end of the market he wants to pitch into from the outset. (Small and medium sized businesses comprise the market I’m extremely familiar and successful with myself).

Seeing Search Engine Optimisation being talked about on one of the country’s favourite TV shows did make me think that we’ve come a long way from the days of 1998, when I first started in the industry and had to spend most of my time evangelically promoting search engines as a business tool, as back then people were far from convinced that the internet was here to stay!

So congratulations to Mark Wright, this year’s Apprentice – and a Merry Christmas and Happy New Year to one and all.