
Link Building Outreach Tips


Back in the day, people used to share links on the internet because they actually thought the site being linked to might provide some value to people who read it. (Crazy concept, eh?). People would go out of their way to build up Links pages and Resource pages featuring all sorts of sites the site owner had found and believed to provide useful information.

Obviously, the fact that someone was visiting their site in the first place made the webmaster believe that person was probably interested in the subject matter of their site. For example, if a site was about shark fishing, the person who managed the site would be fairly likely to assume that anyone visiting was going to be interested in fishing for sharks. So they’d populate their Links page with sites that were likely to also prove of interest to anyone interested in this particular topic.

OK, so far so what?

Well, when the mighty Google came along (Google being one who must always be obeyed, when it comes to SEO), and decided that inbound links to a site would make up a substantial part of its fabled “algorithm”, the Links landscape changed forever – sort of…

Providing site links was now no longer the preserve of friendly website owners trying to be helpful to their site visitors; it now attracted a whole new breed of “link builders” whose sole purpose was to try and fool Google into thinking a particular site should be ranked higher simply down to the volume of links it had attracted from other sites.

A Quick Explanation of why Inbound Links Matter to Google

The 2 students who set up Google in the first place decided they’d incorporate one of the central tenets of academic credibility into their search engine’s algorithm: citations from recognised authorities. (Academic papers always include references to the other papers from which the main arguments have been drawn – which gives the paper credibility and authority, as it shows the content is grounded in other people’s work and so lends it an air of reliability and veracity).

So Google incorporated a large element of measuring inbound links to a particular web page in order to determine whether it was deemed to be a “good” page about a particular topic. The theory being that these inbound links would serve as “citations” from the original page to the one being linked to – thus providing evidence of the quality of this page.

The basic principle enshrined in the algorithm in the early days of Google is the main reason why SEO experts focused so much of their time on link building. And it’s still something that you can capitalise on today – so long as you go about it the right way.

Here’s a rundown of the 3 best types of links you can generate for your site, that will be Google-safe and actually beneficial for traffic and sales:

Resource Page Links

Utilised in the very earliest days of the internet to help people find sites that are likely to be of interest to them, Resource page links are still a very valuable link to attain. When a site has been put together that focuses on a particular topic, it makes sense for their Resource page links to be related to that topic in some way. These are exactly the type of links that Google likes to see – contextual, relevant links that are given for the purpose of helping a site’s visitors, rather than simply to aid with another site’s rankings. (Though, of course, this rankings boost is a happy side effect of having this type of link to your site).

So where can you find these Resource pages?

A good start is to use Google itself. Searches such as:

“topic keyword” + “resource page”
“topic keyword” + “resources”
“topic keyword” + “links page”
“topic keyword” + “links”

will provide plenty of sites that feature the kind of links page we’re after – each of which will be related to the “topic keyword”. (So obviously you should ensure the “topic keyword” you search for is relevant to the content of your site – the one you want to obtain a link for).
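If you want to run these searches systematically, the query variations above can be generated with a few lines of code. This is just a sketch – the topic keyword “shark fishing” is the illustrative example from earlier in the post, not a recommendation.

```python
# Generate the resource-page search queries described above
# for a given topic keyword.

RESOURCE_PATTERNS = ["resource page", "resources", "links page", "links"]

def build_queries(topic_keyword):
    """Return the list of search strings for finding resource pages."""
    return [f'"{topic_keyword}" + "{pattern}"' for pattern in RESOURCE_PATTERNS]

for query in build_queries("shark fishing"):
    print(query)
```

Each string can then be pasted into Google (or fed to whatever rank-checking tool you already use) to build your prospect list.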

You should make a list of sites that appear suitable, checking each one to see whether you think it is of decent enough quality – something you should be able to determine simply from your gut instinct. ie if you find yourself thinking “this site is a bit rubbish”, it’s probably not one that you should try to get a link from.

Contact each of the sites on your list with an email that compliments the site owner on their useful resource and introduces your own site as something their visitors could benefit from seeing – so could they please update their Resources page with a link to your site? The ideal method for being successful in generating multiple links is to have developed some great content – my suggestion being a collection of great blog articles, at least one of which you can draw the site owner’s attention to – which is more likely to find favour when it comes to providing a link.

Resource Page Broken Links

An even more successful method for generating links is to point out to the site owner where a link they have on their Resources page is broken – ie because the site no longer exists or the URL has changed.

Informing them that they have a broken link which needs to be removed / updated, plus providing a new site they can link to, is generally going to find favour with most site owners, as it helps them out and keeps their site looking current.
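Finding those broken links can be partly automated. Here’s a minimal sketch using only the Python standard library: it pulls the outbound links from a resource page’s HTML, and leaves the actual “is it broken?” check as a pluggable function (so you can wire in `urllib` or any HTTP client of your choice – the checker shown in the usage note is a stand-in, not a real one).

```python
# Extract outbound links from a resource page's HTML, then filter
# them through a supplied "is this link broken?" checker.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every absolute <a> link on the page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and href.startswith("http"):
                self.links.append(href)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def find_broken(links, is_broken):
    """Return the links flagged as broken by the supplied checker."""
    return [link for link in links if is_broken(link)]
```

In practice `is_broken` would make an HTTP request and treat a 404 (or a connection error) as broken; the dead links it returns are your outreach opportunities.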

Editorial Links

The ultimate in Google safe links is to attract a link to your site simply because your content is so good that people genuinely want to link to it of their own accord. Anyone who’s tried any link building in the past may be sceptical of this approach, but it actually does work.

Site owners are keen to provide good content for their visitors, so being able to introduce them to something of real value – your quality content in the form of a blog post etc – is something they will be keen to do.

My experience here suggests that “topic keyword” bloggers are the most likely to want to link to your content, particularly if it’s complementary to something they have recently written about.

Local SEO – Optimising your Site for Local Searchers


One of the inevitable consequences of the rise of the smart phone is the increasing frequency of local searches. It stands to reason that, if you’re looking for something particular by performing a search on your mobile phone, it’s quite likely to be because you want the thing you’re looking for to be nearby (at least, nearby to where you are currently).

As well as this aspect, it’s a well recognised phenomenon that there have been an increasing number of searches based on a local area over the last 5 or so years, even amongst those people using desktop or laptop machines to perform their search.

So, if your business can be seen to have a local element – ie the majority of your customers are likely to come from the local area – you really need to be optimising your site to take advantage. Here’s how I recommend you go about it:

1) Name Address Phone Number (NAP)

You need to ensure your company name (or the name you trade under if you’re eg self employed, which could, of course, be your actual name), physical address and the phone number of your organisation is displayed on your site.

Some people suggest you should have each of these details on every page of your site – for instance in a footer – and I would certainly go along with that. It can also help in terms of enquiries: if people have read to the end of a page and then see your contact phone number, they may already have the thought of contacting you in their mind.

You can simply have the NAP info on your home page and Contact page if you wish, but I would recommend featuring it on every page if it fits in with your site design.

2) Google My Business

Google has an irritating habit of changing the name of its local business listings service on an ongoing basis (eg Google Places being a previous incarnation), but for now it seems to have settled on Google My Business as being the name it’s happy to use.

You should ensure that you have a Google My Business listing, and equally ensure that the Name Address and Phone number details recorded in it are identical to the ones you feature on your site.

Further populate this listing with info such as Opening Times and accepted methods of payment, as well as ensuring you are listed in a relevant category, to give Google as much info as you can that could help with your local search listing.

3) Don’t try to Fool Google

If your business doesn’t actually have a physical presence in a particular location (ie you don’t actually have a building / office / shop in the area you claim to), you are more than likely to be penalised for pretending you do. My advice is to only include NAP information for locations where you do have an actual physical presence. (Google is also wise to the fact that you may be listing yourself in eg a virtual office, when actually all that happens is mail gets redirected from there. Again, my advice would be to steer clear of this sort of thing).

4) Local Directory Listings

There are many internet directories that claim to have an influence on search engine rankings. However, only a small proportion of these will actually have any real bearing on where you might be listed in the results for a relevant local search.

I’ve found there are approximately 80-90 UK-based directories that can help your rankings in Google. You should ensure that you try to gain a listing in each of them, with some of the more well-known ones being:

192.com
Yell.com
118118.com
Scoot.co.uk
Thomsonlocal.com

As well as some lesser known sites, such as:

Tuggo.co.uk
UKsmallbusinessdirectory.co.uk
Near.co.uk
etc

As with your Google My Business Listing, you need to make sure the Name Address and Phone number information is identical in each (that is, identical to the NAP info on your site).
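Checking that consistency by eye across 80-90 directories is tedious, so here’s a rough sketch of automating it. The normalisation is deliberately crude (case, whitespace and punctuation only) – real listings may need more care with things like address abbreviations – and the example business details are made up for illustration.

```python
# Check that a directory listing's NAP details match the site's own,
# ignoring differences in case, spacing and punctuation.
import re

def normalise(value):
    """Reduce a NAP field to lowercase letters and digits only."""
    return re.sub(r"[^a-z0-9]", "", value.lower())

def nap_matches(reference, listing):
    """Compare a listing's name, address and phone against the site's."""
    return all(normalise(reference[key]) == normalise(listing.get(key, ""))
               for key in ("name", "address", "phone"))
```

Run your site’s NAP as the reference against each directory entry, and chase up any listing that comes back as a mismatch.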

5) Inbound Links

Not surprisingly, considering we’re talking about search engine optimisation, inbound links to your site play an important factor. As well as the local directory listings as mentioned above, you should try to generate multiple quality inbound links from other sources.

In order to ensure local relevance, I recommend looking at things such as your local Chamber of Commerce or other business groups, for example the BNI or similar networking organisations. An inbound link from one of these sites will help associate your business with the local area you’re targeting.

You can also look to get links from local clients – eg from their blog by asking them to mention they’ve just had some work done or bought something from you, with the blog post including a link to your site.

And you mustn’t forget that, just because we’re focusing on Local SEO, it is still SEO we’re talking about, so quality links are always going to be an important factor.

6) On Page Optimisation

With SEO in mind, obviously it makes sense to ensure your “on page” factors are optimised as well as possible. These will include such things as the metadata on the pages (Title tag, Description tag), as well as the visible text content that site visitors can see.

Incorporating the geographical areas within the text of these elements will help from a local optimisation perspective, especially if you include specific local information, such as referencing surrounding town names and landmarks that are specific to the local vicinity.
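A quick way to sanity-check those elements is a small script along these lines. The length limits used here are common rules of thumb for avoiding truncation in the results pages, not official Google figures, and the example business is invented.

```python
# Flag common local-SEO problems in a page's title and description.

def check_metadata(title, description, local_terms):
    """Return a list of issues found; an empty list means all checks passed."""
    issues = []
    if len(title) > 60:
        issues.append("title may be truncated in results")
    if len(description) > 160:
        issues.append("description may be truncated in results")
    text = (title + " " + description).lower()
    if not any(term.lower() in text for term in local_terms):
        issues.append("no local terms found in title or description")
    return issues
```

Feed it each page’s Title tag, Description tag and your list of target towns and landmarks, and fix whatever it flags.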

7) Reviews

You should try to get as many favourable reviews as possible in the major review sites, such as Trustpilot or Reviews.co.uk. You can encourage satisfied customers to write reviews on your behalf with a follow up email once they’ve completed their purchase.

The rule of thumb for 3rd party review sites is – the more good reviews the better, so long as they are genuine and from a site that Google is likely to recognise as an authority, rather than one simply set up to assist with a site’s rankings.

8) Social Media

Similar to getting good reviews, encouraging your customers to talk about you on social media sites such as Facebook and Twitter will help with your optimisation efforts. Especially useful is if the people discussing your business are located nearby to your premises, as that again helps associate you with the local area.

My History of Search Engine Marketing


I was chatting with a long term client the other day and we went through my own history with promoting websites. He remarked that my own involvement must somewhat mirror the overall history of the discipline, making “my history” almost equivalent to “the history” of search engine marketing!

Whilst I wouldn’t necessarily go as far as that, I do think he had a point. There aren’t many people in the UK who’ve concentrated on the same field for as long as I have; and, come to think of it, there can’t be that many people in the whole world! A fairly sobering thought in one way, but also quite a pleasing one in another.

So I’ve decided to run through what I’ve done in the search marketing area since I started all those years ago, by way of outlining how things have changed and yet in some respects remained the same.

1998 – Internet Evangelist

When I first started out, the majority of my time was spent in “evangelical” mode, trying to actually convince business owners that the internet was a worthwhile thing for them to be involved with. Seems ridiculous now, but I remember being told on many occasions that people were going to wait and see what happened before they committed to getting involved with the world wide web.

My own focus in the early days was on building websites – a natural consequence of businesses not yet having embraced the medium as being a useful sales and marketing tool – with several local businesses “benefitting” from my design expertise. I’ve put the word “benefitting” in speech marks, as my design skills left a little to be desired.

I actually started off using Microsoft FrontPage to build sites – a product which has been discontinued for almost 10 years now! My recollection of FrontPage 98 is that it was an extremely unwieldy piece of software that necessitated having 2 different screens open in order to generate anything resembling a web page. There was the FrontPage Editor element, which allowed you to generate code in what they laughably described as a WYSIWYG manner (What You See Is What You Get); this had to be linked up to the separate FrontPage Explorer – which was some kind of file management system. I never quite got my head around the reason why there had to be this separation, but I guess these were still the fairly early days of web site development.

There was also a free version of FrontPage that was available with limited functionality. I did use this for the odd alteration to a website due to it being easier to use – as I remember, it didn’t have the clumsy double interface system of the full software.

Suffice to say, though, that my early experiments with web design were based on my having a sales and marketing background, rather than one in design. Whilst I stand by this even today – ie that websites are a sales tool first and foremost and should be built for this purpose, rather than simply to “look nice” – the element that was sadly lacking in my earliest designs was any sense that they had been put together by someone with a “good eye”.

I had, however, determined that any business that had a web presence would need some means of their potential customers finding them. To that end, each of the sites I originally built had been designed with search engines in mind. I certainly wasn’t using the term “search engine optimisation” in those days, having decided that “web promotion” was a conveniently descriptive term for what later became known as SEO.

Interestingly, a little while after I embarked on my internet marketing career, two PhD students at Stanford University incorporated a business that they decided to call “Google”. Stick around for the remainder of this post and you just may find I discuss that particular organisation again at some point…

AltaVista – King of the Search Engines

You may be familiar with the famous poem, Ozymandias, by Shelley? Essentially, the theme of the poem (written in the style of a sonnet) is of the way history treats great rulers and the empires they oversee.

The most famous lines from the poem, often quoted, are:

‘My name is Ozymandias, king of kings:
Look on my works, ye Mighty, and despair!’

In the poem, these words are inscribed on a statue to the great emperor – contrasting how he believed himself to be extraordinarily important in the world, yet the modern day traveller who comes across the statue has never even heard of him.

It may well be that AltaVista didn’t consider itself to be the “king of kings” when it came to search – but my recollection is that it was certainly the main one we were all trying to get our sites listed with in the late 90s.

Actually, when I first looked into how to get my sites listed in the main search engines for relevant terms, there were loads of them all trying to compete in the space. Nobody had quite worked out how to capitalise on the idea of search in terms of making big money, but they all knew they wanted to get a bigger piece of the pie for whenever the inevitable monetization light bulb went off.

Some of the search engines I was dealing with at the time that you may not be familiar with, or if you did know them have almost certainly forgotten about, were:

Excite
Lycos
Infoseek
Northern Light
UK Search
AOL Search

But the main one in those days was undoubtedly AltaVista.

Due to the number of different search engines that were around – and the possibility that any one of them could eventually come out on top as being the “go to” search engine of choice for web surfers* – my job was complicated by the fact that there appeared to be different rules for each one when it came to achieving good rankings.

* surfing the net is a phrase that has fallen out of fashion, but was all the rage in those days!

This was when I first learned about a phrase that people with even a passing interest in SEO have come across – “metadata”, a literal translation for which is “data about data”. If you imagine a library (which is essentially what a search engine is, though its contents are websites rather than physical books), there will need to be a system for identifying where the books are located in order that people can find them easily.

Books will, therefore, have descriptions on their back covers, which outline what’s contained within the pages. A table of contents and index section further assist with locating specific items of information within the book itself. These elements – the description, table of contents and index – are examples of metadata, as they are literally data that is about other data. (In this case, for example, the table of contents providing data – information – about where to find the rest of the data – information – in the book’s pages).

The basic elements of metadata from a web page perspective – Title tag, Description tag, Keywords tag – have actually remained the same in terms of what their function is. My job was to try and work out which search engine required more emphasis on which of these elements in order to try and gain a higher ranking in that particular engine’s listings.

Obviously, as time has gone on, the emphasis on the importance of metadata has shifted. The Keywords tag, for example, is almost entirely worthless nowadays – having started out as being the “secret ingredient” that could help a site rise to the top.

The Title tag is still an important factor – though not as much as it used to be – and the Description tag has been in and out of favour throughout the last 17 years, but now has found a pretty stable value for itself as a kind of default snippet of information that can be used in both search results and social media references.

And there are other elements that have had an influence on search positioning over the years, including some of the more recognised ones such as heading tags (eg <h1>, <h2>), and alt tags (the information included to describe the content of an image for those who either can’t see it or are browsing with images turned off).

But ultimately, the fundamental issue that has dominated search engine optimisation since even the days of AltaVista is that success is determined by the visible content of a web page. (I’ll be returning to the complementary issue of link building in a future post).

Monetizing Search with Paid Listings

The history of Paid Search is one I’m not going to go into in great depth, save that I was familiar with the original Paid Search engine, GoTo.com, when it launched in 1998. It subsequently morphed into Overture.com, which was eventually bought by Yahoo.

While we’re on the subject, Yahoo was a slightly different beast from the other search engines, in that it started out as a Directory. There was an editorial review procedure before your site could become listed – something that site owners eventually had to pay for, which was Yahoo’s original attempt at generating revenue from its service.

(Certain free competitor directories attempted to knock Yahoo off its Number 1 Directory perch – eg DMOZ, a service provided by the Open Directory Project, which was taken into the ownership of AOL after it had bought Netscape).

By far the biggest game changer, though, was the introduction of Google AdWords (told you I’d probably get round to mentioning them again) – which went on to become the massive moneyspinner that allows Google to enjoy the kind of multi billion dollar revenues it has today.

The earliest iteration of AdWords that I used was on a pay per impression model and one where you could pay to be in a particular slot on the page – obviously everyone trying to be at the top – and stay there for a month for a specific sum. This worked out to be a bit of a bargain for some people – especially in the early days of the “online Viagra” markets, as paying for that slot was a lot cheaper than having to pay each time someone clicked an ad. (I should point out that, whilst I was involved in plenty of these types of sites for promotional activities, the law eventually changed and the online pharmacy industry became properly regulated around 10 years ago).

The big game changing factor for Google was its adoption of Overture’s basic concept of Pay Per Click. In the Overture model, anyone who wished to bid higher than the next highest bidder would end up at the top of the listings – a pure auction system.

What Google did, though, was to introduce the Ad Rank formula, which essentially rewarded the better performing ads with higher positions on the page, without those advertisers having to pay more than they wished to. (ie if your ad got more clicks, you would rise up the page and still not have to pay more than your competitors who had a lower Click Through Rate).
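The mechanism can be illustrated with a toy example. This is only a sketch of the idea – position decided by bid multiplied by a quality measure, rather than bid alone – and it stands in for quality with a bare click-through rate; the real Ad Rank formula has more inputs than this.

```python
# Toy illustration of the Ad Rank idea: rank ads by bid * CTR,
# so a strong ad can outrank a bigger spender.

def rank_ads(ads):
    """Sort ads highest-ranked first. Each ad is (name, bid, ctr)."""
    return sorted(ads, key=lambda ad: ad[1] * ad[2], reverse=True)

ads = [
    ("big spender", 2.00, 0.01),  # high bid, poorly performing ad
    ("good ad", 1.00, 0.05),      # half the bid, five times the CTR
]
```

Here “good ad” scores 0.05 against “big spender”’s 0.02, so the cheaper but better-performing ad takes the top slot – which is exactly the behaviour described above.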

So as well as SEO, PPC then became a very big part of the search marketing world from around 2003 onwards. Indeed, nowadays, with approximately 50% of all searches being performed on mobile phones, AdWords is increasingly important, as the organic (non-paid) listings are unlikely to even be seen on a mobile phone search results page, with the top listings all being AdWords.

I’ve been involved with the spending of hundreds of thousands of pounds on paid search over the years – perhaps most notably, when I ran the UK AdWords campaign for the launch of Microsoft Office 2007 – and have seen it develop from a relatively simple system into the enormously complex and sophisticated range of different services on offer in 2015. (Facebook and Twitter, for instance – still the new kids on the block in internet terms – both offer a PPC advertising service, with Facebook in particular providing a very targetable range of demographic settings).

Throughout that time, I’ve come to respect Pay Per Click advertising as the most sophisticated and controllable form of advertising yet devised. (A far cry from the days when advertisers would sit in a car near their billboard poster, counting the number of people who went past in order to determine how many “eyeballs” their advert was receiving!)

Red Herrings and Black Hats

Of course, with such a long history in search positioning, I’ve come across my fair share of “black hat” methods of trying to fool the search engines. (These methods supposedly get their name from the old TV westerns, where the “baddie” would usually be the one in the black hat, the good guy wearing a white hat).

And there have been plenty of “fool’s gold” magic bullets on offer over the years, whether they be doorway pages, random content generators, run of site links, article links, keyword stuffing etc etc.

But despite all the secret techniques, the solid foundation of what I was doing for SEO back in 1998 – solid metadata with quality visible content – is pretty much what Google still wants to see, which is why my clients are still being rewarded with good rankings and search traffic.

The Benefit of Longer Blog Posts


Having worked in internet marketing for many years, with most of that time spent studying and practising the art of search engine optimisation, I’m very familiar with the phrase “content is king”. This was the mantra of every SEO practitioner way back in the early 2000s, before everyone became obsessed with link building as a result of the dominance of Google. (One of Google’s main criteria for assuming a web page is relevant to a particular search query is the quantity and quality of the inbound hyperlinks that the page has attracted).

In recent years, the focus on content has made something of a comeback – especially following the Armageddon-style devaluing of particular types of links that occurred with the various alterations to Google’s algorithm known as the “Penguin” updates. So we now see a vast array of “content marketing experts” who recommend the development of quality content that can then be promoted around the internet. (The primary purpose being, of course, to attract more inbound links in order to satisfy Google in this regard).

Indeed, web pages that have attracted a large number of inbound links but don’t appear to have much going on in the way of text content are certainly well represented when it comes to Google rankings. A good example of this is the site www.xe.com – which always appears well ranked for relevant phrases such as “currency exchange”, “currency converter” etc – though there are fewer than 100 words of text on its home page.

So you wouldn’t be off the mark in assuming that links are still “where it’s at” when it comes to search engine optimisation for good rankings. However, it is also the case that quality text content is a factor in determining a page’s ranking position, as Google is keen to reward a searcher with something of value when they click on one of its search results. Given that searches are performed using words, it is thus only natural that the pages most likely to be recommended by Google are those that feature content that includes and expands upon the words in the search phrase being used.

In the olden days (late 90s, early 2000s), keyword stuffing of web pages was a prevalent method of achieving good ranking results, with many search engine specialists recommending a particular “keyword density” in a page’s text content in order to convince Google that your web page was a good match for the search phrase or phrases being targeted.

And certainly it is still true that featuring a specific set of words in a particular order is more likely to see your page being featured in the results for a search that uses those same words in the same order. But nowadays there is much more to it than that, with keyword density and keyword stuffing essentially being consigned to the past.

Which all adds up to providing a very good reason to make your blog posts longer than what appears to be the de facto standard of around 300 words. Writing about a particular subject in an expanded fashion will definitely assist with your search engine placement efforts, so it should certainly be something you consider when putting posts together.

And as an SEO expert myself, I have no problem advising people to write longer blog posts for that reason alone.

There is another very good reason to write longer posts (that just so happens to also assist with good rankings), in that longer posts are much more likely to be read for a longer time period than shorter posts. Not exactly rocket science, this one, as clearly it takes longer to read something of 1000 words than it does to read something of a couple of hundred words. The length of time someone stays on your page having arrived at it via a Google search is one of the known elements for helping a page to climb the rankings, so encouraging people to stay on the page for longer is another beneficial result from an SEO perspective.

But it’s not just good for rankings to encourage people to stay on your site for longer, it’s also good for conversions, as you can get your message across multiple times in the same post. This should help to persuade the people who are reading the post that your message is a worthwhile one, which is obviously the whole point of saying something in a blog post in the first place.

My recommendation is to aim for posts of 800+ words. So a post of 1000 words – such as this one – is not only a good bet, but it also appeals to my liking for rounded figures. (1000 words thus being a more pleasing length than 800!).

Even better still is a post of greater than 1000 words. (The entry level for my ongoing search engine optimisation service actually includes an 800-2000 word blog post each month, alongside Links Outreach, which is outside the scope of this particular article, but is no doubt something I’ll be returning to in the future).

There have been several studies of Google’s top ten rankings (though I’ll be returning to the idea of whether there even is such a thing as a “top ten ranking” at a later stage!) that indicate blog posts of 800-2000 words dominate the results pages. And my own research does back up these findings a little. I wouldn’t go so far as to say it is essential to write posts of this length, though, as I have also been responsible for creating many blog posts of 1000 words – and sometimes less – that enjoy similar rankings success.

To sum up, my suggestion is to write 800-2000 word blog posts wherever possible – both for SEO purposes and to drive your message home more forcefully than a shorter post could. But if you can only think of enough to say to fill a 300-400 word post, that’s fine, too. As there’s nothing more likely to make your site visitors bounce straight away from your page than to fill it with padding and waffle. (Another sure way to get Google to downgrade your page in its listings, too).
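If you want a mechanical check on your drafts against that 800-2000 word range, something like this will do. It’s a simple sketch – the thresholds are just the figures suggested above, and a naive whitespace split is close enough for a word count.

```python
# Check a draft post's length against the suggested 800-2000 word range.

def length_verdict(post_text, minimum=800, maximum=2000):
    """Return a short verdict on the draft's word count."""
    words = len(post_text.split())
    if words < minimum:
        return f"{words} words: consider expanding (but never with padding)"
    if words > maximum:
        return f"{words} words: consider splitting into more than one post"
    return f"{words} words: within the suggested range"
```

Run it on a draft before publishing; if it comes back short and you genuinely have nothing more to say, publish the shorter post rather than pad it out.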

So to revise the well-known phrase “content is King”, I’d add in an extra word, which is essentially what I’m promoting here – “quality content is King”. If you can write 800-2000 words of high quality content, your blog posts will always achieve their goals.

Facebook Advertising Boom Time


If you’re a business owner, you’ve no doubt dabbled in Google AdWords and maybe even old fashioned banner advertising.

And if your experience is like most people’s, you’ve either decided it isn’t for you or slashed the amount you spend on it as you just can’t seem to make it work. (That is, if you aren’t using my PPC services, of course!).

But you may not have come across the concept of Facebook advertising. And if you have, you’ve probably found it a bewildering experience with all the different options involved, so have probably left it to one side for the time being while you concentrate on other ways to promote your business.

Well that would be a shame. As Facebook advertising is “the new gold rush” when it comes to driving quality, targeted visitors to your site – similar to the way Google AdWords operated 10 years or so ago.

With extremely affordable traffic and a fantastic variety of targeting options, you need to be investigating Facebook advertising now, before your competitors catch on to how successful it can be.

Check out my Facebook advertising service for more info and to see how you can benefit from this latest promotional goldmine.

Social Media Customer Feedback

In The Times today (always a good source of blog post material – see this post on Getting Ideas for Blog Posts for more), there was a pullout Raconteur section about “Brand & Reputation”.

As well as many other elements of promoting your brand and your organisation’s reputation online, the main thrust was to ensure you listen to your customers and act on what they’re saying, in order to improve your business and keep customers happy.

Now, in some senses, I agree that there needs to be some monitoring of what people are saying about you online. You don’t, for instance, want to ignore a groundswell of opinion that could damage your chances of attracting new customers – eg if people are scathing about your services or somehow disappointed with your products.

However, the idea that you need to be constantly scouring Twitter and Facebook to refute anything you think could be detrimental is something that’s really only come from a group of people calling themselves “online reputation managers” or similar. The facts in the Raconteur pullout speak for themselves – “…less than 10% of brand conversations happen online. The majority of conversations about brands continue to happen in the real world, just as they always have…” Hardly a ringing endorsement for the idea that social media is the be-all and end-all of promoting your brand – and remember, this is in a pullout aimed at promoting the idea of Brand and Reputation management.

So my own suggestion is that you should certainly pay attention to what people are saying – and you should definitely take the time to respond to direct questions on social media within a reasonable time frame (ie not just once a week) – but don’t obsess over it to the point that it takes you away from potentially more valuable activities that promote your brand through word of mouth – just like in the “olden days”!

Google Mobile Search Algorithm Update

Google recently announced a change to its mobile search algorithm that affects ALL websites, not just mobile sites:

http://googlewebmastercentral.blogspot.co.uk/2015/02/finding-more-mobile-friendly-search.html

Essentially, what it means is that, from April 21st this year, Google will be downgrading sites in the search results on mobile devices (ie for anyone searching Google on a mobile phone) if they are not “mobile friendly” – and, correspondingly, upgrading mobile-friendly sites in those same results.

So if your site isn’t suitable – in Google’s opinion – to be viewed on a mobile device, its chances of appearing prominently in Google’s mobile search results will be severely reduced. Whilst you may think this isn’t so important for you, as most of your traffic comes from conventional desktop / laptop machines, you should bear in mind that more and more people are using tablets for their general access to the internet, and that accessing the internet on a mobile phone accounts for an increasing share of all internet traffic. So you will be leaving a large proportion of internet users out of your potential market if you don’t have a “mobile friendly” site.

How do I Know if my Site is Mobile Friendly?

Google have provided an online tool for checking your site to determine how mobile friendly it is:

https://www.google.com/webmasters/tools/mobile-friendly/

They have also produced a set of guidelines that you should understand in order to make sure your site passes their test:

https://developers.google.com/webmasters/mobile-sites/mobile-seo/

As well as catering for different screen sizes, it should be noted that their guidelines cover issues such as ensuring clickable links are not placed too close together.

Mobile Version of Site or Responsive Design?

There are currently 2 options for ensuring your site is mobile friendly:

1) Set up a separate mobile version of the site, with device recognition capability so the correct version of the site is sent to the correct device. (ie people viewing on a desktop PC will see the “normal” version of the site, people viewing on a tablet or a mobile phone will see the “mobile” version of the site).

2) Make the site “responsive”, such that it changes design and layout based on the device being used to view it. (The page you’re currently reading now is “responsive”, as it adjusts itself to fit the screen size you’re using, rather than sending you to a separate mobile version).
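To make option 1 concrete, the device recognition step usually amounts to inspecting the User-Agent header of each incoming request. Here is a minimal, purely illustrative Python sketch – the token list is my own and far from exhaustive, and production sites generally rely on a maintained device-detection library instead:

```python
# Tokens that commonly appear in mobile browsers' User-Agent strings.
# Illustrative only: real-world detection needs a maintained library,
# as User-Agent formats vary widely and change over time.
MOBILE_TOKENS = ("mobile", "android", "iphone", "ipod", "blackberry", "opera mini")

def is_mobile(user_agent: str) -> bool:
    """Crude check: does the User-Agent string mention a mobile token?"""
    ua = user_agent.lower()
    return any(token in ua for token in MOBILE_TOKENS)

def choose_version(user_agent: str) -> str:
    """Decide which version of the site to serve for this request."""
    return "mobile" if is_mobile(user_agent) else "desktop"
```

Option 2 (responsive design) avoids this server-side branching entirely: a single set of pages adapts itself to the screen via CSS, which is part of why it tends to be the simpler configuration to maintain.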

My Recommendation

Whilst Google is currently saying it is fine to have 2 versions of your site aimed at different devices, my recommendation is that you aim for your site being responsive, rather than having different versions.

My reasoning is based on years of experience with Google and the fact that they are generally opposed to having different versions of the same site in their index.

Certainly, if you don’t wish to make your site fully responsive just yet – for reasons of eg budget, future redesign schedule – then you should be OK in the short term to simply have a mobile version of the site that works alongside the main site.

However, for long term peace of mind, I’d definitely recommend making your site responsive, in order to pre-empt the likely negative effect on all rankings (mobile and non-mobile) that I predict for the future for non-responsive sites.

One thing to bear in mind here is that, not only will your site need to adjust its layout to fit the smaller device screens, it should also make sense from a marketing design point of view – ie you should ensure your name, contact details, main sales message etc are visible on the home screen for anyone viewing your site, rather than being shunted away as a result of the responsive design parameters.

Get in touch to see how I can help optimise your website in line with the updated algorithm – Contact

Update your Blog Frequently

One of the key issues with generating interest from your content is to ensure you write frequently. I’m not suggesting you need to post a new article every day – though this would be a great thing to be doing, quality permitting, of course – but at least once a week, and preferably 2-3 times a week, is a great foundation for building up content on your site. (And yes, I’m aware I don’t always follow this rule on my own blog!).

Check out my earlier series of posts on generating blog content ideas for where to get the inspiration for your frequent posts.

There are 3 main reasons you should update your blog frequently:

1) It makes the blog – and thus the company – look active and dynamic, thus helping site visitors warm to you and your services.

2) It helps Google to identify your blog as being active and dynamic, so encouraging them to visit more often and potentially give your site better rankings for multiple phrases.

3) It gives you more content that can be promoted to others through link outreach – generating more inbound links and more interest from around the internet in what you have to say.

As mentioned above, the quality of the content is obviously very important. But the simple rule for content development is: quality + frequency = a firm foundation for content marketing success.

Write with One Person in Mind

When you’re writing your blog posts or things like email newsletters, a great little tip is to “write with one person in mind”.

You can approach this in 2 ways:

1) Write with a specific person in mind

This could be someone you know – your mum, sister, friend, boss etc – or someone you simply know of – the industry leading CEO, local supermarket manager, George Clooney etc. Just make sure you picture them reacting to what you’re saying, and keep the blog personal and interesting as a result.

2) Write as though to a single person who represents a type

This is generally the more common approach: companies determine who their target audience is likely to be, then devise a persona they can talk to. If you adopt this method, make sure you put yourself in their shoes and try to come up with the questions they’d be asking if you were chatting to them in the bar after work.

The idea behind writing to one person is that you come across as more personable and approachable, thus creating more engaging and readable posts that don’t put people off for being too corporate.

Email – the Internet’s Poor Relation?

With all the fuss about social media over the last few years, you’d think the original poster boy for the internet – email – would be languishing in poverty somewhere, cradling a bottle of gin and bleating about how good everything used to be in the old days.

However, far from it. Email is still one of the most useful and effective means of communicating with your customers and potential customers that you can use – despite what the Social Media Marketers might want to have you believe!

Certainly people spend more time on Facebook and the like nowadays, with inboxes often regarded as attracting little but viagra spam. But actually, there is more business done and more effective communication carried out via email than there ever could be via social media. Sure, social media is great for instant communication and building up an audience, but my experience suggests that 90%+ of people who become clients will contact you first by email, with quite a few still even using the phone.

So don’t be too quick to write off the value of the internet’s forgotten marketing tool – email is here to stay.