4th June, 2013

Let’s start off with a brief history of search, talk about pandas, penguins and why you should care.  I will highlight things you should be doing, things you should avoid, and I will leave you with some tools to help.

A Google search in 2001

A Google search in 2001 would not have given you any images, there would have been no video, no localisation… it literally would have been just ten blue links.

A Google search today

If you do the same search now… you get multiple things, you have options for videos, options for maps, options for news, and lots more besides.

For example – change your location on the left-hand side and all of a sudden maps come into play.  Why? Because whatever it is you’re searching for has a physical presence in that neck of the woods.

What search engines are trying to do is almost mind-reading: they are trying to understand your intent, what it is that you really want.

Google wants to know what the user really, really wants based on that search query, and whether the site content is compelling enough to care about.  Should Google actually care about what is on that site, given what the user is looking for? Is it relevant?

That is what Google is trying to figure out.

Google’s major updates

Google launched in 1998, and 2002 brought the first documented update.  No name was given; Google likes to name its algorithm changes now, but back then it did not.

Fast forward 6 or 7 years and you can see the activity pick up: 2009, 2010, 2011, 2012.  In 2011 alone there were approximately 480 unique tweaks to the algorithm; most of them were minor, and most went unnoticed.

There were 21 major changes to the algorithm in 2011, which was also the year of the Panda.  The next major update came in 2012, the year of the Penguin!

A few definitions:

  1. Index – If I say Google has indexed 50 pages on your site, that means there are 50 pages of your website that Google thinks are worthy of bringing up in some type of keyword query, based on the content you have.
  2. Crawl – You will hear SEO companies use the term crawl; some say spider or spidering, but for the most part it is crawl. Crawling describes the process by which a search engine visits websites, finding pages, URLs and content that it places in its index.
  3. Bot – As in robot, or the term Googlebot.  When I say the search engines crawl, there is not actually a human being sitting behind a computer looking at websites and trying to determine what is valuable.  It is a machine algorithm, a bot, that is out there doing the looking.

Let us talk about Panda

It launched in early 2011, with international rollouts following.  It is also worth noting that in the following months there were dozens of revisions and updates.

It was actually named after a Google engineer, Navneet Panda, and it represented a major change to the way Google looks at sites.

What it tries to do in short is combat spam. Many sites saw a major decrease in the volume of search traffic that they got… a major decrease!

The major Google Panda ramifications

Number one – duplicate content

If you have a hundred pages on your site and ten of those pages have pretty much the same content, maybe with the exception of a couple of keywords or a few minor tweaks, that is duplicate content!

What happens there is that Google sees duplicate content.  Remember, it is a machine algorithm, a bot, that is crawling your site. If it sees duplicate content, how does it know which page is the most important and which page should be placed in the index?

Number two – Same meta data on multiple pages

Meta data is basically information about your website pages.

It is what the search engine shows when you type in a query and you see the two lines below a link.  If you name all of your page titles the same, or all of your meta descriptions are identical, that is bad. How does Google know how to annotate and index these pages correctly?
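To make this concrete, here is a sketch of what unique meta data for two pages might look like (the page names and wording are hypothetical examples, not a template):

```html
<!-- Page one: its own title and its own description -->
<title>Red Widgets – Acme Widget Co</title>
<meta name="description" content="Hand-built red widgets with free UK delivery.">

<!-- Page two: a different title and a different description -->
<title>Blue Widgets – Acme Widget Co</title>
<meta name="description" content="Our range of blue widgets for industrial use.">
```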

Number Three – If you have a lot of unoriginal content

This could involve copying content from other sites.  Content that looks like it has not been generated by you and is not original can be a black mark against your website!

Number 4 – Crawl errors

If Google finds it difficult to crawl your website because it encounters a lot of errors it stands to reason that your site is not going to get indexed as well as it otherwise would.

Now let’s talk about Penguin


Penguin also tries to reduce spam. Notice any recurring theme here? Google hates spam, we all hate spam, search engines are no exception. They are trying to rid the world of spam one update at a time.

What is a no-no for Penguin…

Keyword stuffing – keyword stuffing is just like it sounds. If you have a page with, say, a thousand words on it and you stuff it full of 200 keywords, or even 50, that is keyword stuffing.

Link farms – link farms are a huge no-no. A link farm is a group of sites that serve no purpose other than to link back and forth to each other.

Sites that came under the hammer of Penguin were sites with lots of unoriginal content – content they did not create but scraped from somewhere else – and with a lot of outbound links pointing to other sites.

How to fix Panda and Penguin problems

I will talk through some things that are pretty simple, that you can actually do today.  I will also talk through a couple of things that are a bit more complex, which you might need some help with.

Google webmaster tools

Every SEO expert will tell you this! I know it is simple, but I find it incredible just how many site owners have no idea what Webmaster Tools is.

Basically it is Google’s view of your site – your chance to see exactly how Google views your website. You can discover the links Google sees coming into your site and the traffic you are getting from various search queries. You can also provide feedback and talk to Google about your site.

Just go to Google.com/webmastertools.  If you do not have Webmaster Tools set up, that is the first easy, simple thing to do.

WooRank gives you the good and the bad

Another tool I like is Woorank.com.  Enter your own or a competitor’s website and it will give you one free report per week per IP address.

It gives you a snapshot in a very easily digestible format.  It tells you what you are doing well and puts a red flag against all the things that need fixing!

Keyword research

In a sense you could almost say this is the first thing you should do, prior even to Webmaster Tools, but I put it second only because it is amazing how many people do not have Webmaster Tools.

Keyword research is free to do. The easiest way to do it is just go to Google and type “keyword” and this will bring up the external Google Keyword tool.

Type in a few keywords that you think are applicable or enter your website URL and it will give you back volumes of how many people are actually searching on those keywords.

Make an effort to understand the broad and exact match definitions of your keywords! Another question I get asked a lot: “global” means search volume worldwide, while “local” means search volume in your own country – here, the UK.

You might be trying to gain success on a keyword only to find out that no one is actually searching on it.  You may get to the number 1 spot in Google, but if only five people in the world actually search on that term, it will not help. That is okay if you are selling a product where one sale makes you £1 million of profit and you only need one sale… but this is rarely the case!

Get some more suggestions

Another tool I like is a tool called ubersuggest.org

Just type in a known keyword and you’ll see lots more options as it goes all the way through the alphabet.

A word cloud of suggestions

Another simple keyword research tool is wordle.net.

It is totally free: copy and paste a block of text and you get a word cloud.  If you have content in a Word doc or a text file, place that text in there and it will give you a word cloud.  It offers plenty of options and looks great, but there is also a practical application… it can tell you if you are overemphasising a word… or not emphasising it enough!

You could always use common sense

Another simple tool is yourself! Ask yourself a simple question: how do your customers think?

The cardinal sin of SEO, and of website usability and navigation, is organising your website and your pages the way you and your company think.

This is generally bad because your customers and your prospects most often do not think like the company does.  They have their own different mindset.

Think about these questions:  Do customers come to you looking for a solution?   Do they have a problem they are trying to solve?  Maybe in some cases they already know they need something your product does and they are just comparing you with your competitors.

Think about how your customers think, that is vital when it comes to keyword research.

Is your website structured?

Another simple step – I say simple, though it is not always – is establishing a very clear site hierarchy. The tool for this is a whiteboard or a piece of paper. Take a look at your site from a very objective point of view and ask: is it very clear how my site is structured?

Is it structured based on solutions?  Is it based on products? Is it based on some kind of geographic territory?

I am not advocating one direction versus another; just have a very clear site hierarchy, because if it is clear to you it will also be clear to the search engines.

Remember, Google is a machine. It uses a bot to crawl your site, and when it places your pages into its index it needs clarity: it needs to be very clear how your site is organised and what that hierarchy looks like.

Understanding images

Images are awesome from a usability standpoint.

The web is a very visual medium, so images are important.

You should alt-tag your images – the alt tag is set when you add the image to the page.  Your alt tag will generally contain the keyword that you want the image to be found for.

You also need to describe an image by putting some type of descriptive text right below or next to the image!
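As an illustration, the alt tag and the descriptive text look something like this in the page’s HTML (the filename and wording are hypothetical):

```html
<!-- The alt text describes the image for search engines and for screen readers -->
<img src="red-widget.jpg" alt="Acme red widget, side view">
<p>The Acme red widget, shown from the side.</p>
```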

Search Engines’ Mindset

Search engines absolutely love market educators. If you are selling something – and nearly everyone is, in some way or another – you have a perfect opportunity!  What does your product or service do? Who does it help?  What are some best practices?

It is amazing how much love you will get back if you actually educate your marketplace. So have that mindset: when you are thinking about creating content, do not think just about selling and pitching your product.

Let us move on to stuff that is a little bit more complex

Consider Page speed

Everyone hates a slow site… I do, you probably do and search engines definitely do!

If the search engines find that your site loads too slowly, they are going to give you a little black mark, because they recognise that the majority of people out there are pretty impatient!

If a site takes 20 to 25 seconds to load, people are going to give up.  The best place to check your page speed is straight from Google at code.google.com/speed, where you can enter any URL.

It gives you some suggestions and gives you back a score. If you can get 60, 65 or 70 out of a hundred you are in pretty good shape.

If you run your site through this test and it comes back 45 to 50, you have some issues. Maybe your images are not optimised for fast loading, maybe you have some funky code on your site, or maybe you have issues with your CMS… which I will get to in a minute!  For example, maybe you are using some plug-ins that take too long to load.

Tell Google what to look at

Robots.txt – remember I told you it is a bot looking at your site? This is a great opportunity to tell the search engines what you want them to see and what you do not want them to see. Think about your website: there are probably pages you don’t want indexed.

You do not want them necessarily to show up in a search query. Think about pages like a download thank you page, it has no real value, it only gets seen after somebody actually downloads something.

You do not want the search engines to see that, and they won’t, as long as you tell them. If you have a well-kept robots.txt file telling the search engines what you do not want them to see, they will follow it, and you will be rewarded, because from a search engine’s standpoint this is about quality.

They are looking at whether you are indexing quality pages. If the site has a hundred pages and you are telling them there are 50 of those pages you do not want indexed – okay, great, because the 50 they are indexing are of high quality. You will be rewarded for that. Quantity does not matter as much as quality.

Creating your robots.txt file is pretty simple. It is literally a text file you can write in Notepad: you type in Disallow, then a colon, then the URL path. That is all you are doing.

Then it goes in the root directory: whatever your domain is, slash robots.txt. Go to a site that you think is doing well in your field and is well optimised. Type in their domain and add /robots.txt after it, and you will see their robots.txt file if they have one – and they should if they are well optimised.
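For example, a minimal robots.txt might look like this (the paths here are hypothetical – use your own):

```text
# robots.txt – lives at the root of your domain, e.g. www.example.com/robots.txt
# The * means these rules apply to every bot
User-agent: *
Disallow: /thank-you/
Disallow: /admin/
```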

Let’s Discuss Content Again

A little bit more complex – asking yourself about your content…

Think about the content on your site. Ask some questions. Does this content really benefit my users or am I just putting it out there to get links? Am I putting it out there to get search traffic? Why am I putting it out there? What benefit does it have? Does it answer their questions? Does it help them through their problems? Does it help them see the solution?  Would I still do this if search engines did not exist?

Ask yourself those questions. If you can give yourself an honest answer to those questions and say yes to all of them… then I think you should be in pretty good shape.

Here are a few traps you should avoid like the plague…

Keyword stuffing – we talked about this already. I usually challenge content writers who try to sprinkle keywords into inappropriate places… Just ask yourself a simple question: what grade would my grammar teacher give me?

Admittedly years ago, to optimise for search you had to write things a little bit odd, but not so much anymore. The search engines have gotten a lot better about looking at content and how content is structured; specifically how sentences are structured. If it looks to them like you are just putting keywords in there, they are going to give you a little black mark for that.

Think about the readability of your content.

When you think about search engines, I advise people to keep two things in mind: honest mistakes and conscious deception. Search engines are a lot like parents: they accept genuine mistakes and allow you to learn from them, but they hate being fooled or deceived.

From the point of view of a search engine: if you make a mistake, it’s fine, but if you lie to me and try to deceive me, you are in big trouble!  Search engines do not like that, they do not like it when they think you are trying to fool them. Even if you aren’t, if they think you are trying to fool them, they are going to give you a big black mark.

There are a lot of smart people who work behind the scenes devising how search engines work.  They have seen the tricks, you might be able to fool them for a little while but when you do get caught it is not going to be fun!

Let’s get more complicated

Some of the stuff coming next you maybe shouldn’t attempt, unless you really know what you are doing.

Bad Link Fixer

In the last few months Google launched a new tool called the disavow tool.

In short… it is a bad link fixer. We talked about links and link farms; you cannot always control who links to your site. You might have a long history of some really awful sites linking to your website – not your fault. For someone to link to your site, all they have to do is grab your address and put half a line of HTML code on their own site.

If you cannot get those bad links, which are causing you problems, removed, there is actually a tool within Webmaster Tools that can do it for you. I would say use it with caution, because once it is done it cannot be undone!

I would, in fact, try other methods first… like calling them and asking them to remove the link to your site.
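For reference, the file you upload to the disavow tool is plain text: one URL per line, or a domain: line to disavow a whole site, with # lines as comments. The sites below are hypothetical examples:

```text
# Links we asked to have removed – no response from the site owners
domain:spammy-link-farm-example.com
http://another-bad-site-example.com/page-linking-to-us.html
```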

Has your website been hacked?


You may have been hacked. Sites get hacked every day for a variety of reasons; the site may not be secure in its FTP location, or maybe you are using plug-ins that are easily hacked.

If you have been hacked, the best thing you can do is consult an expert. These are guys who do this for a living; they have seen hacked sites before and they can help you get out of the woods.

If you are using a CMS platform – and by the way, I strongly recommend a CMS platform – bear in mind that not all of them are created equal. The biggest things to look for in a CMS platform are plug-ins, modules and how URLs are structured.

That brings up another issue with plug-ins: there are about 15 to 20 that I typically recommend people use on WordPress. They are ones that have been around for a long time; they are stable, a lot of people have used them, they have plenty of four, four-and-a-half and five star reviews, and they are documented and well supported.

Sometimes the challenge with plug-ins is that a developer with some free time will build a really, really cool plug-in and then take off for three months, or get a real job, and can no longer support it.

We all know Content is King

Content assessment. I am going to use an overused phrase here: content is king. But, it is overused for a very good reason; it really is king. If you have gotten one thing from this article so far, it should be that Google hates spam. They want good quality content, they love market educators and they want to believe that you are creating content that people are actually going to care about.

Are you talking about yourself too much?

Here is a tool you can use yourself; it is a tool called ‘we we monitor’…  futurenowinc.com/wewe

Enter your URL and its algorithm analyses whether you are talking too much about yourself.

Are you talking too much about your own product and how great you are? You need to balance that out by referring to the problems your users might be having, the solutions they might need and the challenges they might face. If you get back a low score, take a hard look at your content. The fix might be as simple as hiring not an SEO person but a good writer!

Hire a good writer who can actually come in and understand your audience, understand what their challenges are and what their needs are and write content for them.

Remember when we talked about keyword stuffing? Well, guess what? If you are using too many keywords on a particular page, this kind of tool can help you out. A good general rule is no more than four percent keyword density – somewhere between two and a half and four percent; three and a half percent is fine. If you run the tool for the keyword “security software”, for example, and it comes back at eight percent, you are filling eight percent of your page with that keyword, and that is a little too much. Too little is bad and too much is bad: if you are using a keyword once, or not at all, and you are trying to rank for it, well, good luck. Either way, the tool gives you useful guidance.
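As a rough sketch of the percentage calculation described above (an illustration of the idea, not any particular tool’s actual algorithm):

```python
def keyword_density(text, keyword):
    """Return the percentage of words in `text` taken up by `keyword`.

    A multi-word keyword such as "security software" counts each of its
    words, so two occurrences in a 100-word page give 4 / 100 = 4%.
    """
    words = text.lower().split()
    kw = keyword.lower().split()
    n = len(kw)
    # Slide a window of n words across the page and count exact matches.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw)
    if not words:
        return 0.0
    return 100.0 * hits * n / len(words)
```

On a thousand-word page, the two-and-a-half to four percent range above works out at roughly 25 to 40 keyword-words in total.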

Author tag – what is that?

The author tag is basically where you raise your hand and say: yes, I am the author of this content, it is mine, I own it. This is useful when it comes to copied content. Think about the times content gets copied and pasted onto another site. In many cases the thieves who copy content do it so quickly that they forget to take out certain things, like tags. In a way, it is a way to catch the crook. If you are doing a lot of blog posting, or writing a lot of articles about your specific market, use the author tag on your site content. The code is pretty simple, really; any HTML developer can do it.
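At the time of writing, authorship works by linking your content to your Google profile with rel="author"; the profile URL and name below are placeholders:

```html
<!-- In the page content or byline: claim authorship of the page -->
<a href="https://plus.google.com/00000000000000000000" rel="author">Jane Author</a>
```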

Google is a Machine

Remember that Google is a machine, making all the decisions about your website.

You need to be very literal and tick all of the right boxes.  There is no human using their judgement as to whether you are on the up-and-up!  Tick all of the boxes to show the machine that you have everything covered; this gives the machine more confidence that you are the real deal!

