Google Updates and How to Survive Them

Fast forward 15 years, and ranking in Google has become extremely competitive and considerably more complex. Simply put, everybody wants to be in Google. Google is fighting to keep its search engine relevant and must constantly evolve to continue delivering relevant results to users. This hasn't been without its challenges. Just as with keyword stuffing, webmasters eventually caught on to another way of gaming the system: pointing as many links as possible, all with the same anchor text, at a page.

 

This created another loophole for spammers to exploit. And it wasn't just spammers: in many cases, well-meaning marketers and business owners used this tactic to achieve high rankings in the search results.

 

Then along came a new Google update in 2018. Google punished sites with a suspicious number of links carrying the same anchor text pointing to a page, in some cases completely delisting them from the search results. Many businesses that relied on search engine traffic lost all of their sales literally overnight, just because Google believed sites with hundreds of links containing just one phrase didn't acquire those links naturally. Google saw this as a solid indicator that the site owner could be gaming the system.

 

If you find these changes alarming, don't be. How to recover from these changes, and how to avoid being penalized by new updates, is covered in later blogs. In the short history of Google's major updates, we can find two powerful lessons for achieving top rankings in Google.

1. If you want to stay at the top of Google, never rely on one tactic.

 

2. Always ensure your search engine strategies rely on SEO best practices.

Authority, Trust and Relevance: Three Powerful SEO Strategies Explained

 

Eric Schmidt, the former CEO of Google, once reported that Google considered over 200 factors to determine which sites rank higher in the results.

 

Today, Google weighs well over 200 factors. It assesses how many links are pointing to your site, how trustworthy those linking sites are, how many social mentions your brand has, how relevant your page is, how old your site is, how fast your site loads… and the list goes on.

Does this mean it's impossible or difficult to get top rankings in Google?

Nope. In fact, you can have the advantage.

 

Google’s algorithm is complex, but you don’t have to be a rocket scientist to understand how it works. In fact, it can be ridiculously simple if you remember just three principles. With these three principles, you can determine why one site ranks higher than another, or discover what you have to do to push your site higher than a competitor. These three principles summarize what Google is focusing on in its algorithm now, and they are the most powerful strategies SEO professionals use to gain rankings.

 

The three key principles are Trust, Authority and Relevance.

 

Trust

Trust is at the very core of Google’s major changes and updates over the past several years. Google wants to keep poor-quality, untrustworthy sites out of the search results and keep high-quality, legitimate sites at the top. If your site has high-quality content and backlinks from reputable sources, it is more likely to be considered a trustworthy source, and more likely to rank higher in the search results.

 

Authority

Previously the most popular SEO strategy, authority is still powerful, but it is now best used in tandem with the other two principles. Authority is your site’s overall strength in your market, and it is almost a numbers game. For example, if your site has one thousand social media followers and backlinks, and your competitors only have fifty, you’re probably going to rank higher.

 

Relevance

Google looks at the contextual relevance of a site and rewards relevant sites with higher rankings. This levels the playing field a bit, and might explain why a niche site or local business can often rank higher than a Wikipedia article. You can use this to your advantage by bulking out your site with relevant content, and by using the on-page SEO techniques described in later blogs to give Google a nudge and show that your site is relevant to your market.

 

You can rank higher with fewer links by focusing on building links from relevant sites. Increasing relevance like this is a powerful strategy and can lead to high rankings in competitive areas.

 


How Google Ranks Sites Now: Google’s Top Ranking Factors Revealed

You may have wondered whether you can find out the exact factors Google uses in its algorithm.

 

Fortunately, there are a handful of industry leaders who have figured it out, and regularly publish their findings on the Internet. With these publications, you can get a working knowledge of what factors Google uses to rank sites. These surveys are typically updated every second year, but these factors don’t change often, so you can use them to your advantage by knowing which areas to focus on.

 

Here’s a short list of some of the strongest factors found in sites ranking in the top 10 search results, in the most recent study by Search Metrics:

  •  Overall content relevance.
  •  Click-through rate.
  •  Time on site.
  •  HTTPS (security certificate installed on the site).
  •  Font size in the main content area (presumably because larger fonts are more readable and lead to higher engagement).
  •  Number of images.
  •  Facebook total activity.
  •  Pinterest total activity.
  •  Tweets.
  •  Google +1 activity.
  •  Number of backlinks.

 

If your competitors' pages have more of the above features than yours, it’s likely they will rank higher than you; if your pages have more than theirs, it’s likely you will rank higher.

 

The above factors are from the Search Metrics Google Ranking Factors study released in 2018. Regrettably, after releasing the study, Search Metrics announced they would stop publishing these ranking-factors whitepapers, but you can be sure content relevance, user engagement, social activity, links, site security (HTTPS), and most likely mobile support, are among the current ranking factors.

 

If you want a deeper look into the study, you can browse the full report by visiting the link below. I cover more recent changes in the Google Algorithm Updates section of this blog.

 

Search Metrics – US Ranking Factors Study: Download the Whitepaper and Infographic here

Another well-known authority in the SEO industry, Moz (previously SEOmoz), releases a ranking-factors study every few years. Moz also publishes this information for free, and it is available on the following page: Moz Ranking Factors Survey – Search Engine Ranking Factors 2015.

 

Google’s Battle for Survival

Over the years, Google has had to change and adapt to survive. It has been in a constant battle with webmasters who are eager to manipulate its SERPs. Since the algorithm is based on real factors and properties of a website, site owners have been trying to identify those factors and use them to their advantage. Whenever webmasters find a competitive advantage (sometimes called a loophole), Google tries to quickly plug it.

 

Here's a typical example of this "cat and mouse game" that has been going on between Google and webmasters over the years:

 

Over a decade ago, webmasters found out that Google used the Meta Keyword tag as a major ranking factor. What did they do? They began stuffing this tag with keywords in an attempt to rank well for those terms. What did Google do? They started to ignore the Meta Keyword tag, effectively closing that loophole.

 

I would like to point out that I do believe Google still looks at the Meta Keyword tag, but not as you might think. I think the company uses it to help identify spammers. Any page that has a Meta keyword tag stuffed with dozens, or even hundreds, of keywords, is clearly doing something underhand or at least trying to.
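For reference, here is roughly what such a tag looks like in a page's HTML. This is a deliberately exaggerated, hypothetical example of the kind of stuffing described above (the phrases are invented for illustration):

    <!-- A hypothetical, stuffed Meta Keyword tag: dozens of near-identical phrases
         crammed into one attribute. A pattern like this now makes a page look
         suspicious rather than helping it rank. -->
    <meta name="keywords" content="cheap jackets, buy cheap jackets, cheap jackets online, best cheap jackets, cheap jacket store, cheap jackets sale, jackets cheap">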

 

Here is another example of a loophole being closed.

 

A few years ago, webmasters found out that by using a domain name that essentially consisted of nothing more than the keyword phrase they wanted to rank for, the site would get a massive ranking boost in the SERPs. This type of domain is called an Exact Match Domain (EMD). We’ll look at EMDs later.

 

Anyway, in September 2012, Google released the “EMD Update”, which removed that unfair ranking advantage. Hundreds of thousands of EMD sites dropped out of the Google top 10 overnight, putting an end to a large industry that had profited from buying and selling EMDs.

 

Since an EMD is usually a commercial search phrase, most inbound links to these EMD sites also contained that exact commercial term. This over-optimization for a commercial phrase was bad news for the site owners. This was because Google’s Penguin was on the lookout for this exact type of over-optimization.

 

Today, EMD sites are rarely seen in the top 10 for any mildly competitive search terms.

 

The battle between spammer and search engine continues to rage to this day. Spammers find loopholes, and Google plugs them.

 

In September 2011, Eric Schmidt revealed that Google had tested over 13,000 possible algorithm updates in 2010, approving just 516 of them. Although 516 may sound like a lot (it’s more than one update a day), it certainly wasn’t an unusual year. Google probably updates its algorithm at least 500-600 times every year. Most of these updates are minor, but Google does roll out major changes every now and again.

 

The one thing all Google updates have in common is that they are designed to improve the search results for the people that use the search engine – your potential visitors.

 

Panda, Penguin, and Other Major Updates

We could go back to the very beginning to see all the changes and updates Google has made, but I want to focus on the changes that have happened since 2011. These are the ones that have changed the way we do SEO today.

 

Google Changes in 2011

This was a huge year in SEO terms, shocking many webmasters. In fact, 2011 was the year that wiped a lot of online businesses off the face of the web. Most deserved to go, but quite a few innocent victims got caught up in the shakeup too, never to recover.

 

At the beginning of the year, Google tried to hit scraper sites (sites that used bots to steal and post content from other sites). This was all about trying to attribute ownership of content back to the original owner and thus penalize the thieves.

 

On February 23, the Panda update launched in the USA. Panda essentially targeted low-quality content and link farms. Link farms were basically collections of low-quality blogs set up to link out to other sites. The term "thin content" became popular during this time, describing pages that really didn’t say much and existed purely to host adverts. Panda was all about squashing thin content, and a lot of sites took a hit.

 

In March the same year, Google introduced the +1 button. This was probably to be expected, bearing in mind that Google had confirmed they used social signals in their ranking algorithm. What better signals to monitor than their own?

 

In April 2011, Panda 2.0 was unleashed, expanding its reach to all countries of the world, though still only targeting pages in English. Even more signals were included in Panda 2.0, probably including user feedback via the Google Chrome web browser, where users had the option to “block” pages in the SERPs that they didn’t like.

 

As if these two Panda releases were not enough, Google went on to release Panda 2.1, 2.2, 2.3, 2.4, 2.5, and 3.1, all in 2011. Note that Panda 3.0 is missing. There was an update between 2.5 and 3.1, but it is commonly referred to as the Panda “Flux”. Each new update built on the previous one, helping to eliminate still more low-quality content from the SERPs.

 

With each new release of Panda, webmasters worried, panicked, and complained on forums and social media. A lot of websites were penalized, though not all deserved to be; unavoidable "collateral damage", as Google casually called it.

 

In June 2011, we saw the birth of Google’s new social network project, Google Plus.

 

Another change that angered webmasters was “query encryption”, introduced in October 2011. Google said they were doing this for privacy reasons, but webmasters were suspicious of their motives. Prior to query encryption, whenever someone searched for something on Google, the search term they typed was passed on to the site they clicked through to. That meant webmasters, using any web traffic analysis tool, could see which search terms visitors used to find their pages.

 

Query encryption changed all of this. Anyone who was logged into their Google account at the time they performed a search from Google would have their search query encrypted. This prevented their search terms from being passed over to the websites they visited. The result of this was that webmasters increasingly had no idea which terms people were using to find their sites or site pages.

 

In November 2011, there was a freshness update. This supposedly rewarded websites that provided time-sensitive information (like news sites) whenever visitors searched for time-sensitive news and events. As you can see, there was a lot going on in 2011, and it didn't stop there.

 

Google Changes in 2012


Again, 2012 was a massive year for SEOs and webmasters. There was a huge number of prominent changes, starting with one called “Search + Your World” in January. This was an aggressive measure by Google to integrate its Google+ social data and user profiles into the SERPs.

 

Over the year, Google released more than a dozen Panda updates, all aimed at reducing low-quality pages from appearing in the SERPs.

 

In January 2012, Google announced a page layout algorithm change. This aimed at penalizing pages with too many ads, very little value, or both, positioned above the fold. The term "above the fold" refers to the visible portion of a web page when a visitor first lands on it. In other words, whatever you can see without the need to scroll down is above the fold. Some SEOs referred to this page layout algorithm change as the “Top Heavy” update.

 

In February, Google announced another 17 changes to its algorithm, including spell-checking, which is of particular interest to us. Later in the same month, Google announced another 40 changes. In March, there were 50 more modifications announced, including one that made changes to anchor-text “scoring”.

 

Google certainly wasn’t resting on their laurels. On April 24, the Penguin update was unleashed. This was widely expected, and webmasters assumed it was going to be an over-optimization penalty. Google initially called it a “webspam update”, but it was soon named “Penguin”. This update checked for a wide variety of spam techniques, including keyword stuffing. It also analyzed the anchor text used in external links pointing to websites.

 

In April, yet another set of updates was announced, 52 this time.

 

In May, Google started rolling out “Knowledge Graph”. This was a huge step towards semantic search (the technology Google uses to better understand the context of search terms). We also saw Penguin 1.1 during this month and another 39 announced changes. One of these new changes included better link scheme detection. Link scheme detection was for the purpose of revealing where webmasters had fabricated links to gain better rankings.

 

In July, Google sent out “unnatural link warnings” via Google Webmaster Tools, to any site where they had detected a large number of “unnatural” links. To avoid a penalty, Google gave webmasters the opportunity to remove the "unnatural" links.

 

Think of unnatural links as any link the webmaster controls, and ones they probably created themselves or asked others to create for them. These would include links on blog networks and other low-quality websites. Inbound links such as these typically used a high percentage of specific keyword phrases in their anchor text.

 

Google wanted webmasters to be responsible for the links that pointed to their sites. Webmasters who had created their own sneaky link campaigns were in a position to do something about it. However, if other sites were linking to their pages with poor quality links, then Google expected webmasters to contact the site owners and request they remove the bad link(s).

 

If you have ever tried to contact a webmaster to ask for a link to be removed, you’ll know that it can be an impossible task. So for many webmasters, this was an impractical undertaking, since the unnatural link warnings were often the result of tens or hundreds of thousands of bad links to a single site. 

 

Google eventually backtracked and said that these unnatural link warnings might not result in a penalty after all. The word on the street was that Google would be releasing a tool to help webmasters clean up their link profiles.

 

When you think about it, Google's flip-flopping on this policy was understandable and just. After all, if websites were going to be penalized for having too many spammy links pointing to their pages, it would open the door of opportunity for criminal types. All a dishonest webmaster would have to do to wipe out the competition would be to point thousands of low-quality links at competing pages using automated link-building software.

 

Also in July, Google announced a further 86 changes to their algorithm.

 

In August, the search engine giant started to penalize sites that had repeatedly violated copyright, possibly via The Digital Millennium Copyright Act (DMCA) takedown requests.

 

For those who might not be familiar with this, the DMCA is a controversial United States digital rights management (DRM) law. It was first enacted on October 28, 1998, by the then-President Bill Clinton. The intent behind DMCA was to create an updated version of copyright laws. The aim was to deal with the special challenges of regulating digital material.

 

Ok, moving on to September 2012, another major update occurred, this time called the EMD update. You’ll remember that EMD stands for Exact Match Domain, and refers to a domain that exactly matches a keyword phrase the site owner wants to rank for. EMDs had a massive ranking advantage simply because they used the keyword phrase in the domain name. This update removed that advantage overnight.

 

In October of this year, Google announced that there were 65 algorithm changes in the previous two months.

On October 5, there was a major update to Penguin, probably expanding its influence to non-English content.

 

Also in October, Google announced the Disavow tool. This was Google’s answer to the “unnatural links” problem. They completely shifted the responsibility of unnatural links onto the webmaster by giving them a tool to disavow or deny any responsibility or support for those links. If there were any external links from bad neighborhoods pointing to your site, and you could not get them removed, you could now disavow those links, effectively rendering them harmless.
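For anyone curious about what a disavow submission actually involves: the tool accepts a plain text file listing the links you want Google to ignore. A minimal, hypothetical sketch (the domains shown are made up):

    # Links we could not get removed despite contacting the site owners
    domain:spammy-link-network.example
    # Disavow a single page rather than a whole domain
    http://low-quality-directory.example/our-listing.html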

 

Finally, in October 2012, Google released a refresh of their “Page Layout” update, and in December, they updated the Knowledge Graph to include non-English queries in a number of the more popular languages. That brought the Google updates for that year to a close.

 

Google Changes in 2013


In 2013, Google updated both Panda and Penguin several times. These updates refined the two different technologies to try to increase the quality of pages ranking in the SERPs. On July 18, a Panda update was thought to have been released to “soften” the effects of a previously released Panda, so Google obviously watched the effects of its updates, and modified them accordingly.

 

In June, Google released the “Payday Loan” update. This targeted niches with notoriously spammy SERPs. These niches were often highly commercial, which offered great rewards for any page that could rank highly. Needless to say, spammers loved sites like these. Google gave the example of “payday loans” as a demonstration when announcing this update, hence its name.

 

In July, we saw an expansion of the Knowledge Graph, and in August there was another major update announced by Google.

August 2013 – Hummingbird – Fast & Accurate?

Hummingbird was the name given to Google’s latest search algorithm. It was not part of an existing algorithm or a minor algorithm update, but an entirely brand-spanking new algorithm that was unboxed and moved into place on August 20, 2013 (though it was not announced to the SEO community until September 26).

 

This was a major change to the way Google sorted through the information in its index. In fact, a change on this scale had probably not occurred for over a decade. Think of it this way: Panda and Penguin were changes to parts of the old algorithm, whereas Hummingbird was a completely new algorithm, although it still used components of the old one.

 

Google algorithms are the mathematical equations used to determine the most relevant pages to return in the search results. The equation uses over 200 components, including things like PageRank and incoming links, to name just two.

 

Apparently, the name Hummingbird was chosen because of how fast and accurate these birds are. Although many webmasters disagreed, Google obviously thought at the time that this reflected their search results – fast and accurate.

 

Google wanted to introduce a major update to the algorithm because of the evolution in the way people used Google to search. An example Google gave was "conversational search", whereby people could now speak into their mobile phone, tablet, or even a desktop browser to find information. To illustrate, let's say that you were interested in buying a Nexus 7 tablet. The old way of finding it online was to type something like this into the Google search box:

 

"Buy Nexus 7" However, with the introduction of speech recognition, people have since become a lot more descriptive in what they are searching for. Nowadays, it’s just as easy to dictate into your search browser something like: “Where can I buy a Nexus 7 near here?”

 

The old Google could not cope too well with this search phrase, but the new Hummingbird was designed to do just that. The old Google would look for pages in the index that included some or all of the words in the search phrase. A page that included the exact phrase would have the best chance of appearing at the top of Google. If no pages were found with the exact phrase, then Google would look for pages that included the important words from it, e.g. “where” “buy” and “Nexus 7”.

 

The idea behind Hummingbird was that it should be able to interpret what the searcher was really looking for. In the example above, they are clearly looking for somewhere near their current location where they can purchase a Nexus 7.

 

In other words, Hummingbird was supposed to determine searcher "intent" and return pages that best matched that intent, as opposed to best matching keywords in the search phrase. Hummingbird is still around today and tries to understand exactly what the searcher wants, rather than just taking into account the words used in the search term.

 

In December 2013, there was a drop in the authorship and rich snippets displayed in the SERPs. This was a feature where Google displayed a photo of the author and/or other information next to the listing. However, Google tightened up their search criteria and removed these features from listings.

 

Google Changes in 2014


In February 2014, Google updated their page layout algorithm.

 

In May the same year, Payday Loan 2.0 was released. This was an update to the original Payday Loan algorithm and was thought to have extended its reach to international queries. Also in May, Panda was updated, this time to Panda 4.0.

 

That brings us up to the current time, as I am writing this blog. As you can see, Google has been very active in trying to combat the spam thrown at them. The two major updates that most webmasters worried about, and continue to worry about, are Panda and Penguin. Together, these two technologies weed out low-quality pages, and pages that have been engineered to rank highly in the search engines. But as previously noted there are sometimes a few innocent victims that lose out to these updates too, especially in the case of major updates.

 

That last statement sends shivers down the spines of many webmasters.

 

Anyone who builds a website will want it to rank well in Google. Without having a high-profile presence in the SERPs, a site won’t get much traffic, if any at all. If that happens, webmasters WILL try to boost their rankings, and the traditional way is by working on the “on-page SEO” and inbound links. However, over the last couple of years, in particular, Google has introduced measures that aim to penalize any webmaster who is actively trying to boost rankings via traditional SEO methods.

 

Google wants the best pages to rank at the top of the SERPs for obvious reasons. So Google rewards the pages that deserve to be at the top, rather than pages that webmasters force to the top using SEO (much of which Google collectively calls “webspam”).

 

What this means to you is that you have to deliver the absolute best quality content you can. You need to create content that deserves to be at the top of the SERPs. You know, content is not only King now, but it has always been King. The difference now is that Google algorithms are so much better at determining what constitutes great content and what doesn't. In other words, it's no longer easy to take shortcuts and use underhanded tactics to trick the algorithms as it once was.

 

Fortunately for you, Google offers a lot of advice on how to create the type of content they want to show up in their SERPs. In fact, they have set up a web page called “Webmaster Guidelines”. This tells you exactly what they want, and just as importantly, what they don’t want. We’ll look at this shortly, but first, let’s see how we used to create content in the past.

 

Google Algorithm Changes in 2015


The Mobile-friendly Update

On April 21, Google began rolling out an update that was designed to boost mobile-friendly web pages in their mobile search results. To help webmasters prepare for the update, Google provided a web page where webmasters could test their site to see if it was mobile-friendly or not. You can find the mobile-friendly testing tool here: https://www.google.com/webmasters/tools/mobile-friendly/ To use this tool, you simply enter your webpage URL and wait for the results.

 

Hopefully, the tool will report that your page is mobile-friendly.

The mobile-friendly update:
  1. Only affects searches carried out on mobile devices.
  2. Applies to individual pages, not entire websites.
  3. Affects ALL languages globally.

 

This update makes a lot of sense. If someone is searching on a small screen, Google only wants to show web pages that will display properly on such devices.
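While the update and the testing tool look at a range of signals, one illustrative building block of a mobile-friendly page is the viewport meta tag, which tells mobile browsers to scale the page to the device's width. A minimal sketch (the title shown is a hypothetical example):

    <head>
      <!-- Without a viewport declaration, phones typically render a shrunken
           desktop layout, a common reason a page is flagged as not mobile-friendly -->
      <meta name="viewport" content="width=device-width, initial-scale=1">
      <title>Example Store – Football Jerseys</title>
    </head>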

 

Where Do All of These Updates Leave Us?

Today, optimizing for specific keyword phrases has not just become more difficult because of Panda and Penguin; it has also become less effective at driving traffic if those phrases do not match the intent of the searcher typing them into Google.

 

Out of all the aforementioned Google updates, the one that had a big influence on my own SEO was Penguin. Google sent their Penguin into places that no SEO professional ever imagined they would dare to send it. Many will remember April 24, 2012, as the day the world of SEO changed forever.

 

It was also the day that inspired me to release the first edition of this blog, entitled “SEO 2012 & Beyond – SEO will never be the same again”. Google created such a major shift in the way they analyzed web pages that I now think in terms of Pre-Penguin and Post-Penguin SEO, and that will likely come across in this blog.

 

SEO Today in 2018


Today, SEO is very different from even a couple of years ago. Google has put a number of measures in place to combat the manipulative actions of webmasters.

 

Ask a bunch of webmasters to define the term SEO and I bet you’ll get a lot of different answers. Definitions will vary depending on the type of person you ask, and even when you ask them. SEO before Google introduced the Panda update was easy. After the Panda update, it was still relatively easy, but you needed to make sure your content was good. After Google released the Penguin update, SEO suddenly became a whole lot harder.

 

In 2018, phrases you’ll hear being used by SEO experts will include:

  • On-page SEO
  • Off-page SEO
  • Link building
  • White hat SEO
  • Grey hat SEO
  • Black hat SEO

 

White Hat SEO – approved strategies for getting your page to rank well. Google offers guidelines to webmasters which spell out approved SEO strategies.

 

Black Hat SEO – these are the “loopholes” that Google is actively seeking out and penalizing. They include a whole range of strategies, from on-page keyword stuffing to backlink blasts using software to generate tens of thousands of backlinks to a webpage.

 

Grey Hat SEO – Strategies that lie between the two extremes. These are strategies that Google do not approve of, but are less likely to get your site penalized than black hat SEO. Grey hat tactics are certainly riskier than white hat SEO, but not as risky as black hat.

 

If you think of this as a sliding scale from totally safe “White Hat” SEO to totally dangerous “Black Hat” SEO, then the further you move to the right with your SEO, the more likely you are to get into hot water with Google (at least in the long term), and the further you move to the left, the safer you are. Google’s tolerance level sits somewhere in the grey hat area. Staying somewhere in the middle may give you better returns for your time, but you do risk a penalty eventually. Before Panda and Penguin, most webmasters knew where these lines were drawn and took their risks accordingly.

 

When Google introduced Panda, the only real change was that webmasters needed to make sure their website content was unique, interesting to visitors, and added something that no other webpage on the same topic provided. No small task, but to beat Panda, the goal was to create excellent content.

 

When Google introduced Penguin, they completely changed the face of SEO, probably forever, or at least for as long as Google continues to be the dominant search engine (which probably amounts to the same thing).

 

We’ve still got the safe “White Hat” SEO and unsafe “Black Hat”. What’s really changed is the “Grey Hat” SEO. Whereas previously it was reasonably safe, it’s now become a lot riskier.

 

This tolerance line can move from left to right at any time, depending on how Google tweak their algorithm. If they want to come down hard on “spammers”, they’ll move the line more to the left. If too many good sites get taken out as “collateral damage”, as they call it, then they may move the tolerance line over to the right a bit (see the section later on “Trust Vs No-Trust”).

 

Generally though, for all new sites and most others, the tolerance level is very close to the White Hat boundary, and this is what you need to be mindful of. Webmasters who use techniques which are further to the right of this tolerance line risk losing their rankings.

 

Although this sliding-scale picture is a good one to work from, it does not show the whole truth. Let’s just consider how trust changes the equation.

 

Trust Vs No-trust

The Google tolerance line will slide left or right depending on the site being ranked. For a site that has proven its worth, meaning Google trusts it a lot, we might see the diagram look like this:

 

A "Trusted" Site

Yet for a new site or one that has no track record to speak of, the diagram will probably look a lot more like this.

 

A "Non-trusted" Site


The only difference here is in the location of the “tolerance line”.

 

In other words, Google is a lot more tolerant of sites that have built up authority and trust than it is of new sites, or sites that have not been able to attain a decent level of authority or trust.

 

A high-authority site with lots of trust can withstand a certain amount of spammy SEO without incurring a penalty (see the later discussion of “negative SEO”). In other words, the more authority a site has, the more it can endure, or get away with.

 

A new site, on the other hand, would be quickly penalized for even a small amount of spammy SEO.

 

Webmasters Living in the Past

A lot of webmasters (or SEO companies vying for your business) may disagree with my take on modern-day SEO, and that’s fine. The more people there are who just don’t get it, the less competition there is for me and my clients.

 

I am sure you can still find people who will say this is all rubbish and that they can get your pages ranked for your search terms (depending on the severity of the competition, of course) by heavily back-linking the page using keyword-rich anchor text. The process they’ll describe is eerily similar to the process I told you about in the section “How we used to rank pages”, i.e. before 2010. It goes something like this:

 

  1. Keyword research to find high-demand, low-competition phrases.
  2. Create a page that is optimized for that keyword phrase.
  3. Get lots of backlinks using that keyword phrase as anchor text (the clickable text in a hyperlink).
  4. Watch your page rise up the SERPs.

 

If you don’t care about your business, then follow that strategy or hire someone to do it for you. You might get short-term gains, but you’ll run the risk of losing all your rankings further down the line when Google catches up with you (and catch up it will… probably sooner rather than later).

 

Losing all of your rankings on Google does not take a human review, though Google does use human reviewers on occasion. No, getting your site penalized is much quicker and involves far less hassle than human intervention. The process Google created for this has become far more “automated” since the introduction of Panda and Penguin. Go over the threshold levels of what is acceptable, and the penalty is algorithmically determined and applied.

 

The good news is that algorithmic penalties can just as easily be lifted by removing the offending SEO and cleaning up your site. However, if that offending SEO includes low-quality backlinks to your site (especially to the homepage), things become a little trickier.

 

Remember the SEO expert you hired that threw tens of thousands of backlinks at your site using his secret software? How can you get those backlinks removed? In most cases, it's not easy, although Google does provide a “Disavow” tool that can help in a lot of instances. I’ll tell you more about that later in the blog. In extreme circumstances, moving the site to a brand new domain and starting afresh may be the only option.

 

In the rest of this blog, I want to focus on what you need to do to help your pages rank better. I will be looking mainly at white-hat strategies, though I will venture a little into grey hat SEO as well. I won’t be covering black-hat SEO at all though. This is because it's just not a long-term strategy.

 

Remember, with the flick of a switch, Google can, and sometimes does, move the goalposts, leaving you out in the cold. Is it worth risking your long-term business plans for short-term gains? OK, let’s now get started with the four pillars of post-Penguin SEO.

 

The Four Pillars of Post-Penguin SEO


I have been doing SEO for over 10 years now and have always concentrated on long-term strategies. That’s not to say I haven’t dabbled in black hat SEO because I have, a little. Over the years, I have done a lot of experiments on all kinds of ranking factors. However, and without exception, all of the sites I promoted with black hat SEO have been penalized; every single one of them.

 

In this blog, I don’t want to talk about the murkier SEO strategies that will eventually cause you problems, so I’ll concentrate on the safer techniques of white hat SEO.

I have divided SEO into four main pillars. These are:

  1. Quality content
  2. Site organization
  3. Authority
  4. What’s in it for the visitor?

 

These are the four areas where you need to concentrate your efforts, so let’s now have a look at each of them in turn.

 

Quality Content

Before we begin, let me just say that a lot of SEO “experts” have disagreed with me on the need for quality content as a major ranking factor. They will often cite exceptions in the search engines where pages rank well for competitive terms without much on-page content or with the content of very little value. The fact that these exceptions rarely last more than a week or two eludes them.

 

Remember, bounce rates and time spent on a site are indicators to Google of whether a page gives a visitor what they want, or not. Poor content cannot hide in the Google top 10. Search engine users essentially vote out content that falls short on quality. They do this by their actions, or inactions, as the case may be.

 

They vote by the length of time they stay on any particular page, and by whether or not they visit any other pages while on that site (the latter being an indicator of overall site reputation). Google also looks at what they do after they hit the back button and return to the SERPs. It’s quite an eye-opener.

 

What Google Tells Us About Quality Content

Google’s Webmaster Guidelines offer advice to webmasters. They tell us what Google considers "good content", and what it considers spam. Let’s take a quick look at what the guidelines say about website content.

 

1. Create a useful, information-rich site

This one should be obvious. Create content that your visitors want to see.

 

2. Think about the words people use to search

…and try to incorporate them into your pages.

 

This particular guideline is one that I think will soon disappear. The reason I think this is because it's an open invitation for webmasters to fill their pages with keywords and phrases. Focusing on keywords usually results in a web page optimized for the search engines at the expense of the visitor’s experience. Google openly labels that type of page as Webspam.

 

I actually think this guideline is probably a holdover from the pre-Panda days. The best advice is to think about the words people use to search, the synonyms of those words, and the searcher's intent. Questions you will want to ask include: what exactly is the searcher looking for? What words, phrases, and topics does my page need to cover to satisfy that intent?

 

Try to use text instead of images for important names, content or links


Google’s crawler does not recognize text in images, so if there is an important word or phrase that you need Google to know exists in any part of your page, use text instead. If it has to be an image for whatever reason, then use the ALT tag to create a description of the image which includes that phrase.
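To make this concrete, here is a small, hypothetical snippet showing the same idea: the important phrase appears as real text, and the image carries a concise, descriptive ALT attribute (the file name and wording are invented for illustration):

    <p>Purple football jerseys in all sizes, shipped within 24 hours.</p>
    <!-- The ALT attribute describes the image and carries the key phrase -->
    <img src="purple-football-jersey.jpg" alt="Purple football jersey, front view">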

 

A great example of using text instead of images is in the links on your site. Links are the most important factor in SEO, so the more the search engines know about the nature of a link, the better. If you are linking to a page on your site about “purple”, then use purple as the link text. This tells Google that you, the webmaster, regard the page you are linking to as being about purple.
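Continuing the hypothetical “purple” example, the difference is simply in the anchor text of the link:

    <!-- Descriptive anchor text: tells Google the linked page is about purple -->
    <a href="/purple">purple</a>

    <!-- Generic anchor text: tells Google nothing about the destination page -->
    <a href="/purple">click here</a>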

 

These internal links also help a page rank for a specific term. Internal links are a safer way to get keyword-rich anchor text into a link (something that is not so safe coming from other websites). You will see more about this in the section on "back-linking" later in the blog.

 

Make <title> and ALT attributes descriptive


The <title> tag of your page is one of the most important areas in terms of SEO. Try to get your most important keyword(s) in there, but do not stuff keywords into the title. The title tag has two main objectives; one is to entice the searcher to click through from the search engines, and the other is to tell Google what your page is all about.

 


 

We talked about click-through rates (CTR) earlier in the blog and looked at how important a factor they are in modern SEO. If you find a page has a low CTR (something you can check in Google webmaster tools), then tweak the title to see if you can make it more enticing to searchers. You then need to monitor the CTR of the page in the coming days and weeks. Just know that better titles (ones that match the searcher’s intent) will attract more clicks in the Google SERPs than ones that don't.

 

NOTE: A page listed in the search results also shows a short description under its title. Google sometimes uses the text from the Meta Description tag of your web page for this. The description is typically a snippet of around 160 characters (Google has experimented with longer snippets at times) used to summarize a web page's content. Search engines like Google may use these snippets in the search results as a way to let visitors know what your page is about.

 

You can do the same tweaking of the Meta Description tag as you do with the title tag to help increase CTR. However, this is less of an exact science since Google will sometimes grab text extracts from elsewhere on the page for the listing’s description.
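Here is a minimal sketch of how the two tags sit in a page's HTML (the store name and wording are hypothetical):

    <head>
      <!-- Title: most important keyword near the front, written to earn the click -->
      <title>Football Jerseys – Official and Retro Kits | Example Store</title>
      <!-- Meta description: a concise, enticing summary Google may show in the SERPs -->
      <meta name="description" content="Shop official and retro football jerseys in all sizes. Free returns and fast dispatch.">
    </head>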

 

Let's now look at ALT tags. These are there for a very specific purpose. Their primary objective is to help people who have images turned off, for example, people with impaired vision. Often they will use text-to-voice software to read pages, so your ALT tags should help them with this. An ALT tag should describe the image to those users in a concise way. It is also a great place to insert a keyword phrase or synonym, but NEVER use ALT tags just as a place to stuff keywords.

 

 

How to stay ahead of Google’s updates

Every now and then, Google releases a significant update to their algorithm, which can have a massive impact on businesses in any industry. To hone your SEO chops and make sure your site doesn't fall into Google's bad books, it's important to stay informed about Google’s updates as they are released.

 

Fortunately, almost every time a major update is released, those updates are reported on by the entire SEO community and sometimes publicly discussed and confirmed by Google staff.

 

A full history of Google’s updates would fill this entire blog, but with the resources below, you can stay abreast of new Google updates as they are rolled out. This is essential knowledge for anyone practicing SEO, at either a beginner or an advanced level. You can even keep your ear to the ground with these sources and often be forewarned of future updates.

 

Google Updates by Search Engine Roundtable: Google PageRank & Algorithm Updates

Search Engine Roundtable is one of the industry’s leading blogs on SEO. On the page above, you can browse all of the latest articles on Google updates by a leading authority on the topic.

 

Search Engine Journal

Actionable SEO Tips and Strategies That Work | SEJ

Search Engine Journal is another authoritative, relevant and frequently updated publication about everything SEO. An indispensable resource for keeping abreast of industry events as they happen.

 

Moz Blog


Moz Blog - SEO and Inbound Marketing Blog - Moz

The Moz blog is mentioned several times in this blog and for good reason—it’s among the leading authority blogs covering all things SEO, and if there’s an impending update on the radar, you will catch wind of it here.

Keyword research: the most important step of SEO

Why is keyword research so important?

 

Keyword research is the most important step in every SEO project for two reasons:

1. If you rank your site highly for the wrong keywords, you can end up spending lots of time and effort, only to discover the keywords you have targeted don't receive any traffic.

 

2. If you haven't investigated the competitiveness of your keywords, you can end up investing lots of time and effort in a particular keyword, only to find it is far too competitive to rank for, even on the first page.

 

These two pitfalls are often the ultimate decider on how successful any SEO project is. This blog will cover how to avoid these pitfalls and how to find the best keywords. First, we must define what a keyword is.

 

What exactly is a keyword?

 

If you are an SEO newbie, you may be wondering—what is a keyword?

A keyword is any phrase you would like your site to rank for in Google's search results. A keyword can be a single word, or a keyword can also be a combination of words. If you are trying to target a single word, look out! You will have your work cut out for you. Single word keywords are extremely competitive, and difficult to rank highly for in the search results.

 

Here are some different kinds of keywords:

  •  Head-term keywords: keywords with one or two words, e.g. classic movies.
  •  Long-tail keywords: keywords with three or more words, e.g. classic Akira Kurosawa movies.
  •  Navigational keywords: keywords used to locate a particular brand or website.

 

Examples would be Facebook, YouTube, or Gmail.

  •  Informational keywords: keywords used to discover information on a particular topic. This includes keywords beginning with “how to…” or “what are the best…”.
  •  Transactional keywords: keywords entered into Google by customers wanting to complete a commercial activity, e.g. buy jackets online.

 

In most cases, targeting head-term or navigational keywords for other brands is competitive and not worth the time or effort. Despite their high traffic numbers, they will generally not lead to any sales. On the other hand, long-tail, informational and transactional keywords are good keywords for most SEO projects. They will lead to more customers.

 

How to generate a massive list of keywords

There are many ways to skin a cat. The same is true for finding the right keywords. Before you can find keywords with loads of traffic on Google, you must first develop a list of potential keywords relevant to your business.

 

Relevance is vital

If you spend your time trying to cast too wide a net, you can end up targeting keywords irrelevant to your audience.

For example, if you are an online football jacket retailer in the United States, examples of relevant keywords might be:

  • Buy football jackets
  • Buy football jackets online

 

You can see how keywords like these are directly relevant to the target audience of an online football jacket retailer; keywords that are merely related to the topic, but not to buying, are unlikely to lead to customers.

 

Keeping relevance in mind, you must develop a list of potential keyword combinations to use as a resource, so you can then go and uncover the best keywords with a decent amount of traffic each month in Google.

 

Following are some powerful strategies you can use to help with generating this list.

1. Steal keywords from competitors.

If you're feeling sneaky, you can let your competitors do the heavy lifting for you and snatch up keywords from their sites.

 

There are many tools out there created for this sole purpose. A simple and free one is the SEO Book Keyword Analyzer. Enter a page into this tool, and within seconds it will scrape a list of the keywords your competitor has optimized their page for. You can then use this to bulk out your keyword list.

 

SEO Book Keyword Analyzer: Free Keyword Density Analyzer Tool

While the SEO Book Keyword Analyzer is a great, simple tool for revealing the keywords your competitors have optimized into a page, another powerful tool is Ahrefs' Organic Keywords report. This tool estimates the keywords sending the largest amount of traffic to your competitors’ websites. The estimates are reasonably accurate and can be a valuable resource for bulking out your keyword lists.

 

While Ahrefs reports are powerful, they do come at a cost. You can preview the first 20 keywords for free, but if you want more data, they currently offer a 7-day trial for $7, and after the initial trial, billing starts at $99 per month.

 

2. Brainstorm your own master list

Assuming your competitors have been thorough with their research isn't always the best strategy. By brainstorming combinations of keywords, you can generate a giant list of potential keywords.

 

To do this, sketch out a grid of words your target customer might use. Split the words into different prefixes and suffixes. Next up, combine them into one giant list using the free Mergewords tool. With this strategy, you can quickly and easily build up a massive list of relevant keywords.

 

How to find keywords that will send traffic to your site

Now you have a list of keywords, you need to understand how much traffic these keywords receive in Google. Without search traffic data, you could end up targeting keywords with zero searches. Armed with the right knowledge, you can target keywords with hundreds or even thousands of potential visitors every month.

 

Unfortunately, in recent years Google has restricted access to the data behind Google's search box, leaving us with two options for finding keyword traffic data.

 

Firstly, if you have an AdWords campaign running with Google and are already spending a modest amount, then you’re in the clear: you can access this info for free in the Google AdWords Keyword Planner tool. If this isn’t you, the other option is to use a paid keyword research tool for a small monthly fee, such as Keyword Tool: #1 Google Keyword Planner Alternative For SEO (FREE).

 

As a result of Google making search data unavailable to free users, free keyword tools disappeared from the market, making paid research tools the only viable option for finding traffic data for keywords these days.

 

If you're on a tight budget, you can sign up for a paid plan with one of the many paid keyword research tools on the market, then ask for a refund after doing your research. It's not nice, but it's an option. Either way, you need the traffic data behind your keywords; otherwise, you are running blind.

 

Estimating keyword traffic data with Google’s Keyword Planner

 


 

Google AdWords Keyword Planner: Keyword Research & Strategy with Keyword Planner

As mentioned, to access all the juicy traffic data provided by the Google AdWords Keyword Planner tool, you need an active AdWords campaign running, and you must be spending at least a modest amount of money regularly.

 

If this is you, sign in, click on Tools in the top menu, click on “Keyword Planner”, then click on “Get search volume data and trends”. Copy and paste your keywords into the box, select your country, and then click the blue “Get search volume” button. When finished, you will have the exact number of times each keyword was searched for in Google.

Mmm. Fresh data. This is just the kind of data we need. Now we know which keywords receive more searches than others, and more importantly, we know which keywords receive no searches at all.

 

Estimating keyword traffic data with a paid tool like KWFinder

 

KWFinder: Keyword research and analysis tool

If you want a research tool with a stronger SEO focus, then you can use a paid tool such as KWFinder. I like KWFinder for its ease of use, relevant keyword suggestions, and competitive data, but you're not limited to this tool—there are many alternatives floating around that you can find with a couple of Google searches.

 

Using KWFinder as an example, after creating an account, simply log in, select the local area you are targeting (i.e. Los Angeles, California, if that is your customer focus), enter your keyword ideas and download the juicy data. Now you can ensure you spend time focusing on keywords with traffic potential, as opposed to chasing after keywords with no traffic and little opportunity for growing your business.

 

How to find keywords for easy rankings

Now you need to find out how competitive your desired keywords are. Armed with an understanding of the competitiveness of your keywords, you can discover keywords you can realistically rank for in Google.

 

Let’s say you are a second-hand bookseller and you want to target “book store online”. It's unlikely you are going to beat Amazon and Barnes and Noble. But maybe there’s a gem hiding in your keyword list that few people are targeting—maybe something like “antique book stores online”.

 

You have the advantage if your competitors haven't thought of targeting your keyword. You simply have to do better SEO than they are doing and you have a really good chance of beating their rankings. Part of this includes having a large keyword list for your research.

 

Next, you need to sift through this list and separate the ridiculously competitive keywords from the easy keywords that no one is aggressively targeting. There are many schools of thought on how to gauge the competitiveness of your keywords. The most popular practices are listed below, with my thoughts on each.

 

1. Manually going through the list, looking at the rankings, and checking if low-quality pages are appearing in the top results.

This is good for a quick glance to see how competitive a market is. However, it is unreliable on its own, and you need real data to rely on.

 

2. Look at how many search engine results are coming up in Google for your keyword.

The number of results is listed just below the search box after you type in your keyword. This tactic is common in outdated SEO courses, but it is completely unreliable.

 

The reason? There may be a very low number of competing pages for a particular keyword, but the sites ranked at the top of the results could be unbeatable.

 

3. Using the competition score from the Google AdWords Keyword Research tool.

Don't be tempted. This is a common beginner's mistake. It is sometimes recommended on blogs as an easy way to judge the SEO competitiveness of keywords, and it simply doesn't work!

 

The competition score included in the Google AdWords Keyword Research tool is intended for AdWords advertising campaigns only. It is an indication of how many advertisers are competing for the particular keyword through paid advertising. Completely irrelevant for SEO.

 

4. Using a competitive analysis tool, such as KWFinder’s SEO Difficulty report.

 

To get a realistic idea of your chances for ranking high for a particular keyword, you need to understand the strength of the pages currently ranking in the top-10 search results for that keyword.

 

A great tool for this is KWFinder’s SEO Difficulty report. With KWFinder’s SEO Difficulty report, simply enter your keyword into their tool, click “check difficulty”, and it will show vital stats for pages appearing in the top 10.

 

Of these stats, the most important are Domain Authority, Page Authority, Links, and Facebook Shares. If you don’t have a high Domain Authority or Page Authority, don’t freak out. If your site is more relevant to the topic, you can often nudge your way up the results by building backlinks to your page and improving your social media activity, especially if those stronger sites have few links and little social activity on their pages, and are non-specific, generic directory or aggregator-type sites.

 

Next up, if you enter your own website into Ahrefs' Site Explorer tool, you can see the same stats for your site and set targets for beating the competition.

 

Ahrefs – Competitor Research Tools & SEO Backlink Checker


Armed with this knowledge, you can hunt around to find keywords with reasonable levels of traffic, weak competition, and set targets for how many links you need for a top listing. You can even find keywords competitors are using, estimates of how much traffic they are getting from those keywords, even where they are getting their links from!

 

There are many keyword tools and site analysis tools that can be found with a couple of Google searches. Every SEO professional ultimately has a favorite tool they prefer; the following tools are well known in the field and ones I often use myself. Moz - Keyword Explorer https://moz.com/explorer/keyword/

 

Moz – Open Site Explorer


https://moz.com/researchtools/ose/

 

When you have finished reading this blog, you can work through the keyword research points in the free SEO checklist included at the end, which outlines the above process in a step-by-step approach.

 

On-page SEO. How to let Google know what your page is about.


On-page SEO is the process of ensuring that your site is readable to search engines. Learning correct on-page SEO is not only important in ensuring Google picks up the keywords you want, but it is an opportunity to achieve easy wins and improve your website’s overall performance.

 

On-page SEO includes the following considerations:

  • 1. Making sure site content is visible to search engines.
  • 2. Making sure your site is not blocking search engines.
  • 3. Making sure search engines pick up the keywords you want.

You can do most on-page SEO yourself if you have a basic level of experience working with websites.

 

If you are not technically inclined, please note there are technical sections in this blog. You should still read them so you understand what has to be done to achieve rankings in Google; once you know what it takes, you can easily hire a web designer or web developer to implement the SEO techniques covered here.

 

How to structure your site for easy and automatic SEO.

These best practices will ensure your site is structured for better recognition by Google and other search engines.

 

Search engine friendly URLs


Have you ever visited a web page and the URL looked something like this: http://www.examplesite.com/~articlepage21/post-entry321.asp?q=3? What a mess!

These kinds of URLs are a quick way to confuse search engines and site visitors. Clean URLs are more logical, user-friendly and search engine friendly. 

Here is an example of a clean URL: http://www.examplesite.com/football-jerseys Much better.

 

Take a quick look at Google's search engine results. You will see a very large portion of sites in the top 10 have clean and readable URLs like the above example. And by a very large portion… I mean the vast majority.

 

Most site content management systems have search engine friendly URLs built into the site. It is often a matter of simply enabling the option in your site settings. If your site doesn't have search engine friendly URLs, it's time for a friendly chat with your web developer to fix this up.

 

Internal navigation


There is no limit on how to structure the navigation of your site. This can be a blessing or a curse.

 

Some people force visitors to watch an animation or intro before they can even access the site. In the process, some sites make it harder for visitors and more confusing for search engines to pick up the content on the site.

 

Other sites keep it simple by having a menu running along the top of the site or running down the left-hand side of the browser window. This has pretty much become an industry standard for most sites. By following this standard, you make it significantly easier for visitors and search engines to understand your site. If you intend to break this convention, you must understand it is likely you will make it harder for search engines to pick up all of the pages on your site.

 

As a general rule, making it easier for users makes it easier for Google. Above all else, your website navigation must be made of real text links, not images. If your main site navigation is currently made up of images, slap your web designer and change them to text now! If your main navigation is not featured in text, your internal pages will be almost invisible to Google and other search engines.
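To illustrate, here is a minimal sketch using a hypothetical jersey store (the page names and image filename are made up for this example). The first snippet shows navigation built from an image, which gives Google nothing to read; the second uses real text links.

<!-- Image-based navigation: the link text is locked inside the image -->
<a href="/nfl-jerseys"><img src="nav-jerseys-button.png"></a>

<!-- Text-based navigation: easy for visitors and search engines to read -->
<nav>
  <a href="/nfl-jerseys">NFL Jerseys</a>
  <a href="/womens-nfl-jerseys">Women's NFL Jerseys</a>
  <a href="/contact">Contact Us</a>
</nav>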

 

For an additional SEO boost, include links to pages you want visible to search engines and visitors on the home page. By placing links specifically on the home page, Google's search engine spider can come along to your site and quickly understand which pages on your site are important and worth including in the search results.

 

How to make Google pick up the keywords you want


There are many misconceptions circulating about what to do, and what not to do, when it comes to optimizing keywords on your page. Some bloggers go so far as telling their readers not to put keywords in the content of targeted pages at all. These bloggers (I'm not naming names) have the best intentions and have taken their worry about Google's spam detection to the next level. But it is madness.

 

Not having keywords on your page makes it difficult for Google to match your page with the keyword you want to rank for. If Google completely devalued having keywords on the page, Google would be a crappy search engine.

 

Think about it. If you search for “Ford Mustang 65 Auto Parts” and arrive on pages without those words on the page at all, it's highly unlikely you have found what you’re looking for.

 

Google needs to see the keywords on your page, and these keywords must be visible to your users. The easy approach is to either create content around your keyword or naturally weave your keyword into the page. I'm not saying your page should look like the following example.

 

“Welcome to the NFL jersey store. Here we have NFL jerseys galore, with a wide range of NFL jerseys including women’s NFL jerseys, men's NFL jerseys, and children's NFL jerseys and much, much more.”

 

This approach may have worked 10 years ago, but not now. The keyword should appear naturally in your page. Any attempts to go bonkers with your keywords will look horrible and may set off spam filters in search engines. Use your keyword naturally throughout the content. Repeating your keyword once or twice is more than enough. It's really that simple.

 

Next up, you need to ensure you have a handful of LSI keywords on your page. LSI stands for Latent Semantic Indexing. Don’t be discouraged by the technical term; LSI keywords are simply an SEO term for related phrases. Google believes a page is more naturally written, and has a higher tendency to be good quality and relevant, if it also includes keywords related to your main phrase.

 

To successfully optimize a page, you need to have your main keywords and related keywords on the page. Find two or three keywords related to your main keyword, and repeat each of them on the page once or twice. Ubersuggest is a great tool for finding keywords Google considers related to your main keywords; it does this by pulling suggestions from Google’s auto-suggest box. Use Ubersuggest and your keyword research to determine a list of the most related keywords.

 

Areas you can weave keywords into the page include the following (see the example after this list):

  • Meta description and meta title tags
  • Navigation anchor text
  • Navigation anchor title tags
  • Headings (h1, h2, h3, and h4 tags)
  • Content text
  • Bolded and italicized text
  • Internal links in content
  • Image filename, image alt tag, and image title tag
  • Video filename, video title
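To make this concrete, here is a minimal sketch of a few of these areas on a hypothetical NFL jerseys page (the filenames and wording are invented for the example, and the keyword is used naturally rather than stuffed):

<title>NFL Jerseys | Paul's NFL Jersey Store</title>
<meta name="description" content="Shop NFL jerseys online in men's, women's and kids' sizes."/>
<h1>NFL Jerseys</h1>
<p>Browse our range of <strong>NFL jerseys</strong>, including <a href="/womens-nfl-jerseys">women's NFL jerseys</a> and kids' sizes.</p>
<img src="nfl-jerseys-display.jpg" alt="NFL jerseys on display" title="NFL jerseys"/>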

 

How to get more people clicking on your listings in Google


Meta tags have been widely misunderstood as mysterious pieces of code SEO professionals mess around with, and the secret to attaining top rankings. This couldn't be further from the truth.

The function of meta tags is really quite simple. Meta tags are bits of code on your site controlling how your site appears in Google.

 

If you don't fill out your meta tags, Google will automatically use text from your site to create your search listing. This is exactly what you don't want Google to do; otherwise, your listing can end up looking like gibberish! Fill out these tags correctly, and you can increase the number of people clicking through to your site from the search engine results.

 

Below is an example of the meta tag code.

<title>Paul’s NFL Jerseys</title>

<meta name="description" content="Buy NFL jerseys online. A wide range of colors and sizes."/>

<meta name="robots" content="noodp, noydir"/>

Below is an example of how a page with the above meta tags should appear as a search result in Google:

Paul's NFL Jerseys

Buy NFL jerseys online. A wide range of colors and sizes. http://www.yoursite.com/

Pretty simple, huh?

 

The title tag has a character limit of roughly 70 characters in Google. Use any more than 70 characters and it is likely Google will truncate your title tag in the search engine results.

 

The meta description tag is generally cut off at roughly 160 characters in Google's results (Google has at times experimented with longer snippets of around 300 characters). Just like the title tag, Google will shorten your listing if the description runs over this limit.

 

The last tag, the meta robots tag, tells Google you want to control how your listing appears in the search results. It's good to include this: while unlikely, it's possible Google could otherwise ignore your title and description and instead use those listed in directories such as the Open Directory Project and the Yahoo Directory, which is what the noodp and noydir values in the example above prevent.

 

To change these tags on your site you have three options:

 

1. Use the software your site is built on. Most content management systems have the option to change these tags. If it doesn't, you may need to install an SEO plugin to change these tags.

 

2. Ask your web designer or web developer to manually change your meta tags for you.

 

3. If you are a tech-savvy person and are familiar with HTML, you can change these tags in the code yourself.

 

Site load speed—Google magic dust.

How fast (or slow) your site loads is a strong factor Google takes into account when deciding how it should rank your pages in the search results.

 

Google’s former head of web spam, Matt Cutts, publicly admitted fast load speed is a positive ranking factor.

 

If your site is as slow as a dead snail, then it is likely your site is not living up to its potential in the search engines. If your site load time is average, improving the load speed is an opportunity for an easy SEO boost.

 

Not only is load speed a contributing factor to achieving top rankings in Google, but extensive industry reports have also shown that for each second shaved off a site's load time, there is an average increase of 7% in the site's conversion rate. In other words, the faster your site loads, the more chance you have of people completing a sale or filling out an inquiry form. Clearly, this is not an aspect of your site to be overlooked.

 

Fortunately, there are a handful of tools that make it easy to improve your load speed.

 

1. Google Page Speed Insights https://developers.google.com/speed/pagespeed/insights Google's great free tool, Page Speed Insights, will give you a page load score out of 100. You can see how well your load speed compares to other sites. You can also see how well your site loads on mobile and desktop. Scores closer to 100 are near perfect.

 

After running a test on your site, the tool will give you a list of high priority, medium priority and low priority areas for improvement. You can forward these on to your developer to speed up your site, or if you are a bit of a tech-head, you can have a crack at fixing these up yourself.

 

Test My Site - Think With Google: test your mobile website speed and performance


Around late June 2017 Google updated their mobile load speed testing tool, Test My Site, to include benchmarking reports against industry competitors. This tool is both easy-to-use and indispensable for finding easy-win load speed improvements for mobile users—and handy for seeing how your website performs against competitors.

 

You might be shocked the first time you use this tool: many site owners discover they are losing 30%-50% of their traffic due to poor load times on 3G mobile connections. Not a great outlook.

 

Fortunately, the handy tool provides free reports and actionable recommendations on how to supercharge your load speed, with a strong focus on improvements for mobile users. If you follow the recommendations and get your site performing better than competitors, you can make out like a bandit, with load speed being a strong ranking factor driving the search results.

 

Pingdom Tools – Website Speed Test http://tools.pingdom.com/


Pingdom Tools' Website Speed Test is the cream of the crop when it comes to load speed tools, providing detailed breakdowns of the files and resources slowing your site down, the file sizes of individual files, server load times, and much more. It goes into much greater depth than the other tools, though it is probably best suited to a web developer or someone with a basic level of experience building websites.

 

After the test is completed, if you scroll down you will see a list of the files each visitor has to download when they visit your site. Large images are easy targets for load speed improvements. If you have any images over 200kb, these can usually be compressed in Photoshop and shrunk down to a fraction of the size without any noticeable quality loss.

 

Take a note of any large files, send them to your web developer or web designer, and ask them to compress the files to a smaller file size.

 

The usual suspects—sitemaps.xml and robots.txt

 

Sitemaps.xml

Search engines automatically look for a special file on each site called the sitemaps.xml file. Having this file on your site is a must for making it easy for search engines to discover pages on your site. Sitemaps are essentially a giant map of all of the pages on your site. Fortunately, creating this file and getting it on to your site is a straightforward process.

 

Most CMS systems automatically generate a sitemap file; this includes systems like WordPress, Magento, and Shopify. If this is not the case for your site, you may need to install a plugin or use the free XML Sitemaps Generator tool, which will automatically create a sitemaps.xml file for you.

 

XML Sitemaps Generator


 

Next, ask your web developer or web designer to upload it into the main directory of your site, or do it yourself if you have FTP access. Once uploaded, the file should be publicly accessible at an address like the example below: http://www.yoursite.com/sitemaps.xml
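If you're curious what the file itself contains, below is a minimal sketch of a sitemaps.xml file listing two hypothetical pages; your CMS or the generator tool will normally build the real thing for you.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yoursite.com/</loc>
  </url>
  <url>
    <loc>http://www.yoursite.com/socket-wrenches</loc>
  </url>
</urlset>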

 

Once you have done this, you should submit your sitemap to the Google Search Console account for your site.

If you do not have a Google Search Console account, the article by Google below gives simple instructions for web developers or web designers to set this up.

Add and verify a site to Google Search Console http://support.google.com/webmasters/bin/answer.py?hl=en&answer=34592

Log in to your account and click on your site. Under “site configuration” click “sitemaps”, and submit your sitemap.

 

Robots.txt

Another must-have for every site is a robots.txt file. This should sit in the same place as your sitemaps.xml file. The address to this file should look the same as the example below: http://www.yoursite.com/robots.txt

 

The robots.txt file is a simple file that lets you tell search engines which areas of your site you don’t want listed in the search results. There is no real ranking boost from having a robots.txt file on your site, but it is essential to check that you don’t have one blocking areas of your site you want search engines to find.

The robots.txt file is just a plain text document, its contents should look something like below:

 

# robots.txt good example

  • User-agent: *
  • Disallow: /admin
  • Disallow: /logs

 

If you want to tell search engines not to crawl your site at all, your file would look like the next example. If you do not want your entire site blocked, you must make sure it does not look like the example below. It is always a good idea to double-check it is not set up this way, just to be safe.

  • # robots.txt - blocking the entire site
  • User-agent: *
  • Disallow: /

 

The lone forward slash in this example tells search engines not to visit anything under the root directory, in other words, the entire site.

 

To create your robots.txt file, simply create a plain text document with Notepad if you are on Windows, or TextEdit if you are on Mac OS. Make sure the file is saved as a plain text document, and use the “robots.txt good example” as an indication of how it should look. Take care to list any directories you do not want search engines to visit, such as internal folders for staff, admin areas, CMS back-end areas, and so on.
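As a rough sketch, a robots.txt file for a site with a hypothetical staff folder and CMS admin area might look like the example below. You can also add a Sitemap line pointing to your sitemaps.xml file, which helps search engines find it (the folder names here are invented for the example):

  • User-agent: *
  • Disallow: /staff
  • Disallow: /cms-admin
  • Sitemap: http://www.yoursite.com/sitemaps.xml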

 

If there aren’t any areas you would like to block, you can skip your robots.txt file altogether, but just double check you don’t have one blocking important areas of the site like the above example.

 

Duplicate content—canonical tags and other fun.

In later blogs, I will describe how Google Panda penalizes sites with duplicate content. Unfortunately, many site content management systems will sometimes automatically create multiple versions of one page.

 

For example, let’s say your site has a product page on socket wrenches, but because of the system your site is built on, the exact same page can be accessed from multiple URLs from different areas of your site:

  • http://www.yoursite.com/products.aspx?=23213
  • http://www.yoursite.com/socket-wrenches
  • http://www.yoursite.com/tool-kits/socket-wrenches

 

In the search engine’s eyes, this is confusing as hell and multiple versions of the page are considered duplicate content. To account for this, you should always ensure a special tag is placed on every page in your site, called the canonical tag.

 

The canonical tag indicates the original version of a web page to search engines. By putting the URL of the page you consider to be the “true” version into the tag, you can indicate which page you want listed in the search results.

 

Choose the URL that is the most straightforward for users; this should usually be the URL that reads like plain English.

Using the earlier socket wrenches example, adding the tag below would make Google more likely to display the best version of the page in the search engine results:

<link rel="canonical" href="http://www.yoursite.com/socket-wrenches"/>

As a general rule, include this tag on every page of your site, shortly before the </head> tag in the code.
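As a minimal sketch, every version of the socket wrenches page, including the duplicate URLs, would carry the same tag in its head section, something like this:

<head>
  <title>Socket Wrenches</title>
  <!-- The same canonical URL appears on every duplicate version of the page -->
  <link rel="canonical" href="http://www.yoursite.com/socket-wrenches"/>
</head>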

 

Usability—the new SEO explained


Mobiles and tablets have overtaken desktops in the vicious battle for Internet market share, making up 56% of all traffic in 2017. To keep the experience good for all users, Google increasingly gives an advantage to sites providing a good experience on every device. As a result, usability has increased in importance in the SEO industry, as many SEO pundits have found you can gain an advantage simply by making your site easy to use.

 

For example, let’s say a mobile user is searching for late night pizza delivery in Los Angeles. One local business has a site with a large number of backlinks but no special support for mobile users: it’s difficult to navigate because it doesn’t automatically fit the screen, and the menu text is small and hard to use on a touchscreen.

 

Another competing local business has far fewer backlinks, but good support for mobile users. Its design fits the screen perfectly and has special navigation designed for mobile users, making it easy to use. In many cases, the second site will now rank higher than the first for mobile users. This is just one example of how usability can have a significant impact on your rankings.

 

While a term like usability can understandably seem a little vague, let’s look at practical steps to improve your usability and the SEO strength of your site.

 

Make your site accessible for all devices

Make your site accessible and easy to use for all users: desktop, mobile, and tablet. The simple way to do this is to make sure your site is responsive, meaning it automatically resizes across all devices and has mobile-friendly navigation. Mobile support is covered in more detail in the Mobile SEO Update section of the Google Algorithm Updates chapter later in this blog, but you can quickly enter your site into the tool below to see if Google registers your site as mobile friendly.
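One quick item to check with your web developer is that a viewport meta tag sits in the head of your pages; responsive designs rely on it so the layout scales properly on phone screens. A typical example looks like the sketch below (your developer may use slightly different settings):

<!-- Tells mobile browsers to match the page width to the device screen -->
<meta name="viewport" content="width=device-width, initial-scale=1"/>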

 

Mobile-Friendly Test - Google Search Console


 

Increase your content quality

Gone are the days of hiring a bunch of cheap writers overseas to bulk out the content on your site. Your content needs to be proofread and edited, and the more “sticky” you make it, the better results you will get. If you provide compelling content, users will spend more time on your site and are less likely to bounce back to the search results. They will also be much more likely to share your content. Google will see this and give your rankings a boost.

 

Use clean code in your site

There’s a surprisingly high number of sites with dodgy code that is difficult for both search engines and Internet browsers to read. If there are HTML errors in your site, meaning it hasn’t been coded according to industry best practices, it’s possible your design will break when viewed in different browsers or, even worse, confuse search engines when they come along and look at your site. Run your site through the tool below and ask your web developer to fix any errors.

Web standards validator https://validator.w3.org/
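To give you an idea of what the validator flags, here is a small made-up example of two common errors, incorrectly nested tags and an image missing its alt attribute, followed by the corrected markup:

<!-- Before: closing tags nested in the wrong order, and no alt attribute on the image -->
<p><strong><em>Socket wrenches on sale</strong></em></p>
<img src="sale-banner.jpg">

<!-- After: valid markup -->
<p><strong><em>Socket wrenches on sale</em></strong></p>
<img src="sale-banner.jpg" alt="Socket wrenches sale banner">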

 

Take it easy on the popups and advertisements

Sites with spammy and aggressive ads are often ranked poorly in the search results. SEO gurus have reached no consensus on the number of ads that leads to a penalty from Google, so use your common sense. Ensure advertisements don’t overshadow your content or occupy the majority of the screen real estate.

 

Improve the overall “operability” of your site


 

Does your site have slow web hosting or a bunch of broken links and images? Simple technical oversights like these contribute to a poor user experience.

 

Make sure your site is with a reliable web hosting company and doesn’t go down during peak traffic. Even better, make sure your site is hosted on a server in your local city or region; this will make it faster for local users.

 

Next up, chase up any 404 errors with your web developer. 404 errors occur when users click on links on your site and are sent to a missing page. They contribute to a poor user experience in Google’s eyes. Fortunately, these errors are easily fixed.

 

You can find 404 errors on your site by logging into your Google Search Console account, clicking on your site, then clicking on “Crawl” and “Crawl Errors”. Here you will find a list of 404 errors. If you click on an error and then click “Linked From”, you can find the pages with the broken links. Fix these yourself, or discuss them with your web developer.

Google Search Console https://www.google.com/webmasters/tools/
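How you fix a 404 depends on your setup: you can correct the broken link itself, or permanently redirect the dead URL to the right page. As a rough sketch, on an Apache server a 301 redirect can be added to the .htaccess file like the line below (the page names are hypothetical); on most CMS platforms a redirect plugin does the same job.

# Permanently redirect the dead URL to the correct page
Redirect 301 /old-socket-wrenches http://www.yoursite.com/socket-wrenches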

If you want external tools to speed up improving your site’s usability, I have found these two resources helpful:

 

BrowserStack - Free to try, plans start at $29 per month. https://www.browserstack.com

BrowserStack allows you to test your site on over 700 different browsers and devices. You can preview how your site works on tablets, mobile devices, and all the major browsers such as Chrome, Firefox, Safari, Internet Explorer, and so on. It’s helpful for making sure your site displays correctly across many different devices.

 

Try My UI - Free to try, additional test results start at $35. http://www.trymyui.com

Try My UI provides videos, audio narration, surveys of users going through your site, and reports on any difficulties they uncover. Usability tests are good for larger projects requiring objective feedback from normal users. The first test result is free, making Try My UI a good usability test provider to start with.

 

Google's Search Quality Guidelines—And How to Use Them to Your Advantage


Search quality is an increasingly popular topic in the blogosphere because it can have a massive impact on rankings. Why is this so? Making sure users are sent to high-quality and trustworthy results is critical for Google to safeguard its position as the provider of the best all-around search experience.

 

While this sounds a little vague, you can use Google's search quality guidelines to your advantage and get an edge over competitors. Did you know that Google publicly released their “Search Quality Evaluator Guidelines”, updated on July 27th, 2017? If you didn't, well, now you do.

 

The document is 160 pages long, so presuming you don't consider a dense whitepaper leisurely reading, I'll list the most important and actionable takeaways so you can use them to your advantage.

 

Google Search Quality Evaluator Guidelines - Most Important Factors

In Google's whitepaper, they list out their holy-trio of most important factors when it comes to search quality. And here it is…

 

EAT... That's right, EAT... Expertise, Authority, and Trust (EAT). Acronym choice aside, to establish quality, Google is looking at the expertise, authority, and trustworthiness of the page and site. This includes things like content quality, how aggressive the ads on your site are, the reputation of the site and its authors, publicly listed information about site ownership, contact details, and several other factors.

 

Now that we know what's important from a top-level perspective, let's zoom in on actionable and practical takeaways straight out of the document that will affect the average Joe trying to nudge his way up the search results.

 

Search Quality Evaluator Guidelines—Key Takeaways


1. Real name, company name, and contact information listed on an about page.

If you don't have this information listed on your website, why should Google, or anyone else for that matter, trust you? Better make sure you include it.

 

2. Excessive and unnatural internal structural links across sidebars and footers. If you've got 150 links in your footer, it's obvious to Google you're trying to do something sneaky, so be conservative with footer and sidebar links. Keep them restricted to the most important pages on your site or what's useful for your users.

 

3. Over-monetization of content. Specifically, if you are disguising advertisements as main content, or your advertisements occupy more real estate than the main content, then one of Google's search evaluators will probably flag your site as spam. Take a common-sense approach with your ads and don't overdo it!

 

4. List editors & contributors. Are you publishing a bunch of articles under pseudonyms or generic usernames? Listing editors and contributors, i.e. real people, is more trustworthy and will increase the perceived quality of your page.

 

5. Provide sources. Publishing generic articles en masse without any reputable sources? You'll get a better quality assessment, and a higher ranking, if you list sources for your articles. Listing sources shows the writer has performed due diligence in their research and increases the credibility of the page.

 

6. Financial transaction pages. All you drop-shippers and e-commerce retailers out there, stand up and take note: pages associated with financial transactions (shopping cart, checkout, product pages, etc.) must link to policy pages for refunds, returns, delivery information, and the terms and conditions of your site. Think about it from the user's perspective: if you are the average Joe shopper thinking about buying something and the page doesn't list any of this information, how safe would you feel checking out?

 

7. Pages offering financial information must be of the highest quality. Google are stricter with these types of pages, as it falls into their “Your Money or Your Life” category—meaning it could affect the financial well-being of the user. If you're publishing this kind of content, make sure you're doing everything you can to provide high-quality, detailed articles, citing sources, fully disclosing financial relationships, and making it clear what author or company is behind the content.

 

That sums up the most important takeaways from the Google Search Quality Evaluator Guidelines. If you haven't got these covered on your site, work ‘em in and you'll get a leg up over your competitors; ignore them, and your rankings could suffer. And if you really, really want to sit down and read through the 160-page whitepaper on page-quality assessment, here it is for your enjoyment.

 

Search Quality Evaluator Guidelines - July 27th, 2017 https://static.googleusercontent.com/media/www.google.com/en//inside