Google Updates (The Complete Guide 2019)

 


Google Updates and How to survive them

Fast forward 15 years, and ranking in Google has become extremely competitive and considerably more complex. Simply put, everybody wants to be on Google. Google is fighting to keep its search engine relevant and must constantly evolve to continue delivering relevant results to users. This guide covers Google's major updates in 2019.

 

This hasn't been without its challenges. Just as with keyword stuffing, webmasters eventually clued onto another way of gaming the system: acquiring as much keyword-rich anchor text pointing at a page as possible.

 

This created another loophole, and not just for spammers. In many cases, well-meaning marketers and business owners used the same tactic to achieve high rankings in the search results.

 

Along came a new Google update in 2019: Google punished sites with suspicious numbers of links using the same anchor text pointing to a page, in some cases by completely delisting them from the search results.

 

Many businesses that relied on search engine traffic lost all of their sales literally overnight, just because Google believed that sites with hundreds of links containing just one phrase hadn't acquired those links naturally. Google took this as a strong indicator that the site owner could be gaming the system.

 

If you find these changes alarming, don't worry. How to recover from these changes, or how to avoid being penalized by new updates, is covered in later blogs. The short history of Google's major updates teaches two powerful lessons for achieving top rankings in Google.

1. If you want to stay at the top of Google, never rely on one tactic.

 

2. Always ensure your search engine strategies rely on SEO best practices.

Authority, Trust & Relevance: Three Powerful SEO Strategies Explained

 

Google’s algorithm is complex, but you don’t have to be a rocket scientist to understand how it works. In fact, it can be ridiculously simple if you remember just three principles.

 

With these three principles, you can determine why one site ranks higher than another, or discover what you have to do to push your site higher than a competitor. 

 

These three principles summarize what Google is focusing on in their algorithm now, and are the most powerful strategies SEO professionals are using to their advantage to gain rankings.

 

The three key principles are Trust, Authority and Relevance.

 

Trust

Trust has been at the very core of Google's major changes and updates over the past several years. Google wants to keep poor-quality, untrustworthy sites out of the search results, and keep high-quality, legit sites at the top.

 

If your site has high-quality content and backlinks from reputable sources, your site is more likely to be considered a trustworthy source, and more likely to rank higher in the search results.

 

Authority


Previously the most popular SEO strategy, authority is still powerful, but now best used in tandem with the other two principles. Authority is your site’s overall strength in your market.

 

Authority is almost a numbers game. For example, if your site has one thousand social media followers and backlinks, and your competitors have only fifty, you're probably going to rank higher.

 

Relevance

Google looks at the contextual relevance of a site and rewards relevant sites with higher rankings. This levels the playing field a bit and might explain why a niche site or local business can often rank higher than a Wikipedia article.

 

You can use this to your advantage by bulking out your site with relevant content and using the on-page SEO techniques described in later blogs to nudge Google into seeing that your site is relevant to your market.

 

You can rank higher with fewer links by focusing on building links from relevant sites. Increasing relevance like this is a powerful strategy and can lead to high rankings in competitive areas.

 


How Google Ranks Sites Now: Google's Top 10 Ranking Factors Revealed

You may have wondered if you can find out the exact factors Google uses in their algorithm.

 

Fortunately, there are a handful of industry leaders who have figured it out, and regularly publish their findings on the Internet. With these publications, you can get a working knowledge of what factors Google uses to rank sites.

 

These surveys are typically updated every second year, but these factors don’t change often, so you can use them to your advantage by knowing which areas to focus on.

 

Here’s a short list of some of the strongest factors found in sites ranking in the top 10 search results, in the most recent study by Search Metrics:

  •  Overall content relevance.
  •  Click-through rate.
  •  Time on site.
  •  HTTPS (security certificate installed on the site).
  •  Font size in the main content area (presumably because larger fonts are more readable and lead to higher engagement).
  •  Number of images.
  •  Facebook total activity.
  •  Pinterest total activity.
  •  Tweets.
  •  Google +1 activity.
  •  Number of backlinks.

 

If your competitors have more of the above factors working in their favor than you do, then it's likely they will rank higher than you. If you have more than your competitors, then it's likely you will rank higher.

 

The above factors are from the Search Metrics Google Ranking Factors study released in 2019. Regrettably, after releasing the study, Search Metrics said they would stop publishing their search rankings whitepapers in the future.

 

But you can be sure content relevance, user engagement, social activity, links, site security (HTTPS), and most likely mobile support, are among current ranking factors.

 

If you want a deeper look into the study, you can browse the full report by visiting the link below. I cover more recent changes in the Google algorithm updates section of this blog.

 

Search Metrics


Google Ranking Factors US Ranking Factors Study: Download the Whitepaper and Infographic here

Another well-known authority in the SEO industry, Moz (previously SEOmoz), releases a ranking factors study every few years. Moz also publishes this information for free, and it is available at the following page:

Moz Ranking Factors Survey: Search Engine Ranking Factors 2015

 

Google’s Battle for Survival


Over the years, Google has had to change and adapt to survive. It has been in a constant battle with webmasters who are eager to manipulate its SERPs.

 

Since the algorithm is based on real factors and properties of a website, site owners have been trying to identify those factors and use them to their advantage. Whenever webmasters find a competitive advantage (sometimes called a loophole), Google tries to quickly plug it.

 

Here's a typical example of this "cat and mouse game" that has been going on between Google and webmasters over the years:

 

Over a decade ago, webmasters found out that Google used the Meta Keyword tag as a major ranking factor. What did they do? They began stuffing this tag with keywords in an attempt to rank well for those terms. What did Google do? They started to ignore the Meta Keyword tag, effectively closing that loophole.

 

I would like to point out that I do believe Google still looks at the Meta Keyword tag, but not as you might think. I think the company uses it to help identify spammers. Any page that has a Meta keyword tag stuffed with dozens, or even hundreds, of keywords, is clearly doing something underhand or at least trying to.

 

Here is another example of a loophole being closed.

 

A few years ago, webmasters found out that by using a domain name that essentially consisted of nothing more than the keyword phrase they wanted to rank for, the site would get a massive ranking boost in the SERPs.

 

This type of domain is called an Exact Match Domain (EMD). We’ll look at EMDs later.

 

Anyway, in September 2012, Google released the “EMD Update”, which removed that unfair ranking advantage. Hundreds of thousands of EMD sites dropped out of the Google top 10 overnight, putting an end to a large industry that had profited from buying and selling EMDs.

 

Since an EMD is usually a commercial search phrase, most inbound links to these EMD sites also contained that exact commercial term. This over-optimization for a commercial phrase was bad news for the site owners. This was because Google’s Penguin was on the lookout for this exact type of over-optimization.

 

Today, EMD sites are rarely seen in the top 10 for any mildly competitive search terms.

 

The battle between spammer and search engine continues to rage to this day. Spammers find loopholes, and Google plugs them.

 

In September 2011, Google's Eric Schmidt said that they had tested over 13,000 possible algorithm updates in 2010, approving just 516 of them. Although 516 may sound like a lot (it’s more than one update a day), it certainly wasn’t an unusual year.

 

Panda, Penguin, and Other Major Updates

We could go back to the very beginning to see all the changes and updates that Google has made, but I want to focus on those changes that have happened since 2011. These are the ones that have changed the way we do SEO today.

 

Google Changes in 2011

This was a huge year in SEO terms, shocking many webmasters. In fact, 2011 was the year that wiped a lot of online businesses off the face of the web. Most deserved to go, but quite a few innocent victims got caught up in the shakeup too, never to recover.

 

At the beginning of the year, Google tried to hit scraper sites (sites that used bots to steal and post content from other sites). This was all about trying to attribute ownership of content back to the original owner and thus penalize the thieves.

 

On February 23, the Panda update launched in the USA. Panda was essentially targeting low-quality content and link farms. Link farms were basically collections of low-quality blogs that were set up to link out to other sites.

 

The term "thin content" became popular during this time; describing pages that really didn’t say much, and were there purely to host adverts. Panda was all about squashing thin content, and a lot of sites took a hit too.

 

In March of the same year, Google introduced the +1 button. This was probably to be expected, bearing in mind that Google had confirmed they used social signals in their ranking algorithm. What better signals to monitor than their own?

 

In April 2011, Panda 2.0 was unleashed, expanding its reach to all countries of the world, though still only targeting pages in English. Even more signals were included in Panda 2.0, probably including user feedback via the Google Chrome web browser, where users had the option to “block” pages in the SERPs that they didn’t like.

 

As if these two Panda releases were not enough, Google went on to release Panda 2.1, 2.2, 2.3, 2.4, 2.5, and 3.1, all in 2011. Note that Panda 3.0 is missing.

 

There was an update between 2.5 and 3.1, but it is commonly referred to as the Panda “Flux”. Each new update built on the one previous, helping to eliminate still more low-quality content from the SERPs.

 

With each new release of Panda, webmasters worried, panicked, and complained on forums and social media. A lot of websites were penalized, though not all deserved to be; unavoidable "collateral damage", as Google casually called it.

 

Another change that angered webmasters was “query encryption”, introduced in October 2011. Google said they were doing this for privacy reasons, but webmasters were suspicious of their motives.

 

Prior to this query encryption, whenever someone searched for something on Google, the search term they typed in was passed on to the site they clicked through to. That meant webmasters could see, using any web traffic analysis tool, which search terms visitors were typing to find their pages.

 

Query encryption changed all of this. Anyone who was logged into their Google account at the time they performed a search from Google would have their search query encrypted.

 

This prevented their search terms from being passed over to the websites they visited. The result of this was that webmasters increasingly had no idea which terms people were using to find their sites or site pages.

 

In November 2011, there was a freshness update. This was supposed to reward websites that provided time-sensitive information (like news sites) whenever visitors searched for time-sensitive news and events. As you can see, there was a lot going on in 2011, and it didn't stop there.

 

Google Changes in 2012


In October of that year, Google announced that there had been 65 algorithm changes in the previous two months.

On October 5, there was a major update to Penguin, probably expanding its influence to non-English content.

 

Also in October, Google announced the Disavow tool. This was Google’s answer to the “unnatural links” problem. They completely shifted the responsibility of unnatural links onto the webmaster by giving them a tool to disavow or deny any responsibility or support for those links.

 

If there were any external links from bad neighborhoods pointing to your site, and you could not get them removed, you could now disavow those links, effectively rendering them harmless.

 

Finally, in October 2012, Google released an update to their “Page Layout” algorithm, and in December, they updated the Knowledge Graph to include non-English queries in a number of the more popular languages. That brought the Google updates for the year to a close.

 


 

Google Changes in 2014

In 2014, Google refreshed their “Page Layout” update.

Anyone who builds a website will want it to rank well in Google.

Without having a high-profile presence in the SERPs, a site won’t get much traffic, if any at all. If that happens, webmasters WILL try to boost their rankings, and the traditional way is by working on on-page SEO and inbound links.

 

However, over the last couple of years, in particular, Google has introduced measures that aim to penalize any webmaster who is actively trying to boost rankings via traditional SEO methods.

 

Google wants the best pages to rank at the top of the SERPs for obvious reasons. So Google rewards the pages that deserve to be at the top, rather than pages that webmasters force to the top using SEO (much of which Google collectively calls “webspam”).

 

What this means to you is that you have to deliver the absolute best quality content you can. You need to create content that deserves to be at the top of the SERPs. Content is not only King now; it has always been King.

 

The difference now is that Google's algorithms are so much better at determining what constitutes great content and what doesn't. In other words, it's no longer as easy as it once was to take shortcuts and use underhanded tactics to trick the algorithms.

 

Fortunately for you, Google offers a lot of advice on how to create the type of content they want to show up in their SERPs. In fact, they have set up a web page called “Webmaster Guidelines”.

 

This tells you exactly what they want, and just as importantly, what they don’t want. We’ll look at this shortly, but first, let’s see how we used to create content in the past.

 

Google Algorithm Changes in 2015


Where Do All of These Updates Leave Us?

Today, optimizing for specific keyword phrases has not just become more difficult because of Panda and Penguin; it has also become less effective at driving traffic if those phrases don't match the intent of the searcher typing them into Google.

 

Out of all the aforementioned Google updates, the one that had a big influence on my own SEO was Penguin. Google sent their Penguin into places that no SEO professional ever imagined they would dare to send it. Many will remember April 24, 2012, as the day the world of SEO changed forever.

 

It was also the day that inspired me to release the first edition of this blog, entitled “SEO 2012 & Beyond – SEO will never be the same again”.

 

Google created such a major shift in the way they analyzed web pages that I now think in terms of Pre-Penguin and Post-Penguin SEO, and that will likely come across in this blog.

 

SEO Today in 2019


Today, SEO is very different from even a couple of years ago. Google has put a number of measures in place to combat the manipulative actions of webmasters.

 

Ask a bunch of webmasters to define the term SEO and I bet you’ll get a lot of different answers. Definitions will vary depending on the type of person you ask, and even when you ask them.

 

SEO before Google introduced the Panda update was easy. After the Panda update, it was still relatively easy, but you needed to make sure your content was good. After Google released the Penguin update, SEO suddenly became a whole lot harder.

 

In 2019, phrases you’ll hear being used by SEO experts will include:

  • On-page SEO
  • Off-page SEO
  • Link building
  • White hat SEO
  • Grey hat SEO
  • Black hat SEO

 

White Hat SEO – approved strategies for getting your page to rank well. Google offers guidelines to webmasters which spell out approved SEO strategies.

 

Black Hat SEO – these are the “loopholes” that Google are actively seeking out and penalizing for. They include a whole range of strategies from on-page keyword stuffing to backlink blasts using software to generate tens of thousands of backlinks to a webpage.

 

Grey Hat SEO – strategies that sit somewhere between the two. The closer you stay to white hat, the safer you are with Google; Google’s tolerance level is somewhere in the grey hat area. Staying somewhere in the middle may give you better returns for your time, but you do risk a penalty eventually. Before Panda and Penguin, most webmasters knew where these lines were drawn and took their risks accordingly.

 

When Google introduced Panda, the only real change was that webmasters needed to make sure their website content was unique, interesting to visitors, and added something that no other webpage on the same topic provided. No small task, but to beat Panda, the goal was to create excellent content.

 

When Google introduced Penguin, they completely changed the face of SEO, probably forever, or at least for as long as Google continues to be the dominant search engine (which probably amounts to the same thing).

 

We’ve still got the safe “White Hat” SEO and unsafe “Black Hat”. What’s really changed is the “Grey Hat” SEO. Whereas previously it was reasonably safe, it’s now become a lot riskier.

 

This tolerance line can move from left to right at any time, depending on how Google tweak their algorithm. If they want to come down hard on “spammers”, they’ll move the line more to the left.

 

If too many good sites get taken out as “collateral damage”, as they call it, then they may move the tolerance line over to the right a bit (see the section later on “Trust Vs No-Trust”).

 

Generally though, for all new sites and most others, the tolerance level is very close to the White Hat boundary, and this is what you need to be mindful of. Webmasters who use techniques which are further to the right of this tolerance line risk losing their rankings.

 

Although these diagrams are good to work from, they do not display the whole truth. Let’s just consider how trust changes the equation.

 

Trust Vs No-trust

The Google Tolerance Line will slide left or right depending on the site that is being ranked. For a site that has proven its worth, meaning Google trusts it a lot, we might see the diagram look like this:

 

A "Trusted" Site

Yet for a new site or one that has no track record to speak of, the diagram will probably look a lot more like this.

 

A "Non-trusted" Site


The only difference here is in the location of the “tolerance line”.

 

In other words, Google is a lot more tolerant of sites which have built up authority and trust than they are of new sites, or of sites which have not been able to attain a decent level of authority or trust.

 

A high authority site with lots of trust can withstand a certain amount of spammy SEO without incurring a penalty (see later when we discuss “negative SEO”). In other words, the more authority the site has, the more it can endure, or get away with.

 

A new site, on the other hand, would be quickly penalized for even a small amount of spammy SEO.

 

Webmasters Living in the Past

A lot of webmasters (or SEO companies vying for your business) may disagree with my take on modern-day SEO, and that’s fine. The more people there are who just don’t get it, the less competition there is for me and my clients.

 

I am sure you can still find people who will say this is all rubbish and that they can get your pages ranked for your search terms (depending on the severity of the competition, of course) by heavily back-linking the page using keyword-rich anchor text.

 

The process they’ll describe is eerily similar to the process I told you about in the section “How we used to rank pages”, i.e. before 2010. It goes something like this:

 

  1. Keyword research to find high-demand, low-competition phrases.
  2. Create a page that is optimized for that keyword phrase.
  3. Get lots of backlinks using that keyword phrase as anchor text (the clickable text in a hyperlink).
  4. Watch your page rise up the SERPs.

 

If you don’t care about your business, then follow that strategy or hire someone to do it for you. You might get short-term gains, but you’ll run the risk of losing all your rankings further down the line when Google catches up with you (and catch up it will… probably sooner rather than later).

 

Losing all of your rankings on Google does not require a human review, though Google does use human reviewers on occasion. Getting your site penalized is much quicker and requires far less hassle than human intervention.

 

The process Google created for this is far more “automated” since the introduction of Panda and Penguin. Go over the threshold levels of what is acceptable, and the penalty is algorithmically determined and applied.

 

The good news is that algorithmic penalties can just as easily be lifted by removing the offending SEO and cleaning up your site. However, if that offending SEO includes low-quality backlinks to your site (especially to the homepage), things become a little trickier.

 

Remember the SEO expert you hired that threw tens of thousands of backlinks at your site using his secret software? How can you get those backlinks removed? In most cases, it's not easy, although Google does provide a “Disavow” tool that can help in a lot of instances.

 

I’ll tell you more about that later in the blog. In extreme circumstances, moving the site to a brand new domain and starting afresh may be the only option.

 

In the rest of this blog, I want to focus on what you need to do to help your pages rank better. I will be looking mainly at white-hat strategies, though I will venture a little into grey hat SEO as well. I won’t be covering black-hat SEO at all though. This is because it's just not a long-term strategy.

 

Remember, with the flick of a switch, Google can, and sometimes does, move the goalposts, leaving you out in the cold. Is it worth risking your long-term business plans for short-term gains? OK, let’s now get started with the four pillars of post-Penguin SEO.

 

The Four Pillars of Post-Penguin SEO


I have been doing SEO for over 10 years now and have always concentrated on long-term strategies. That’s not to say I haven’t dabbled in black hat SEO because I have, a little.

 

Over the years, I have done a lot of experiments on all kinds of ranking factors. However, and without exception, all of the sites I promoted with black hat SEO have been penalized; every single one of them.

 

In this blog, I don’t want to talk about the murkier SEO strategies that will eventually cause you problems, so I’ll concentrate on the safer techniques of white hat SEO.

I have divided SEO into four main pillars. These are:

  1. Quality content
  2. Site organization
  3. Authority
  4. What’s in it for the visitor?

 

These are the four areas where you need to concentrate your efforts, so let’s now have a look at each of them in turn.

 

Quality Content


Before we begin, let me just say that a lot of SEO “experts” have disagreed with me on the need for quality content as a major ranking factor.

 

They will often cite exceptions in the search engines where pages rank well for competitive terms without much on-page content, or with content of very little value. The fact that these exceptions rarely last more than a week or two eludes them.

 

Remember, bounce rates and time spent on a site are indicators to Google of whether a page gives a visitor what they want, or not. Poor content cannot hide in the Google top 10. Search engine users essentially vote out content that falls short on quality. They do this by their actions, or inactions, as the case may be.

 

They vote by the length of time they stay on any particular site or page, and whether or not they visit any other pages while on that site (the latter reflecting overall site reputation). Additionally, Google also looks at what they do once they hit the back button and return to the SERPs. It’s quite an eye-opener.

 

Make <title> and ALT attributes descriptive


The title tag of your page is one of the most important areas in terms of SEO. Try to get your most important keyword(s) in there, but do not stuff keywords into the title.

 

The title tag has two main objectives; one is to entice the searcher to click through from the search engines, and the other is to tell Google what your page is all about.

 


 

We talked about click-through rates (CTR) earlier in the blog and looked at how important a factor they are in modern SEO. If you find a page has a low CTR (something you can check in Google webmaster tools), then tweak the title to see if you can make it more enticing to searchers.

 

You then need to monitor the CTR of the page in the coming days and weeks. Just know that better titles (ones that match the searcher’s intent) will attract more clicks in the Google SERPs than ones that don't.

 

NOTE: A page listed in the search results also contains a short description under its title. Google sometimes uses the text from the Meta Description tag of your web page for this.

 

The description is typically a snippet of around 160 characters (though Google has at times displayed longer snippets of 300 or so characters) used to summarize a web page's content. Search engines like Google may use these text snippets in the search results as a way to let visitors know what your page is about.

 

You can do the same tweaking of the Meta Description tag as you do with the title tag to help increase CTR. However, this is less of an exact science since Google will sometimes grab text extracts from elsewhere on the page for the listing’s description.
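If you want to audit these two tags across your pages without opening each one in a browser, a small script can pull them straight out of the HTML. Here is a minimal sketch in Python; the URL is a placeholder, and it assumes the requests and beautifulsoup4 packages are installed:

# pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/"  # placeholder: swap in one of your own pages

html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# The <title> tag: descriptive, keyword-relevant, not stuffed.
title = soup.title.string.strip() if soup.title and soup.title.string else ""
print(f"Title ({len(title)} chars): {title}")

# The meta description: Google may use this as the snippet under your listing.
meta = soup.find("meta", attrs={"name": "description"})
description = meta.get("content", "").strip() if meta else ""
print(f"Meta description ({len(description)} chars): {description}")

Run something like this against your key pages and keep an eye out for titles or descriptions that are missing, duplicated, or far longer than what Google typically displays.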

 

Let's now look at ALT tags. These are there for a very specific purpose. Their primary objective is to help people who have images turned off, for example, people with impaired vision. Often they will use text-to-voice software to read pages, so your ALT tags should help them with this.

 

An ALT tag should describe the image to those users in a concise way. It is also a great place to insert a keyword phrase or synonym, but NEVER use ALT tags just as a place to stuff keywords.
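A quick way to find images that are missing descriptive ALT text is to scan a page for img tags without an alt attribute. A minimal sketch, again assuming requests and beautifulsoup4 and a placeholder URL:

import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/"  # placeholder page to check

soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

for img in soup.find_all("img"):
    alt = (img.get("alt") or "").strip()
    if not alt:
        # These images need a concise, descriptive alt attribute adding.
        print("Missing ALT text:", img.get("src"))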

 

How to stay ahead of Google’s updates

Every now and then, Google releases a significant update to their algorithm, which can have a massive impact on businesses from any industry. To hone your SEO chops and make sure your site doesn't fall into Google's bad books, it's important to stay informed of Google’s updates as they are released.

 

Fortunately, almost every time a major update is released, it is reported on by the entire SEO community and is sometimes publicly discussed and confirmed by Google staff.

 

A full history of Google’s updates would fill this entire blog, but with the resources below, you can stay abreast of new Google updates as they are rolled out.

 

This is essential knowledge for anyone practicing SEO, whether at a beginner or an advanced level. You can even keep your ear to the ground with these sources and often be forewarned of future updates.

 

Google Updates by Search Engine Round Table: Google PageRank & Algorithm Updates

Search Engine Round Table is one of the industry’s leading blogs on SEO. On the page above, you can browse all of the latest articles on Google updates by a leading authority on the topic.

 

Search Engine Journal

Actionable SEO Tips and Strategies That Work | SEJ

Search Engine Journal is another authoritative, relevant and frequently updated publication about everything SEO. An indispensable resource for keeping abreast of industry events as they happen.

 

How to find keywords for easy rankings


Now you need to find out how competitive your desired keywords are. Armed with an understanding of the competitiveness of your keywords, you can discover keywords you can realistically rank for in Google.

 

Let’s say you are a second-hand bookseller and you want to target “book store online”. It's unlikely you are going to beat Amazon and Barnes and Noble. But maybe there’s a gem hiding in your keyword list that few people are targeting, something like “antique book stores online”.

 

Test My Site by Think with Google: Test Your Mobile Website Speed and Performance


Around late June 2017 Google updated their mobile load speed testing tool, Test My Site, to include benchmarking reports against industry competitors. This tool is both easy-to-use and indispensable for finding easy-win load speed improvements for mobile users—and handy for seeing how your website performs against competitors.

 

You might be shocked the first time you use this tool; many site owners discover they are losing up to 30-50% of their traffic due to poor loading times on 3G mobile connections. Not a great outlook.

 

Fortunately, the handy tool provides free reports and actionable recommendations on how to supercharge your load speed with a strong focus on improvements for mobile users.

 

If you follow the recommendations and get your site performing better than competitors, you can make out like a bandit in the search results, with load speed being an important ranking factor.

 

Pingdom Tools – Website Speed Test http://tools.pingdom.com/


Pingdom Tools Website Speed Test is the cream of the crop when it comes to loading speed tools, providing detailed breakdowns of files and resources slowing your site down, listing file-sizes of individual files, server load times, and much more.

 

It goes into much greater depth than the other tools, though it is probably best suited to a web developer or someone with at least a basic level of experience building websites.

 

After the test is completed, if you scroll down you will see a list of files each visitor has to download each time they visit your site. Large images are easy targets for load speed improvements.

 

If you have any images over 200kb, these can usually be compressed in Photoshop and shrunk down to a fraction of the size without any noticeable quality loss.
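If you'd rather script this than open Photoshop, image compression can be automated. Here is a minimal sketch using the Pillow library (my choice of tool, not one mentioned above), which resizes and re-saves a JPEG at a lower quality setting:

# pip install Pillow
from PIL import Image

src = "large-photo.jpg"        # placeholder: a large image from your site
dst = "large-photo-small.jpg"  # compressed copy to upload in its place

img = Image.open(src)
# Cap the longest edge at 1600px so huge photos aren't served at full size.
img.thumbnail((1600, 1600))
# quality=80 with optimize=True usually cuts the file size dramatically
# with little visible difference.
img.save(dst, "JPEG", quality=80, optimize=True)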

 

Take note of any large files, send them to your web developer or web designer, and ask them to compress the files down to a smaller size.

The usual suspects: sitemaps.xml and robots.txt
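In case it helps, here is roughly what those two files do: robots.txt tells crawlers which parts of the site to skip and can point to your sitemap, while sitemap.xml lists the pages you want indexed. Below is a minimal sketch that writes barebones versions of both; the domain and page paths are placeholders, and many CMSs or SEO plugins already generate these files for you:

from datetime import date

domain = "https://www.example.com"   # placeholder domain
pages = ["/", "/about/", "/blog/"]   # placeholder pages you want indexed

# Minimal robots.txt: allow everything and point crawlers at the sitemap.
robots = f"User-agent: *\nDisallow:\n\nSitemap: {domain}/sitemap.xml\n"

# Minimal sitemap.xml: one <url> entry per page with a last-modified date.
entries = "\n".join(
    f"  <url><loc>{domain}{path}</loc><lastmod>{date.today()}</lastmod></url>"
    for path in pages
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("robots.txt", "w") as f:
    f.write(robots)
with open("sitemap.xml", "w") as f:
    f.write(sitemap)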

 

Mobile-Friendly Test - Google Search Console


 

Use clean code in your site

There’s a surprisingly high number of sites with dodgy code, which is difficult for both search engines and web browsers to read.

 

If there are HTML errors in your site, meaning it hasn’t been coded according to industry best practices, it’s possible your design will break when your site is viewed in different browsers or, even worse, confuse search engines when they come along and crawl your site.

 

Run your site through the below tool and ask your web developer to fix any errors.

Web standards validator https://validator.w3.org/
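If you'd rather check pages from a script than paste URLs into the form, the W3C checker can also return its results as JSON. A minimal sketch, assuming the Nu HTML Checker's doc and out=json parameters (check the validator's documentation for current usage):

import requests

page_to_check = "https://www.example.com/"  # placeholder page

response = requests.get(
    "https://validator.w3.org/nu/",
    params={"doc": page_to_check, "out": "json"},
    headers={"User-Agent": "site-audit-script"},
    timeout=30,
)
for message in response.json().get("messages", []):
    # Each message carries a type (error, info, etc.) and a description.
    print(message.get("type"), "-", message.get("message"))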

 

Take it easy on the popups and advertisements

Sites with spammy and aggressive ads are often ranked poorly in the search results. The SEO gurus have reached no consensus on the number of ads leading to a penalty from Google, so use your common sense.

Ensure advertisements don’t overshadow your content or occupy the majority of screen real estate.

 

Improve the overall “operability” of your site


 

Does your site have slow web hosting or a bunch of broken links and images? Simple technical oversights like these contribute to poor user experience.

 

Make sure your site is with a reliable web hosting company and doesn’t go down during peak traffic. Even better, make sure your site is hosted on a server close to your local city; this will make it faster for local users.

 

Next up, chase up any 404 errors with your web developer. A 404 error means a user has clicked a link on your site and been sent to a page that doesn't exist. This contributes to poor user experience in Google’s eyes. Fortunately, these errors are easily fixed.

 

You can find 404 errors on your site by logging into your Google Search Console account, clicking on your site, then clicking on “Crawl” and “Crawl Errors”. Here you will find a list of 404 errors. If you click on the error and then click “Linked From” you can find the pages with the broken links.
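If you'd like a quick programmatic check between Search Console crawls, you can request each internal link on a page and flag any that come back with a 404 status. A minimal sketch assuming requests and beautifulsoup4, with a placeholder start URL:

import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin, urlparse

start_url = "https://www.example.com/"  # placeholder: a page on your own site
site = urlparse(start_url).netloc

soup = BeautifulSoup(requests.get(start_url, timeout=10).text, "html.parser")

for link in soup.find_all("a", href=True):
    target = urljoin(start_url, link["href"])
    if urlparse(target).netloc != site:
        continue  # only check internal links
    status = requests.head(target, allow_redirects=True, timeout=10).status_code
    if status == 404:
        print("Broken link:", target, "found on", start_url)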

 

Fix these yourself, or discuss them with your web developer.

Google Search Console https://www.google.com/webmasters/tools/

If you want external tools to speed up improving your site’s usability, I have found these two resources helpful:

 

BrowserStack - Free to try, plans start at $29 per month. https://www.browserstack.com

BrowserStack allows you to test your site on over 700 different browsers at once. You can preview how your site works on tablets, mobile devices, and all the different browsers such as Chrome, Firefox, Safari, Internet Explorer, and so on. It’s helpful for making sure your site displays correctly across many different devices.

 

Try My UI - Free to try, additional test results start at $35. http://www.trymyui.com

Try My UI provides videos, audio narration, surveys of users going through your site, and reports on any difficulties they uncover.

 

Usability tests are good for larger projects requiring objective feedback from normal users. The first test result is free, making Try My UI a good usability test provider to start with.

 

Google's Search Quality Guidelines—And How to Use Them to Your Advantage


Search quality is an increasingly popular topic in the blogosphere because it can have a massive impact on rankings. Why is this so? Making sure users are sent to high-quality and trustworthy search results is critical if Google is to safeguard its position as the provider of the best all-around search experience.

 

While this sounds a little vague, you can use Google's search quality to your advantage and get an edge over competitors. Did you know that Google publicly published their “Search Quality Evaluator Guidelines”, updated on July 27th, 2017? If you didn't, well now you do.

 

The document is 160 pages long, so presuming you don't consider a dense whitepaper leisure reading, I'll list the most important and actionable takeaways so you can use them to your advantage.

 

Search Quality Evaluator Guidelines—Key Takeaways


1. Real name, company name, and contact information listed on an about page.

If you don't have this information listed on your website, why should Google, or anyone else for that matter, trust you? Better make sure you include it.

 

2. Excessive and unnatural internal structural links across sidebars and footers. If you've got 150 links in your footer, it's obvious to Google you're trying to do something sneaky, so be conservative with footer and sidebar links. Keep them restricted to the most important pages on your site, or whatever is useful for your users.

 

3. Over-monetization of content. Specifically, if you are disguising advertisements as main content, or your advertisements occupy more real estate than the main content, then one of Google's search quality evaluators will probably flag your site as spam. Take a common-sense approach with your ads; don't overdo it!

 

4. List editors & contributors. Are you publishing a bunch of articles under pseudonyms or generic usernames? Listing editors and contributors, i.e. real people, is more trustworthy and will increase the perceived quality of your page.

 

5. Provide sources. Publishing generic articles en masse without any reputable sources? You'll get a better-quality assessment, and a higher ranking, if you list sources for your articles.

 

Listing sources shows the writer has performed due diligence in their research and increases the credibility of the page.

 

6. Financial transaction pages. All you drop-shippers and e-commerce retailers out there stand up and take note—pages associated with financial transactions (shopping cart, checkout, product pages, etc.) must link to policy pages for refunds, returns, delivery information, and the terms and conditions of your site.

 

Think about it from the user's perspective: if you are the average Joe shopper thinking about buying something and the page doesn't list any of this information, how safe would you feel checking out?

 

7. Pages offering financial information must be of the highest quality. Google are stricter with these types of pages, as they fall into the “Your Money or Your Life” category, meaning they could affect the financial well-being of the user.

 

If you're publishing this kind of content, make sure you're doing everything you can to provide high-quality, detailed articles, citing sources, fully disclosing financial relationships, and making it clear what author or company is behind the content.

 

That sums up the most important takeaways from the Google Search Quality Evaluator Guidelines. If you haven't got them in your site, work ‘em in and you'll get a leg up over your competitors; ignore them and your rankings could suffer.

 

And if you really, really want to sit down and read through the 160-page whitepaper on page-quality assessment, here it is for your enjoyment.

 

Search Quality Evaluator Guidelines - July 27th, 2017 https://static.googleusercontent.com/media/www.google.com/en//inside
