120+ New SEO Hacks for Blogging (Complete Guide 2019)

SEO

Top 120+ New SEO Hacks and Tips for Blogging in 2019

Over time search engines became more efficient at judging sites based on user intent and weeding out sites that were using deceit to rank higher on search engine result pages (SERPs).

 

Today you cannot get away with spamming the search engines: Google and the others have become adept at knowing which sites are adhering to their guidelines. This guide covers 120+ tips for local SEO and new SEO hacks for blogging.

 

In the past, many professionals used black-hat SEO techniques (such as link farming and stuffing pages with keywords) to gain higher rankings. Using such underhanded techniques may have elevated sites in the rankings initially.

 

But those sites were penalized later by the search engines. These deceptive techniques focused solely on rankings and did not take website users or customers into consideration.

Top 120 SEO Tips

Studies suggest that search engines consider more than 250 factors when ranking sites. Although the exact attributes that result in better rankings are not specified (they are a business secret), the fundamentals have shifted toward an enhanced user experience (UX) and providing meaningful content.

 

“Content is king” is an adage you may have heard a million times, but the scenario has changed: “Relevant content is king” is the new mantra and is an apt motivator toward a streamlined UX. You should focus on user intent and user satisfaction rather than design sites for the search engines.

 

SEO is an amalgam of relevance and best practices designed to help users find information related to their queries. This section looks at on-page, on-site, and off-page SEO factors that form the crux of SEO.

 

On-Page SEO


 

On-page optimization is related to factors controlled by you or your code that have an effect on your site’s rankings in search results. To create an optimal experience, you need to focus on the following page-optimization factors:

  • Title tags
  • Meta keywords and meta descriptions
  • Headings
  • Engaging content
  • Image optimization
  • Interactive media
  • Outbound and internal links

 

Title Tag Optimization


In a page’s HTML code, the <title> tag gives an indication of the page content. (You can find the words used between the <title> tags by viewing the page source in Firefox; the method varies depending on the browser.)
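For reference, here is a minimal sketch of a page title in a document's <head> element (the title text is a hypothetical example, not a recommendation):

<head>
<title>120+ New SEO Hacks for Blogging | Example Blog</title>
</head>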

 

Search engines display the page title as the title of the search snippet link on SERPs. However, the title of a search snippet may also depend on the search query.

 

Over time, depending on the links and their age, a search snippet may change dynamically for the same result. Thus there is no fixed rule for how to create a page title. The title may also vary depending on the platform—especially for responsive websites on small, medium, and large screens.

 

Do not stuff keywords into a page title, because Google may penalize your site for manipulating the natural search process. Also, avoid using irrelevant phrases or a single keyword in the page title.

 

Your page title should educate users about the page content and thus must be relevant and related to the page content. Single keywords face a lot of competition because thousands of websites use the same keyword.

 

It is better to use long-tail terms, which may be a mix of keywords and related phrases. Also, keep in mind that each page on the website should have a unique title.


The best practice, according to SEO experts, is to use a phrase containing relevant words (say, 8–11 words) with at most 55–65 characters. This makes sense because extremely long phrases will not work well on mobile devices where space is a constraint. Titles must be precise and concise and can use a mix of uppercase and lowercase characters.

 

Avoid commonly used titles or duplicate content, because search engines display a preference for unique titles. Google Search prefers function over form, so it is a best practice to use a simple, unique title rather than a sensational, irrelevant title.

 

You should understand and consider user intent rather than looking at titles from a search engine point of view.

 

Meta Keywords and Meta Descriptions


Recently, Google confirmed that it doesn’t consider meta keywords and descriptions as ranking factors. Nevertheless, meta keywords and meta descriptions are cached, so it would not be a best practice to ignore them.

 

Although they are not consequential in determining search engine results, meta descriptions can be an excellent way of advertising; they may or may not be displayed in the search results. It is a good practice to limit the meta description to 300–320 characters.

 

It provides a preview of the content or information on that page and should contain the gist of the entire page. If the description is apt, informative, and meets the needs of the user, it may work like free advertising: the user may be compelled to click that site link to view the content.

 

The meta description must be unique for each page on a website, like a page title. Avoid stuffing the description with keywords, and remove all special characters. Cramming in multiple meta keywords can be viewed negatively by search engines.

 

The meta robots attribute is increasingly being used by web designers. It tells crawlers whether the page should be displayed in SERPs (index/noindex) and whether the links on the page should be followed (follow/nofollow).
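To make this concrete, here is a hedged sketch of a page head combining a meta description with the meta robots attribute (the description text and the choice of values are placeholders, not prescriptions):

<head>
<meta name="description" content="A concise, unique summary of this page's content.">
<!-- allow indexing and link following (the default behavior) -->
<meta name="robots" content="index, follow">
<!-- or, to keep a page out of SERPs while still following its links: -->
<!-- <meta name="robots" content="noindex, follow"> -->
</head>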

 

Heading Tags (h1, h2, h3, h4, h5, and h6)


Heading tags are an important on-page factor. The <h1> (heading 1) tag is crucial and must be relevant to the topic discussed on the web page. It educates readers about the topic on that page.

 

Instead of filling a page with clutter, it is a good practice to stick to a single topic; the heading 1 tag is imperative because it indicates the page’s topic. Use relevant words in the heading to help users and also spiders understand the page’s content. Google adheres to text semantics and emphasizes its use for better results.

 

Avoid skipping heading levels on a web page. <h1> should be followed by <h2>, which in turn may have an <h3>, and so on. You may have multiple <h2> tags or subsequent tags if needed. Your web page must display a systematic pattern or consistency. If the formatting or styling of the headings is not to your liking, you can use CSS styling to alter it.
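For example, a consistent heading hierarchy might look like the following sketch (the topic names are illustrative):

<h1>On-Page SEO</h1>
<h2>Title Tag Optimization</h2>
<p>...</p>
<h2>Heading Tags</h2>
<h3>Best Practices</h3>
<p>...</p>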

 

Include keywords, but do not repeat them in the heading. Keywords used at the beginning of a heading yield better results. Avoid spamming or using irrelevant words in headings, because doing so may have a negative effect.

 

Engaging Content


Using meaningful and pertinent content in the body section of the site is vital. Relevant content is king. The content should not be irrelevant or stuffed with keywords—the search engines may penalize you for it. However, you can use keywords or close variations of them twice or three times on a page in a logical way.

 

The content should be informative and engage the user, encouraging them to return to check out the site regularly. It is a good practice to update the content (such as technology topics) at least every six months, because Google has a penchant for updated or fresh content. (News channel sites update their content on a daily basis; here, we are referring to product pages or informative sites, and to updating or adding content for a product or topic.) Blogs must be updated on a regular basis. Use interactive media such as images, videos, and audio files on your web pages.

 

They are intuitive and engage users, and may make the site more popular. Always spell-check and proofread your content, because incorrect grammar or spelling errors can reflect negatively on your site.

 

In addition to having meaningful content, the amount of content matters. You cannot use keywords 3 times in 140 characters—that is keyword stuffing. In-depth, detail-oriented, relevant content helps you space out keywords evenly.

 

It also helps users to understand the logic of the content, especially if the topic is informative and educates the user significantly.

 

However, do not use 2,000 words just to fill the page; low-quality content results in bad UX. Remember, less is more, because the quality is more important than quantity—function over form.


Bounce rate reflects the percentage of users who visit a web page and then leave the site. It doesn’t matter how much time they spend on the page; what matters is whether users leave the site after viewing just one page. Low-quality content results in higher bounce rates and will eventually affect the site’s visibility.

 

Do not copy content from another website or use boilerplate content. Search engines such as Google have been known to penalize sites that use duplicate content. Focus on user satisfaction and not on fooling the search engines.

 

At times there are legitimate reasons for duplicate content: for example, an e-commerce site will have the same content on different pages with different URLs due to filters such as size, color, and price.

 

Some websites have the same content on different web pages with the prefixes HTTP and HTTPS; although the rest of the URLs are the same, the prefixes mean they are treated as separate pages. Sometimes the watered-down mobile version of a website has the same content as the desktop version, resulting in duplication.

 

Localization may also be a factor: for example, www.google.com may appear as www.google.co.in for India. The content may be the same, but the URLs are different. In such cases, search engines may not allocate as high a ranking, because two different URLs have the same or similar content.

 

You can resolve these issues by using either a canonical tag or a 301 redirect. A 301 redirect is a permanent redirect from one URL to another that helps users reach the new address. It can also be used for “404 Page not found” errors where content has been moved to a different web page.
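As an illustration, assuming an Apache server, a 301 redirect can be declared in the site's .htaccess file as follows (both paths are hypothetical):

# permanently redirect the old page to its new address
Redirect 301 /old-page.html http://example97653.com/new-page.html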

 

A canonical tag is an alternative where you apply a rel=canonical attribute to tell search engines the original or preferred content and the URL to be indexed for display in SERPs.

 

For example, suppose these two websites have the same content: http://example97653.com and http://example234.com/seo12345/56473.

 

The first URL is the original, getting the maximum number of hits. You want this site address to be indexed. To implement the canonical tag, you go to the HTML code for the second URL and, in the <head> element, add the following:

<link rel="canonical" href="http://example97653.com"/>

 

You use the canonical attribute in the head element of the HTML markup for the URL containing the duplicate content and link it to the original or preferred URL.

 

Image Optimization and Interactive Media


Earlier SEO was text-based, but this has changed significantly. You should use interactive media such as audio, video, images, and infographics to connect with your users. Use captions and alternate text for media, and build relevant content around these media. You can use a single key phrase in the alt text if it is relevant to that image.

 

You can interchange images based on the screen size, with heavy-duty images for desktop sites and lightweight images for mobile sites. Try to limit the image file size to less than 80–90 KB for optimal page-loading time. 
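One hedged way to interchange images by screen size is the srcset attribute, as in this sketch (the file names, widths, and alt text are assumptions):

<!-- the browser picks the lightweight file on small screens -->
<img src="product-large.jpg"
     srcset="product-small.jpg 480w, product-large.jpg 1200w"
     sizes="(max-width: 600px) 480px, 1200px"
     alt="Brown leather wallet, front view">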

 

Use PNG or JPEG image formats wherever possible, because they are widely supported and offer a good balance of image quality and file size. Using thumbnails and different angles for a product can be very handy, especially on e-commerce sites.

 

Using videos explaining a product or marketing a certain entity is a good practice. Google owns YouTube, and it can be a game-changing means of branding your product. Infographics are an excellent way to provide information or create timelines with relevant content.

 

Outbound and Internal Links


Internal links are a key feature of SEO. These are links on web pages that point to another page in the site or domain. 

 

SEO-related research suggests that no page on your website should be more than three clicks from the home page, meaning all pages should be easily accessible. You can use relevant anchor text to point to different pages on your site.
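For instance, a minimal sketch of descriptive internal-link anchor text (the page paths are hypothetical):

<p>Compare our <a href="/pricing">pricing plans</a> or read the
<a href="/guides/getting-started">getting-started guide</a>.</p>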

 

Breadcrumbs are an efficient way to provide site navigation using links. Having a good link structure makes it easy for search engines to crawl your entire website, and easy accessibility also leads to an awesome UX.

 

Outbound links point to another domain or site. They are a good feature for informative or niche topics. Sometimes a page includes jargon or topic-specific terms; instead of wasting time explaining supplementary information on the page, you can use URLs or anchor text as outbound links to locations that explain the information in depth.

 

SEO experts tend to doubt the content found on Wikipedia, but it is actually an excellent source of free, relevant, detail-oriented information. For example, suppose you are explaining web servers, and you use the word server in your content.

 

Instead of explaining what a server is, you can use the word as anchor text to link to a wiki site that explains the meaning and use of servers. Linking specific terms to wiki sites such as Wikipedia and Webopedia may boost your SEO process. Not only is doing so relevant, but it also lends a certain amount of trust and credibility to your site.

 

You can use outbound links to social media sites or blogs to help reach out to a larger audience. Just be sure you do not link to spammy or illegal sites—doing so may negate your SEO efforts, because search engines will penalize your site.

 

Also do not link to sites that are not relevant to the topic, because two-way linking or link farming can be detrimental.

 

On-Site SEO


Whereas on-page SEO is relevant to individual pages, on-site features affect your SEO process on the website as a whole. This section explains the following:

  • URL optimization
  • Sitemaps
  • Domain trust
  • Localization
  • Mobile site optimization and responsive websites
  • Site-loading speed or page-load time

 

URL Optimization


URLs play an important role in SEO, and you need to plan holistically for making your URLs as SEO-friendly as possible. Each URL should be human-readable and not consist of a bunch of special characters or numbers mixed with words. It should be meaningful and should reflect what the site is about.

 

For example, https://www.searchenginejournal.com is meaningful because it tells readers that the site offers a comprehensive array of topics and guides related to SEO.

 

Using hyphens (-) instead of underscores is a good practice recommended by Google. SEO experts advocate the use of canonical tags or 301 redirects for duplicate pages or pages with similar content; otherwise, the value of the content may be negated, because ranking signals may be split across the multiple URLs.

 

For “404 Page not found” errors, you need to use 301 redirects to guide users to a working URL for that content.

 

Using a robots.txt file helps inform search engines about pages to be ignored while crawling the site. For example, a Contact Us page or About Us page may be useful if users need to access or buy a product or need help from the customer service department.

 

However, a normal user may not find the Disclaimers page important and will hardly skim such pages. So, it is essential to educate crawlers about which pages need to be indexed for SEO processes. You can also block broken links and 404 pages from being crawled via the robots.txt file.
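A minimal robots.txt sketch along these lines, with hypothetical paths, might look like this:

User-agent: *
Disallow: /disclaimer/
Disallow: /old-broken-page.html
Sitemap: http://example97653.com/sitemap.xml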

 

SEO experts advocate the use of a favicon on the title bar next to the URL because it lends credibility and helps with effective branding. It helps users recognize your site and improves trustworthiness significantly. Although there are no direct benefits from favicons from an SEO perspective, they enhance usability.
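Adding a favicon takes one line in the <head> element (the file path is an assumption):

<link rel="icon" href="/favicon.ico" type="image/x-icon">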

 

Bookmarks in the Google Chrome browser send out signals to the Google search engine that maps bookmarks for sites on the Web. This is not a major factor, but it certainly helps from a user perspective.

 

Site Maps


There are two types of sitemaps: XML sitemaps, which are tailored to search engines; and HTML sitemaps, which are directed toward users. An XML sitemap contains a machine-readable list of pages on your site that you want search engines to index for SEO purposes.

 

It contains information for crawlers such as the last update, its relevance or importance, alterations, and related data. XML sitemaps are domain-related and help spiders perform a deep search of web pages.

 

For example, issues such as broken links or a lack of internal linking can be crucial factors that may result in crawlers not being able to index pages. There is no guarantee that a sitemap will cause crawlers to index all of your website’s pages; however, it will significantly help with accessibility, because search engines can digest such data easily.


An HTML sitemap is tailored to your website’s users and helps users locate different pages. All categories and products can be listed explicitly. It streamlines the user experience by making users familiar with your website and provides better semantics. UX is a vital aspect of SEO, so it is a good practice to include both XML and HTML sitemaps in your process.

 

Make sure your XML sitemaps for search engines are exhaustive; on the other hand, HTML sitemaps should be more concise so users can navigate them more easily.

 

Domain Trust and Local Domains

Your domain can be a key ranking factor because it creates trust and credibility for site users. Studies suggest that domains registered for two years or longer are considered more trustworthy than new domains.

 

Use the .com domain extension, because it is more common than .org and other extensions. Domain localization—catering to a specific country or city—may prove to be a game changer.

 

For example, .co.uk caters to the United Kingdom and is more specific to users in that region and those with business links to the UK. Choosing a domain with a good reputation is helpful. If the domain has been assessed some kind of penalty, it can be detrimental to your business due to lack of credibility.

 

Using keywords in a domain name may be useful; however, given all the keywords that have already been used by websites, you may not be able to have the domain name of your choice.

 

Your domain name is crucial because it indicates what your site is all about. Opt for a simpler, unique, relevant domain name rather than a sensational name, to help users connect with your site.

 

You can use an online dictionary to check words related to your service or product. You can also use a combination of two or three words, such as DealOn, ScoutMob, or HomeRun.

 

You can even think outside of the box and come up with something really creative, such as Reddit, Google, and Yelp, to name a few. Again, focus on your prospective customers and come up with something catchy and easy to spell that they can relate to;

 

for example, if you search for plumbers or plumbing in Miami, you see http://www.miami-plumbers.com/ in the results. The name conveys that this is a plumbing business and the region (Miami) where they provide plumbing services.

 

Mobile Site Optimization and Responsive Websites


The digital marketing era has seen the rise of smartphones as the preferred option for online purchasing, e-commerce, and finding informative content on the Web.

 

Designers used to create a desktop version and then remove heavy-duty elements to create a watered-down version for mobile devices. But with the advent of online marketing and social media, mobile phones and tablets have gained prominence.

 

Studies suggest that most internet traffic comes through mobile phones and tablets—they have largely surpassed desktop websites. Even web design frameworks such as Bootstrap and Foundation use the mobile-first approach because the target audience has undergone a major shift from desktop users to mobile users.

 

Until recently, designers created two sites: one optimized for mobiles and the other for desktops. It is essential to focus more on mobile site optimization than the desktop version. However, this can be tricky if the mobile version is a stripped-down version with fewer features and less content than the desktop site.

 

Moreover, this means you have two URLs for the same site with similar content, so you need to use the canonical tag. In addition, a watered-down mobile site results in a pathetic UX.


Enter responsive web design: an excellent alternative that uses a single URL for both the mobile and desktop sites. Responsiveness is rated highly by Google. All the features and content of a desktop site are present on the mobile version, meaning there is no compromise on content display; the site is user-friendly and ensures an optimal UX.

 

The bounce rate will be lower because users can get the same information on mobiles as well as desktops. Because there is only one URL, there is no redirect, resulting in faster page-loading times.

 

Because Google highly recommends this approach, responsive web design is here to stay. Currently, Google marks websites as mobile-friendly in mobile searches to help its users identify which websites are likely to work best on their device.
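A minimal sketch of the responsive approach, assuming a hypothetical .sidebar element, combines the viewport meta tag with a CSS media query:

<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .sidebar { float: right; width: 30%; }
  /* below 600px, stack the sidebar under the main content */
  @media (max-width: 600px) {
    .sidebar { float: none; width: 100%; }
  }
</style>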

 

Site-Loading Speed


Site- or page-loading speed is an important attribute because Google and other search engines penalize sites that take a long time to load. An optimal page-load time leads to better conversion and improves the scalability of your products.

 

Pages that take a long time to load may frustrate users and cause negative UX, leading to higher bounce rates. Loss of internet traffic or a bad user experience can damage the site’s reputation.

There are several ways you can improve your page-load speed:

  • Minifying CSS, JavaScript, and other files
  • Minimizing HTTP requests
  • Using an efficient server configuration and good bandwidth
  • Archiving redundant data in the database, and cleaning out trash and spam
  • Using fewer plug-ins and third-party utilities
  • Interchanging data and images, depending on the screen size
  • Avoiding inline styles, and keeping presentation separate from markup
  • Using a content delivery network (CDN)
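To make a couple of these concrete, here is a hedged HTML sketch (the file names are assumptions): minified assets cut file size, and the defer attribute stops JavaScript from blocking rendering:

<!-- one minified stylesheet instead of several unminified ones -->
<link rel="stylesheet" href="styles.min.css">
<!-- defer non-critical JavaScript so it doesn't block page rendering -->
<script src="app.min.js" defer></script>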

 

Off-Page SEO


 

Whereas on-page SEO and on-site SEO factors are based on the elements and content on your web page or site, off-page SEO factors are external and help you rank higher in SERPs. They are not design or code related and are more like promotional concepts.

 

This section looks at the following:

  • Social media
  • Blogging
  • Localization and local citations
  • Inbound links

 

Social Media


Expand your reach by taking advantage of social media optimization and marketing. Social media is an amazing medium with an ever-increasing scope. You can indulge in networking and increase your connections considerably. Reaching out to the modern target audience is beneficial because users can share and promote your website.

 

Keep your audience engaged, and share updates with them. For example, Facebook and LinkedIn can be awesome utilities that let you expand your business horizons significantly. Share updates and keep your users in the loop using Twitter.

 

You can use the capabilities of these social media sites for branding and advertising for a fraction of the cost of traditional marketing methods such as television advertising, press releases, and Yellow Pages listings. 

 

Blogging


Blogging is an excellent tool for achieving user engagement. You can keep users abreast of the latest trends and technologies in your niche. Informative content on blogs acts as supplementary information about your products or services.

 

Troubleshooting steps, product-relevant content, and meaningful information are some of the elements that can be included on a blog. A plethora of blogging domains and tools can help you reach out to your audience. Inbound and relevant links from your blog to your site can boost your SEO implementation significantly.

 

Localization and Citations


Local SEO is an important off-page factor because it caters to the user’s region. It is a boon especially for small- and medium-sized enterprises because it helps them connect with users in their vicinity.

 

Google My Business allows you to list your business and gain prominence in SERPs. You can place your products or services and categorize them so that they show up when a search query is used for that category or niche in the region. Information such as working hours, updates, and contact information can be provided, leading to better accessibility.
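Relatedly, you can mark up the same business details on your own site with schema.org structured data so search engines can parse them; here is a hedged JSON-LD sketch in which every value is a placeholder:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Local Business",
  "telephone": "+1-416-555-0100",
  "openingHours": "Mo-Fr 09:00-17:00",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "Toronto",
    "addressCountry": "CA"
  },
  "url": "http://example97653.com"
}
</script>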

 

Local citations are brand mentions or reviews that educate users about product robustness or attributes. Local SEO utilities such as Yelp and Foursquare are extremely helpful for understanding the pros and cons of your products or services, courtesy of user feedback or input.

 

Reviews help you establish a connection with users and understand their viewpoint and concerns related to your business. Increasing interaction with your users will help streamline your business in the long run.

 

Inbound Links


Inbound links are links from other domains pointing toward your website. Links from domains with high page rank and authority are preferable and lend more credibility than links from domains with low authority or low page rank. The number of domains that link to your website can be a crucial factor.

 

Studies suggest that links from several different domains to your site can boost your SEO implementation. However, you should not indulge in link farming or use underhanded techniques, which may result in a penalty.

 

There should not be too many links from a single domain, because this is an indication of spamming and can have negative consequences. Referral links from blogs, social media sites, and news aggregators are handy, provided they are relevant and contextual.

 

Inbound links from another domain’s home page or key pages are more useful than links from a sidebar or an insignificant page location.

 

Google recommends getting links from domains with high-quality content. Forums, reviews, and comments can contain links pointing to your website and enhance your site presence, provided they are not irrelevant. Backlinks from social bookmarking sites (such as Reddit) and web directories (such as DMOZ) can affect visibility positively.

 

Google Tools Suite


There are many search engines, but Google remains the undisputed leader in the worldwide search industry (barring a few, such as Baidu in China and Yandex in Russia).

 

Google is the dominant engine, controlling more than 70% of the search market share. It is “batteries included” and offers a suite of utilities that can boost your SEO processes.

This section looks at some of the most powerful Google tools that belong in the SEO expert’s arsenal:

  • Google My Business
  • Google AdWords Keyword Planner
  • Google Trends
  • PageSpeed Insights
  • Google Analytics
  • Google Search Console

 

Google My Business


Google My Business is a new portal for local and nonlocal businesses (local businesses are those present in the vicinity of the user or in the same geolocation). It is a complete ecosystem for local businesses or brands. It replaces Google Local and Google Places and integrates with Google+ (a social networking site by Google).

 

Aimed at small- and medium-sized enterprises that lack the budget of large-scale organizations, Google My Business helps you reach out to consumers. It is a platform that helps you connect to and interact with customers in addition to promoting your business.

 

You can add pictures of your businesses, post reviews, engage with users, share updates and news, and help your business gain visibility in Google search results.

 

For example, a search query for steel companies in Toronto returns business listings: the boxed listings are prominent, and their locations are marked on the map. You also see the operating hours for some businesses and other information such as websites and directions.

 

Google My Business is a lucrative platform. To begin using it, the first thing you need to do is verify your business. You can do so via a postcard, with a phone call, or by using instant or bulk verification, if your website is verified by the Google Search Console utility (previously called Google Webmaster Tools).

 

If you have registered your business in Google Places or Google+, it will automatically be upgraded to Google My Business. You can update your business details as well as indicate the opening and operating hours.

 

You can fill in the business description, post photos of your brand or website, and reach out to customers in several ways. You can offer discounts, offers, deals, and other promotions to expand your user base.

 

You can interact with consumers by responding to their feeds. Integration with Google+ and Google Hangout enables you to keep your customers in sync and address any complaints as well as communicate with them on a regular basis. In addition, Google Apps for Work can be integrated with this platform.

You manage all this using a single interface.

 

Google AdWords Keyword Planner


Google AdWords Keyword Planner is a good resource for coming up with keywords for a new project or relevant terms for an existing campaign.

 

You can use this utility to develop terms and filter them for a specific country or region based on the geolocation. You can determine the search volume for keywords and predict the bid for those terms if you are opting for a paid campaign.

 

Google AdWords Keyword Planner helps you choose multiple terms that can be used in conjunction with another term, specifically long-tail terms that can drive the right traffic to your website. You can also find alternate terms for your products or services that have low competition and may require less effort to rank well in SERPs.

 

Global and local monthly search volumes are handy because you can determine the keywords your competitors are using to promote their products and services. You can find the utility at the following link: https://adwords.google.com/KeywordPlanner.

 

Google Trends


Google Trends is a utility that helps you compare the flow, trends, and popularity of terms and phrases; you can see at a glance how terms and phrases fare.

 

It spans several categories, such as business, nature, sports, travel, and news. You can streamline your keyword research by using Google Trends results along with the Google AdWords Keyword Planner utility.

 

You can even filter search trends by sorting terms geographically or according to time span and categories. For example, you can determine the popularity of terms in a particular country, region, or city.

 

In addition to finding regional trends, you can use Google Trends for content creation and content development. For example, you can write blogs or include trendy terms in forums and social bookmarking sites.

 

Or you can base content on popular terms or phrases in your new campaigns. You can gain insight into your competitors’ trends and compare your business trends with those of your rivals. You can access Google Trends at https://www.google.com/trends/.

 

PageSpeed Insights


Google stresses the importance of web page performance, and its algorithms favor websites with optimal site speed and user experience (UX) design. Optimal page-loading time counts toward an enhanced UX. Studies suggest that optimal page-load time leads to more conversions, thereby affecting sales significantly.

 

Page speed depends on several factors ranging from web-hosting facility and website design to the hardware used for that site. In particular, websites that take a lot of time to open on mobile devices are penalized, because they result in a bad UX.

 

Google has its own tool that you can use, not only for site speed but also for other UX factors. You can access the PageSpeed tool at https://developers.google.com/speed/pagespeed/insights/. 

 

Enter a URL or web page address in the search box, and click Analyze. You will see results for mobile and desktop versions. The results include two sections: Speed and User Experience.

 

Fixes, as well as suggestions pertaining to speed and UX, are listed. Fixes include compression of JavaScript and CSS, avoiding plug-ins for platform compatibility, using legible font sizes, and avoiding landing page redirects.

 

Suggestions are recommendations to further enhance the website for an optimal UX. User satisfaction is a prime factor for any business, and it applies to SEO too. Good site speed leads to backlinks, conversion optimization, and favorable reviews, thereby streamlining the UX.

 

Google Analytics

If you have a website, then you want to know how many people are viewing or buying stuff on that site. You also want to know the geographic location of your users and which content is frequently accessed. Whether your campaigns lead to sales conversions and high traffic can be a deciding factor, especially for an e-commerce site.

 

Enter Google Analytics: an enterprise-grade toolkit that helps you gain insight into your website traffic, historical trends, and site statistics. You can find out about pages that have poor traffic and bounce rates and get information about the search query keywords used most often to reach your website.

 

You can learn whether you need a mobile site or, if you already have one, how to optimize it to gain relevant traffic. Traffic trends and links from referral sites can be measured for conversion optimization. You can set custom goals and events to get meaningful information to streamline your business processes.

 

Tracking metrics and visitor engagement can help you improve your marketing strategy. In short, Google Analytics is a robust utility that helps you make data-driven decisions to enhance your web presence.

 

This section looks at how to install Google Analytics and incorporate it into your website. First you need a Google account; for example, if you use Gmail, YouTube, Google Drive, or any primary Google service, you can use that account for Analytics.

 

However, always make sure you have complete access to and control over that account. The business or website owner should be the only person who fully controls that primary account so that they have access to it at any time from any location. If you do not have a primary Google account, create one, and make sure you control every aspect of that account.

 

Select Website (the default) if you want to set up Analytics for your website. Enter the account name in the Account Name text box, and then enter the name of your website in the Website Name field. Next, enter the web address or URL in the Website URL text box.

 

Select the name of the industry from the Industry Category drop-down menu, and select the Time Zone. Choose the appropriate Data Sharing Settings, and then click the Get Tracking ID button. When you do, you’re given a tracking code for verification purposes.

 

Once you receive a tracking code, depending on your content management system (CMS), you need to incorporate the tracking code into your document. If you are using WordPress, you can use the Google Analytics by Yoast plug-in.

 

If you built your site with simple HTML markup, then you need to include the tracking code in the <head> section of your HTML document, just before the closing </head> tag. The following post explains the procedure for installing Google Analytics for different platforms: www.lizlockard.com/installing-google-analytics-on-any-website/.
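For a plain HTML site, the snippet Google provides looks roughly like this gtag.js sketch, where UA-XXXXX-Y is a placeholder for the tracking ID you receive:

<!-- Global site tag (gtag.js) - Google Analytics -->
<script async src="https://www.googletagmanager.com/gtag/js?id=UA-XXXXX-Y"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'UA-XXXXX-Y');
</script>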

 

Next, verify your account using the options in the Google Search Console (discussed in the next section). Once you have set up Google Analytics, you can add users and set permissions. You can define goals to understand when important actions occur on your site.

 

You can set up Site Search, which helps you gain insight about searches made on your website. You can get an analysis in a day, provided your setup is implemented correctly. Every time you log in to your account, go to the Reporting section to view the Dashboards, Shortcuts, Audience, Acquisition, Behavior, and Conversion menu items.

 

Following are some common terms you will come across while viewing and analyzing these reports:

Dimensions: An attribute of your site visitors that can have various values. For example, gender, browser, city, and so on.

 

Metrics: A measure of elements of a dimension. For example, new users, bounce rate, sessions, and so on.

 

Sessions: Period of user engagement with the website for a specific date range; for example, the time taken for page views during a certain date range.


Conversions: Count of the goals completed on your website. For example, purchasing an item on an e-commerce site can be a goal, and conversions count the users who visited the site and completed that purchase.

 

Bounce rate: Percentage of single visits where the site visitor leaves the site without any other action. For example, a user may just click the back button if they do not get relevant information when they visit your site’s home page for the first time.


Audience: Report items that provide statistics depending on traffic and help you gain insight into a site visitor’s behavior on arrival on your site.

 

Acquisition: Report items that provide information about the source of traffic, actions performed on the site, and whether conversions occur during the sessions.

 

In addition to the Reporting menu items, you can create custom reports and custom variables to get the most out of the Analytics data. A plethora of third-party utilities and plug-ins are available, such as reporting tools and products that use the platform for Analytics-based tracking.

 

In addition, there is a Google Analytics premium platform that enterprise-grade and large-scale organizations can use to obtain advanced results.

 

Google Search Console


Google Search Console (formerly known as Google Webmaster Tools [GWT]) is a free utility to analyze your website’s indexing and manage and monitor your site’s Google Search ranking.

 

It helps you probe and gain insight into how your site appears to the Google search engine. You can also get information related to hacking attacks, penalties, and broken links, along with suggestions that can help you improve and manage the presence of your website.

 

To use Google Search Console, you need to have a primary Google account. Once you log in to the Search Console for the first time, you land on the Search Console home page.

 

Search Appearance

This section looks at the factors that determine how your site appears to the search engine.

Structured Data


All the structured data for your website is found on this page. The utility helps you gain insight into the structured data elements and defects related to the markup. You can download a report to view errors associated with structured data elements on your site.

 

Data Highlighter

This feature is useful because it helps you implement structured data if you cannot access the back end.

 

HTML Improvements

All HTML-related errors, such as title tag errors, lack of opening and closing tags, meta descriptions, and non-indexable content, are flagged. You can then make changes as per the generated report.

 

Sitelinks

Sitelinks are shortcuts generated by Google and used for deep website navigation. You can even demote sitelinks to a specific page if you find them irrelevant or incorrect.

 

Search Traffic


This attribute helps you gauge the keyword search phrases used to make your page visible in search results. You also see a count of how many times your website is reflected in the search results using specific keyword phrases.

 

The Click Through Rate, backlinks, and inbound links from different domains are listed in this section. You can also estimate the number of internal links in your website and learn about basic analytics for your site.

 

Search Analytics

In this section, you can view organic traffic and filter it according to devices, region, and type of platform (desktop or mobile). You can gain insight into trends and the performance of your website in the search results.

 

Links to Your Site

Here you find information about inbound links and the domains from which these links originate. You can also get information about the anchor text pointing to your website and pages with the highest-valued inbound links.

 

Manual Actions

This section is crucial because messages related to Google penalizing your website are delivered here. You also see suggestions regarding appropriate fixes related to any penalties.

 

International Targeting

The hreflang attribute is used to identify the language along with the geographical targeting of each web page. The second tab (Country) is handy if you want to attract a customer base from a specific country.
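A hedged sketch of hreflang annotations in the <head> element (the URLs are hypothetical):

<link rel="alternate" hreflang="en-us" href="http://example97653.com/us/">
<link rel="alternate" hreflang="en-gb" href="http://example97653.com/uk/">
<link rel="alternate" hreflang="fr-fr" href="http://example97653.com/fr/">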

 

For example, domains that are generic have .com and .org extensions, whereas extensions such as .in, .uk, and .fr (for India, United Kingdom, and France, respectively) are used to target specific countries.

 

Mobile Usability


You can find out whether your website is mobile friendly. You can see its various drawbacks from the mobile point of view along with an overview of those issues. Mobile devices are now used more than desktops, so Google considers mobile usability a crucial ranking factor.

 

Google Index

This feature helps you get information about the pages of your site that have already been indexed and also helps you remove URLs that are inappropriate or incorrect.

 

Index Status

Here you can identify pages that Google has indexed on your site.

 

Content Keywords


You can gain insight into keywords or phrases often found on your site.

 

Blocked Resources

Pages on your site that are configured to be blocked by using a robots.txt file (the Robots Exclusion Standard) are reflected in this section. Tracking programs, external scripts, and private pages are some examples of blocked resources that need not be indexed by Google.

 

Remove URLs

You can remove pages that are not supposed to be indexed using this feature. For example, if you want to remove a web page that has been indexed, you initially need to block it by configuring it in the robots.txt file. Then you can send a request for removal of that specific URL.

 

Crawl

Prior to the indexing of your web pages, the pages need to be scanned or crawled by Google bots or spiders. Information related to the crawling of your site can be viewed here.

 

Crawl Errors


This attribute indicates any errors that were found while crawling your web pages: for example, a “404 Page not found” error. Once you fix the errors, you can update Google.

 

Crawl Stats

This indicates the number of pages crawled for a certain period. It also indicates the download time and the size of the download.

Fetch As Google

This attribute is handy for understanding how Google renders a web page. Instead of Fetch, you can use Fetch and Render to enable Google to crawl and display the web pages, akin to a browser rendering the pages.

 

You can see the difference between rendering the page with Google and with a browser, helping you boost the SEO processes for your site.

 

Robots.txt Tester

This attribute checks for defects in your robots.txt file.

 

Sitemaps

This facility helps you submit the XML sitemap of your site to Google. It makes it easy for Google bots to dig deep, because it makes web pages more accessible to those bots. Errors related to the submitted sitemap are also reflected in this section.

 

Sitemap


A sitemap contains a list of pages within a website. A sitemap can exist in a form accessible to crawlers, humans, or both. Typically, a sitemap is hierarchical in nature. 

 

The purpose of a sitemap is to display how the website is organized, apart from navigation and labeling. It provides clarity about the relationship between various website pages and components.

 

Certain types of sitemaps, such as XML sitemaps, provide additional information about the pages in a website, such as the last update and frequency of updating.

 

Sitemaps tell search engines about the content type on the listed pages: for example, audio, video, or image. The information displayed about an image or a video content type may include the video duration, category, age rating, image description, type, and licensing details. Search engine crawlers (also known as bots or spiders) scan and discover most web pages.

 

However, broken links can stall the discovery of some pages. A website may contain a lot of pages, a large amount of isolated archived content, or recently launched pages that need to be crawled, or there may be external links that point to it.

 

Also, a website may have rich intuitive content such as media files that need to be crawled. If search engines are not deep-crawling a website, those pages do not populate at the top of the SERPs.

 

There is no guarantee that all the items in a sitemap will receive enhanced exposure or be crawled, but providing sitemaps is a good practice because it definitely makes it easier for the search engines.

 

Types of Sitemap

There are two types of sitemaps:

  • XML sitemap
  • HTML sitemap

 

An XML sitemap is in an XML format that is tailored for the search engines. It is used to indicate information about various pages. However, it is not user-friendly, because it caters to search engines and not human users.

 

An XML sitemap sheds light on information such as the relationships between pages, their update history, and the frequency with which they are updated.

 

On the other hand, an HTML sitemap is user-friendly and tailored for human users. It helps them find the page containing content they are looking for. Because it is user-friendly, it also makes the website more accessible to spiders. 

 

You need to remember that XML sitemaps cater to search engine crawlers, whereas HTML sitemaps cater to human users. In addition, HTML sitemaps are not supported in the Google Search Console utility.

 

Difference between HTML and XML Sitemaps

As mentioned earlier, XML sitemaps are developed for search engine spiders and contain the website’s page URLs along with additional information such as update history. Listing 6-1 shows the XML sitemap created for one of our client websites, http://techvictus.com/.
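The client listing is not reproduced here, but a minimal illustrative XML sitemap in the same spirit (the URLs, dates, and frequencies are placeholders, not the actual Listing 6-1) looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://techvictus.com/</loc>
    <lastmod>2019-01-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://techvictus.com/about-us</loc>
    <lastmod>2018-12-01</lastmod>
    <changefreq>yearly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>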

 

Making the Choice

It is a good practice to use both XML and HTML sitemaps for your website. XML sitemaps provide the search engines with information related to all the pages on a website. They also include pages that are not directly connected to the home page, such as posts.

 

HTML sitemaps, on the other hand, contain a list of pages along with their links. This enables users to visit specific pages on your website, eliminating the hassle of navigating through all the pages. In addition, such a list helps track which pages are being visited the most by users.

 

Creating a Sitemap


After reading the previous sections, you may be ready to add a sitemap to your website. In this section, you create an HTML sitemap for the http://www.techvictus.com/ website. 

 

This HTML sitemap enables users to find the correct path to the content they are looking for. If you want information related to the organization, you can click the About Us link listed in the sitemap.

 

This sitemap uses the <br> (break) tag. Instead of <br>, you can also use the unordered list tag (<ul>); it depends on your preference.
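As an illustration, a minimal HTML sitemap along these lines (the page names and paths are assumptions) might look like this:

<h2>Sitemap</h2>
<a href="http://www.techvictus.com/">Home</a><br>
<a href="http://www.techvictus.com/about-us">About Us</a><br>
<a href="http://www.techvictus.com/services">Services</a><br>
<a href="http://www.techvictus.com/contact-us">Contact Us</a><br>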

 

Once a sitemap is generated, it can be edited, copied, and uploaded to your website server. By default, you need to upload an XML sitemap to the root of your website.

 

If you want to upload it elsewhere, you need to add an entry in the robots.txt file so the search engine can reach this information. If you generate an HTML sitemap, you can link it to the footer of your web page so users can easily navigate to a particular web page.

 

Popular Sitemap Generators

Generating a sitemap for a small website is usually not time-consuming. However, this can be a very time-consuming activity for large, complex websites. Performing manual sitemap generation on such websites is not only expensive with respect to time but also prone to errors.

 

In such situations, sitemap generators come in handy. A sitemap generator is a tool that takes the URL of the website as input. It then scans the website and lists all the pages it contains in XML or HTML format. This section looks at some of the widely used sitemap generators.

 

MindNode (https://mindnode.com/)

MindNode is a visual tool tailored for Apple devices only. You can build sitemaps as well as plan your projects in a visually appealing way using this utility. The only constraint is that you cannot export to XML. However, you can use the PDF, image, or simple text version generated using this app.

 

WriteMaps (https://writemaps.com/)

WriteMaps is an efficient utility for online sitemap generation. You can create three sitemaps for free using a single account. Its intuitive interface helps you create sitemaps as well as customize colors, map page content, and easily format content. The generated sitemaps can be exported in PDF as well as XML formats.

 

DynoMapper (https://dynomapper.com/)

DynoMapper is an awesome ecosystem used for creating sitemaps. In addition to building sitemaps, it provides features such as Google Analytics support, website accessibility testing, and keyword tracking that help boost your SEO projects significantly.

 

It comes with drag-and-drop functionality, it can detect broken links, and it provides cloud support with regular content audit input.

 

URL Parameters

Although Google bots can usually understand URL parameters, if there is any ambiguity you can explicitly configure parameters in this section so that the Google search engine understands them more efficiently.

 

For example, suppose a user is on an e-commerce portal and wants to shop for athletic shoes. The user can use filters such as the sole material, leather or synthetic, color, and price of those shoes.

 

When filters are used, strings are appended to the URL; the various combinations of filters therefore present different URLs to users for the same or duplicate content. Google has a workaround for this.

 

Similar URLs are grouped in a cluster, and then Google decides the best URL in that cluster to represent the cluster group. In case of ambiguity, if Google is unable to decide the best URL for that cluster, you can use the URL Parameters facility to specify which URL should be used to represent a group.

 

Security Issues


If your site is hacked or there is some other issue like a malware attack, you can find information and the suggestions for fixing it in this section.

 

Site Settings

In the upper-right section of the page is a gear icon that leads to Site Settings. You can choose between www and non-www versions of your website. This is done so that both sites are treated the same.

 

For example, if your site’s web address is http://example1234567865.com and the other link is http://www.example1234567865.com, selecting one as the preferred domain ensures that both URLs are treated the same.

 

However, you need to verify that you are the rightful owner and authority for both sites. You can also set the crawl rate here, so Google will crawl pages keeping in sync with the bandwidth of your site.

Google Search Console is a vast topic by itself and is a vital tool for understanding the nuances of your website.

 

Obstacles in SEO


There are certain blips you come across when implementing SEO in projects. These hurdles hinder the SEO workflow significantly and can affect your website’s visibility. 

 

SEO associates tend to tweak websites for search engines and forget to focus on the user experience. An appropriate approach is to design the website for users (user-centric) and then tweak the website for search engines.

 

There are several bottlenecks that we tend to overlook when implementing SEO. These bottlenecks can be game-changers that reduce your site's visibility in SERPs. In this section, we will discuss the obstacles in SEO and include suggestions to overcome these drawbacks.

 

Black-Hat SEO


Despite knowing that black-hat SEO will result in penalties, some SEO experts resort to underhanded techniques. Link farming, cloaking, keyword stuffing, irrelevant content, and spamming are black-hat techniques that are still in use.

 

The results may seem positive, but eventually, Google and other search engines realize that they are being duped, resulting in a penalty.

 

Let’s consider the cloaking black-hat SEO technique. It is akin to spamdexing, where the content presented to the search engine crawlers is different than the content presented to human users.

 

It is a deceitful way of achieving higher rankings: the content delivered is different depending on IP addresses or HTTP headers. This manipulative technique tries to trick search engines into believing that the content is the same as what users see.

 

Another black-hat SEO technique is link farming, where sites exchange reciprocal links to boost their rankings by fooling the search engines. It is different from link building, which is an organic way of boosting rankings.

 

Because search engines such as Google and Bing rank sites based on their popularity and inbound links, some SEO consultants used to implement link farming to get links from hundreds of websites that were not even slightly related to the target site. Some SEO consultants also had link farms that used devious ways of exchanging links with other sites, something like a link-exchange program.

 

For example, suppose a site is devoted to troubleshooting Windows OS issues. If inbound links come from sites such as Stack Overflow, authority sites, and relevant blogs, then they are legitimate.

 

However, if such a site receives inbound links from travel and tourism sites offering vacation packages in Miami or from sites offering plumbing solutions in Ibiza, then there is no relevance—the sites have no connection. Therefore, such sites use link farming and are deceitful because they just want to boost their rankings using these underhanded techniques.

 

Instead of fooling the search engines, it is better to use white-hat SEO techniques that are beneficial in the long run.

 

Quality link building, using social media appropriately, and engaging users with solid content are some of the white-hat SEO techniques. It may take weeks or even months for the results to show, but white-hat techniques are the norms that SEO experts must follow to gain visibility in SERPs.

 

Irrelevant Content


Content is king. However, if you use duplicate content or inappropriate methods such as keyword stuffing in your content, you are bound to be penalized. Content must be relevant and must engage users.

 

Fresh content is also an essential factor because the search engines show an affinity for fresh, quality content. Write content for users, and then tweak it to optimize it for the search engines.

 

Targeting the Wrong Audience

Your SEO implementation must be optimized for an appropriate target audience. If you do not target the right audience, your efforts will be wasted.

 

For example, gaming consoles, portable music players, and MP3 gadgets are appealing to youth, whereas long-term retirement plans are better suited for middle-aged users.

 

Competition


Small- and medium-sized businesses do not have a huge budget for advertising their products and services. Therefore, they need to ensure that they do not try the same approaches as large-scale enterprises that have a panel of SEO experts, funding, and expensive advertising methods.

 

You should also avoid using metadata and content similar to that of large-scale enterprises because doing so will hinder your SEO process.

 

Using keywords prevalent in the web pages of enterprise-grade organizations is detrimental because small enterprises do not have the budget, web presence, and reach for mass-scale advertising to outperform large adversaries.

 

You can use Google My Business and other Google tools (as well as third-party enhancements) to gain visibility. You can also use keywords that have less competition and target a niche market. You may see your website gain prominence on SERPs using such low-competition keywords.

 

Overlooking Social Media as a Medium


Social media marketing is no longer an optional method. It is mandatory to take advantage of the scope and reach of social media to create awareness of your brand.

 

Many organizations neglect social media, which limits their sites’ exposure. That doesn’t mean you should overextend your social media strategy by using every app on the market; you also have to stay relevant.

 

For example, you can advertise by using a concise video about your product or service on YouTube. You can tweet about your latest product update on Twitter or write a blog on WordPress that is relevant to your product.

 

Users love to read fresh and engaging content, and so do the search engines. You can also use backlinks to your site and outbound links to relevant sites (such as linking a term to Wikipedia), which will benefit users looking for informative content.

 

Ignoring UX for Your Website


Your site may have lots of jazzy features, but if your users cannot navigate easily or find it difficult to access content, then the result may be a shabby UX.

 

For example, the Add To Cart button on an e-commerce website must be easily accessible to users. UX is a crucial factor for search engines because they like sites that are popular and have a high degree of usability.

 

Missing XML and HTML Sitemaps

XML and HTML sitemaps are designed for search engines and users, respectively. Your site may have the latest updates and game-changing features.

 

But if the search engines are unable to crawl and map your site, that undermines your SEO workflow. Submit an XML sitemap to the search engines (for example, via Google Search Console) and publish an HTML sitemap on your site so that even deeply nested pages can be discovered and crawled.
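For reference, an XML sitemap is simply a list of canonical URLs with optional metadata. Here is a minimal sketch following the sitemaps.org protocol; the URLs and dates are placeholders.

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2019-01-15</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>https://www.example.com/blog/seo-tips</loc>
        <lastmod>2019-01-10</lastmod>
      </url>
    </urlset>

Conventionally, you save this as sitemap.xml at the site root and submit it through Google Search Console and Bing Webmaster Tools.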

 

Slow Page-Load Time


Slow page-load time is a deterrent to SEO. Code-heavy pages; uncompressed or unoptimized HTML and images; external embedded media; and extensive use of Flash and JavaScript all result in slow page loads.

 

Other factors can also drive up page-load times: server-side dynamic scripts, suboptimal web hosting, and insufficient bandwidth. These factors negatively affect your SEO and are major hindrances that lead to lower rankings.

 

For example, based on research and analytics data, SEO experts recommend a loading time of 2 to 3 seconds for product pages on e-commerce sites.

 

Moreover, surveys and studies on site speed indicate that users tend to abandon a site that hasn’t loaded within 3 to 4 seconds. Slow pages also make for a shabby user experience, resulting in lower conversions and sales.
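Some of these load-time problems can be addressed directly in the markup. The following HTML sketch (file names are placeholders) shows two common quick wins: non-blocking script loading and minified, right-sized assets.

    <!-- async/defer keep scripts from blocking the initial render -->
    <script src="analytics.js" async></script>
    <script src="app.min.js" defer></script>

    <!-- Serve minified CSS and images sized for how they are displayed -->
    <link rel="stylesheet" href="styles.min.css">
    <img src="product-800w.jpg" width="800" height="600" alt="Product photo">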

 

Using Flash on Your Site

In the early days of the web, Flash was an awesome resource that helped sites build interactive modules and impressive page elements. However, with the advent of HTML5, this once cutting-edge utility has taken a backseat.

 

Moreover, as the mobile and tablet market has become a dominant force, the use of Flash is considered redundant, because Flash was tailored for desktop users.

 

From a security standpoint, Flash is more vulnerable to malware and exploits. It is also not scalable: you cannot minimize or expand Flash content or set a viewport for it.

 

Everything you can do in Flash can be done more quickly and easily in HTML5. You can also use the latest web design frameworks to build interactive sites, thereby relegating Flash to the sidelines.

 

Google and other search engines cannot read Flash content reliably, although Google claims it can index the text in Flash files. Considering these pitfalls, you should use HTML5 to design and develop interactive sites with rich animation and special effects.
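For example, a video that once required a Flash player can be embedded with the native HTML5 video element; the file names below are placeholders.

    <!-- Native HTML5 playback; no browser plugin required -->
    <video controls width="640" poster="preview.jpg">
      <source src="demo.mp4" type="video/mp4">
      <source src="demo.webm" type="video/webm">
      Your browser does not support HTML5 video.
    </video>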

 

JavaScript Accessibility Issues


Client-side JavaScript, used extensively in single-page applications, helps you build dynamic and highly intuitive websites. However, search engines cannot parse JavaScript-generated content efficiently. Currently, only Google’s crawler can execute JavaScript, and only at a basic level.

 

With frameworks such as Angular and Ember being used to build dynamic web pages, search engines will have to develop the ability to understand complex JavaScript; so far, this ability is evolving. There are a few workarounds you can implement to tackle JavaScript accessibility issues.

 

Suppose a browser (or crawler) is outdated and cannot parse the latest JavaScript features and functions. You can ship fallback code (polyfills) to replicate the missing functionality, or render the content on the server so that crawlers receive plain HTML instead of scripts.
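As a concrete (and deliberately simplified) illustration, here is a tiny polyfill that backfills Array.prototype.includes for environments that predate ES2016; a production polyfill would also handle NaN comparisons and negative start indexes.

    // Simplified polyfill sketch: define Array.prototype.includes
    // only when the environment does not already provide it.
    if (!Array.prototype.includes) {
      Array.prototype.includes = function (searchElement, fromIndex) {
        var start = fromIndex || 0;
        for (var i = start; i < this.length; i++) {
          if (this[i] === searchElement) {
            return true;
          }
        }
        return false;
      };
    }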

 

However, this approach requires a lot of effort and costs more, because you must develop fallback code for every feature on the site, and the maintenance burden is heavy. It also makes your pages code-heavy, resulting in higher page-load times.

 

Another aspect to consider is a workaround for applications built with JavaScript frameworks such as Angular and Backbone. When search engines and social networks crawl such pages, they may see only the JavaScript tags, not the rendered content.

 

To ensure that these dynamic pages are accessible to the search engines, you can use a prerendering service such as Prerender (https://prerender.io/) or SEO.js (http://getseojs.com/).

 

The Prerender middleware checks all requests and, if there is a request from a search engine spider or bot, sends a request to Prerender.io for the static HTML for that JavaScript page.

 

The Prerender service uses PhantomJS to create static HTML, which is then served to the spiders for crawling. Prerender works with most JavaScript frameworks.
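A minimal Express setup using the prerender-node middleware package looks roughly like this; the token, port, and static directory are placeholders for your own configuration.

    // Sketch: serve prerendered HTML to crawlers, the normal SPA to everyone else.
    const express = require('express');
    const prerender = require('prerender-node');

    const app = express();

    // The middleware inspects each request; crawler requests are answered
    // with static HTML fetched from the Prerender service.
    app.use(prerender.set('prerenderToken', 'YOUR_TOKEN'));

    // Regular visitors get the JavaScript single-page application.
    app.use(express.static('dist'));

    app.listen(3000);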

 

An alternative to Prerender is SEO.js. Once you submit a website to the SEO.js dashboard, the service visits your web pages and creates an HTML snapshot of each page. An added advantage is that the snapshots are updated automatically.

 

Therefore, when a search engine’s spiders or bots visit your site, they see only fully rendered content from the snapshot. You can also use SEO.js to create XML sitemaps to enhance the accessibility of your web pages to the spiders.

 

Now that you have learned the core obstacles that hinder SEO, let’s look at Google’s Accelerated Mobile Pages (AMP): a new concept that will change the face of the mobile web.

 

AMP


Accelerated Mobile Pages (AMP) is a Google project aimed at the mobile web. Akin to Facebook’s Instant Articles and Apple News, AMP pages will change the way people perceive the mobile web.

 

AMP pages are web-based, meaning they are rendered in a browser. They are independent documents that are sourced from your web server. Optionally, you can store AMP documents in a CDN cache to render them more quickly.

 

AMP pages are made up of the following modules:

  • AMP HTML
  • AMP Runtime
  • AMP Cache

 

Whereas responsive websites struggle with issues such as heavy-duty desktop content being served to mobile devices, JavaScript bloat, and sluggish speed, AMP pages are designed for the mobile platform and help users view pages efficiently across various phone and tablet sizes.

 

The AMP Runtime is JavaScript baked into every AMP page; it manages the loading of AMP components and provides features such as runtime validation of AMP HTML. It also defines the priority in which resources load, resulting in an optimal page-loading experience.

 

AMP HTML documents can be stored on your server, and you can use your own CDN; but you can also take advantage of the benefits of using Google’s CDN, which streamlines the SEO processes built around these pages.
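A skeletal AMP HTML document looks like ordinary HTML with a few mandatory additions: the amp attribute on the html tag, the AMP Runtime script from Google’s CDN, a canonical link, and AMP components such as amp-img in place of regular tags. The sketch below uses placeholder URLs and elides the mandatory amp-boilerplate CSS for brevity.

    <!doctype html>
    <html amp lang="en">
      <head>
        <meta charset="utf-8">
        <title>Hello AMP</title>
        <link rel="canonical" href="https://www.example.com/article.html">
        <meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
        <!-- mandatory <style amp-boilerplate> block omitted for brevity -->
        <script async src="https://cdn.ampproject.org/v0.js"></script>
      </head>
      <body>
        <h1>Hello, AMP</h1>
        <amp-img src="photo.jpg" width="600" height="400" layout="responsive"></amp-img>
      </body>
    </html>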


When you search on Google from a desktop browser, you see the regular results. On the mobile platform, however, there is a high probability that Google will direct you to AMP pages rather than regular pages, because AMP pages load almost instantly and the AMP Runtime streamlines the use of available resources.

 

Unlike Facebook’s Instant Articles or Apple News, AMP pages (although backed by Google) are portal-agnostic and open source. They also come with built-in support for ads and analytics.

 

AMP pages are accessible from any portal: Google Search, Pinterest, or anywhere else online. In summary, AMP pages are an attractive alternative to heavy-duty websites; they display information quickly and render content effectively without the bulk or clutter.

 

Go to www.theedesign.com/blog/2016/year-of-google-amp to see the difference between a normal web page and an AMP page. You can find out more at https://www.ampproject.org/.
