Best On-Page SEO Checklist Follow-Up

SEO Checklist: We often describe SEO as generating backlinks to increase referral traffic to our website and ultimately improve our domain authority. But while updating the website or its plugins, or adding new functionality, we can introduce accidental errors that severely affect how the site works, and with it, the site's ranking. Some errors are dangerous enough to push your website into the high-load-time bracket, increasing the bounce rate and ultimately driving your site into the ground.

With Google RankBrain, user experience should be everything for you, and to improve it, on-page SEO is a must.

To learn more about Google RankBrain, read here >> SEO Tips Hack Google RankBrain

On-Page SEO Checklist to follow periodically

1. Check your site for broken links

Pages with broken links (whether internal or external) can lose rankings in search results. And while you have control over your internal links, you have no control over external ones.

There is a huge possibility that a web page or resource that you linked to no longer exists or has been moved to a different URL, resulting in a broken link.

Therefore, it is recommended to check for broken links periodically.

Tools: Google Search Console, Screaming Frog
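
If you prefer to script this check yourself, here is a minimal sketch in Python, assuming the third-party requests library is installed and using a hypothetical list of links pulled from your pages:

    import requests

    # Hypothetical list of links extracted from your pages.
    links = [
        "https://example.com/about",
        "https://example.com/old-post",
    ]

    for url in links:
        try:
            # HEAD keeps the check fast; some servers only answer GET.
            status = requests.head(url, allow_redirects=True, timeout=10).status_code
        except requests.RequestException as exc:
            print(f"BROKEN {url} ({exc})")
            continue
        if status >= 400:
            print(f"BROKEN {url} (HTTP {status})")

Anything reported here is worth rechecking by hand before you remove or update the link, since some servers block automated requests.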

2. Use the site: command to check for low-value pages in the Google index

Type “site:website.com” into Google to display all the pages on your site that Google has indexed.

By roughly scanning through these results, you should be able to check if all pages indexed are of good quality or if there are some low-value pages present.

Quick Tip: Change the Google Search settings to display 100 results at a time to easily scan through all results quickly.

For Example: A low-value page would be the ‘search result’ page. You might have a search box on your site, and there is a possibility that all search result pages are being crawled and indexed. All these pages contain nothing but links, and hence are of little to no value. It is best to keep these pages from getting indexed.

Another example would be the presence of multiple versions of the same page in the index. This can happen if you run an online store and your search results can be sorted.

Here’s an example of multiple versions of the same search page:

  • http://sitename.com/products/search?q=chairs
  • http://sitename.com/products/search?q=chairs&sort=price&dir=asc
  • http://sitename.com/products/search?q=chairs&sort=price&dir=desc

You can easily exclude such pages from being indexed by disallowing them in Robots.txt, or by using the Robots meta tag. You can also block certain URL parameters from getting crawled using the Google Search Console by going to Crawl > URL Parameters.
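
For the example above, a couple of lines in Robots.txt would keep crawlers out of the internal search results (the /products/search path comes from the sample URLs; adjust it to your own site):

    User-agent: *
    Disallow: /products/search

Alternatively, leave the pages crawlable but add a Robots meta tag to the search-result template so they stay out of the index:

    <meta name="robots" content="noindex, follow">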


3. Check Robots.txt to see if you are blocking important resources

Blocking any content in your Robots.txt means blocking it from getting crawled; content that cannot be crawled will not bring you any Google organic traffic.

In the same way, if Google bots cannot access the JavaScript or CSS on your site, they cannot determine if your site is responsive or not. So even if your site is responsive, Google will think it is not, and as a result, your site will not rank well in mobile search results.

To find out if you are blocking out important resources, log in to your Google Search Console and go to Google Index > Blocked Resources. Here you should be able to see all the resources that you are blocking. You can then unblock these resources using Robots.txt (or through .htaccess if need be).

To double-check that these resources are now crawlable, go to Crawl > Robots.txt Tester in your Google Search Console, enter the URL in the space provided, and click “Test”.
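
As a sketch, explicit Allow rules like these in Robots.txt make sure Googlebot can fetch your stylesheets and scripts (the /assets/ paths are hypothetical; use the paths reported as blocked in your own Search Console):

    User-agent: Googlebot
    Allow: /assets/css/
    Allow: /assets/js/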

4. Check the HTML source of your important posts and pages to ensure everything is right

The HTML source is the best way to ensure that all of your SEO-based meta tags are being added to the right pages. It’s also the best way to check for errors that need to be fixed.

How to check:

  1. Open the page that needs to be checked in your browser window.
  2. Press CTRL + U on your keyboard to bring up the page source, or right-click on the page and select “View Source”.
  3. Now check the content within the ‘head’ tags ( <head> </head> ) to ensure everything is right.


What to look for:

  1. Check if the page has multiple instances of the same meta tag, like the title or meta description tag.
  2. Check if the page has a Robots meta tag, and ensure that it is set up properly.
  3. If it is a paginated page, check if it has proper rel=”next” and rel=”prev” link tags.
  4. Check if the page has proper OG tags (especially the “og:image” tag), Twitter cards, and other social media meta tags.
  5. Check if the page has a rel=”canonical” tag and make sure it points to the proper canonical URL.
  6. Check if the page has a viewport meta tag (for mobile responsiveness).
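
For reference, a head section that passes all of the checks above could look roughly like this (every URL and title here is a placeholder):

    <head>
      <title>Blue Office Chairs | Sitename</title>
      <meta name="description" content="Compare our full range of blue office chairs.">
      <meta name="viewport" content="width=device-width, initial-scale=1">
      <meta name="robots" content="index, follow">
      <link rel="canonical" href="http://sitename.com/products/chairs">
      <link rel="prev" href="http://sitename.com/products/chairs?page=1">
      <link rel="next" href="http://sitename.com/products/chairs?page=3">
      <meta property="og:title" content="Blue Office Chairs">
      <meta property="og:image" content="http://sitename.com/images/chairs.jpg">
      <meta name="twitter:card" content="summary_large_image">
    </head>

Each tag should appear exactly once, and the canonical URL should be the version of the page you want indexed.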

5. Check for mobile usability errors

Even if your site is responsive, it is hard to predict what the Google or Bing bots will see. Even a small change, like blocking a resource, can make your responsive site look unresponsive in Google’s view. So keep checking.

Tool: Google Search Console. Search Traffic > Mobile Usability (check if any of your pages show mobile usability errors).

6. Check for render-blocking scripts

The more JavaScript and CSS references a page has, the longer it takes to load, because the browser has to fetch and process each of these files before it can finish rendering the page.

If you find unwanted scripts, restrict them to only the pages that require them so they do not load where they are not needed. You can also consider adding a defer or async attribute to JavaScript files, as shown below.
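
For example, with a hypothetical analytics.js file:

    <!-- Blocks rendering while it downloads and executes: -->
    <script src="/js/analytics.js"></script>

    <!-- defer: downloads in parallel, executes after the HTML is parsed: -->
    <script src="/js/analytics.js" defer></script>

    <!-- async: downloads in parallel, executes as soon as it arrives: -->
    <script src="/js/analytics.js" async></script>

Use defer for scripts that depend on the page (or on each other), and async for independent scripts such as analytics.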


7. Check and monitor site downtimes

It’s imperative to monitor your site’s uptime on a constant basis.

Tools: Uptime Robot, Jetpack Monitor, Pingdom, Montastic, AreMySitesUp, and Site24x7. If your site experiences frequent downtime, it’s time to switch web hosts.
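
If you want a quick self-hosted check to run alongside these tools (from cron, for example), a minimal Python sketch could look like this; the URL is a placeholder:

    import requests

    URL = "https://example.com"  # placeholder: your site's homepage

    try:
        response = requests.get(URL, timeout=10)
        if response.status_code == 200:
            print("UP")
        else:
            print(f"DOWN (HTTP {response.status_code})")
    except requests.RequestException as exc:
        print(f"DOWN ({exc})")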


Other Important SEO Checklist Points:

  • Keep Meta Descriptions between 150 and 160 characters.
  • Keep Heading tags between 15 and 65 characters.
  • Add alt tags to all images.
  • Add keywords in Site Title, Meta Description and Heading Tags.
  • Minify JavaScript and CSS.
  • Use Gzip compression to reduce response size by up to 70%. Just add a snippet to your .htaccess file and you’re good to go; a sample follows this list. Check here >>> https://gtmetrix.com/enable-gzip-compression.html
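
As a sketch, the usual Apache mod_deflate snippet for .htaccess looks like this (whether the module is enabled depends on your host; the GTmetrix page linked above has the full version):

    <IfModule mod_deflate.c>
      AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css
      AddOutputFilterByType DEFLATE application/javascript application/json
    </IfModule>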

Please leave your comments below, tell me how much you liked it, and let me know what should be added to make this checklist complete. Thanks 🙂
