Website SEO, a.k.a. Search Engine Optimization, is important for every modern website. However, manually validating SEO criteria is quite cumbersome. Fortunately, most of these checks can be automated using Python.

In this article, I’m going to demonstrate some ideas using basic Python code snippets. These scripts will serve as a good starting point for writing tools that provide insights about a website’s SEO. There are various Python libraries available on pip that can help; for now, we will use requests_html.

Installation

To install requests_html, open up your terminal and type:

pip install requests_html

and that’s it!

It will download all the dependencies and install requests_html. If you face any issues, search for them on the Internet; if you still can’t fix them, feel free to comment below and I will try my best to help you.

1. Check Whether gzip/brotli Compression Is Enabled

Compression is necessary for good performance. Enabling compression for resources such as JS and CSS will improve website performance.
So, we will begin by checking whether the website compresses its content. The script for this is quite simple; you can even run it in an interpreter.

import requests_html

session = requests_html.HTMLSession()
r = session.get("https://wasi0013.com/")
# the Content-Encoding response header names the compression scheme, if any
print(r.headers.get("Content-Encoding"))

In the above script, we check the “Content-Encoding” response header to find out whether compression is enabled. If gzip compression is enabled, it will output gzip; if brotli compression is enabled, it will print br (the header token brotli-capable servers send).

2. Find Broken Links

The next script we will write checks for broken links on our website. We must make sure there are no broken links, because Google and many other search engine bots dislike broken links and usually penalize websites that have them. To find broken links we could use many Python tools, such as scrapy or requests. However, since we already have requests_html installed, why not write the script using it? Let’s do it!
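The post does not include the script itself here, so the sketch below reconstructs one plausible approach with requests_html: collect every absolute link on the page, issue a HEAD request to each, and report any that fail or return a 4xx/5xx status. The function name and the injectable `session` parameter are my own additions, not from the original post.

```python
def find_broken_links(url, session=None):
    """Fetch `url` and return (link, status) pairs for links that look broken."""
    if session is None:
        # imported lazily so a pre-built session object can be injected
        from requests_html import HTMLSession
        session = HTMLSession()
    r = session.get(url)
    broken = []
    for link in sorted(r.html.absolute_links):
        try:
            # HEAD avoids downloading the whole target page
            status = session.head(link, allow_redirects=True, timeout=10).status_code
        except Exception:
            status = None  # unreachable hosts count as broken too
        if status is None or status >= 400:
            broken.append((link, status))
    return broken
```

Note that some servers reject HEAD requests with a 405; falling back to a GET in that case would make the checker more robust.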

3. Find Images that Need Optimization

Slow page loads hurt SEO; Google ranks faster websites higher. One of the most common reasons for slow loading is unoptimized images, i.e. images that are not optimized for the web and are uploaded as-is, in their original size and without compression. Using the following script, we can easily detect such problems on any website.
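The script is missing from the post, so here is one way it could look: list every `img` element on the page and flag those whose transfer size (taken from the Content-Length header of a HEAD request) exceeds a threshold. The 200 KB threshold, function name, and `session` parameter are illustrative assumptions, not values from the original article.

```python
from urllib.parse import urljoin

def find_heavy_images(url, max_bytes=200 * 1024, session=None):
    """Return (image URL, size) pairs for images larger than `max_bytes`."""
    if session is None:
        # imported lazily so a pre-built session object can be injected
        from requests_html import HTMLSession
        session = HTMLSession()
    r = session.get(url)
    heavy = []
    for img in r.html.find("img"):
        src = img.attrs.get("src")
        if not src:
            continue
        src = urljoin(url, src)  # resolve relative paths against the page URL
        head = session.head(src, allow_redirects=True)
        # Content-Length may be absent (e.g. chunked responses); treat that as 0
        size = int(head.headers.get("Content-Length", 0))
        if size > max_bytes:
            heavy.append((src, size))
    return heavy
```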


4. Check HTTP/2 Support

To check whether a website supports HTTP/2, I usually use the keycdn website. However, visiting that site and submitting a URL every time is quite time-consuming, so we will automate this task using a simple scraper like the one below:


Note that the script relies on the layout of the keycdn website; hence, it may need an update to keep working if their site changes.
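Since the keycdn scraper itself is not shown in the post, and guessing its markup-dependent selectors here would be unreliable, the sketch below takes a layout-independent route instead: this is my substitution, not the author's keycdn approach. The `httpx` library can negotiate HTTP/2 over ALPN itself when installed with its http2 extra (`pip install 'httpx[http2]'`).

```python
def supports_http2(url, client=None):
    """Return True if the server for `url` negotiates HTTP/2."""
    if client is None:
        # httpx performs the ALPN negotiation itself when http2=True;
        # requires: pip install 'httpx[http2]'
        import httpx
        client = httpx.Client(http2=True)
    with client:
        resp = client.get(url)
    # http_version is "HTTP/2" when the h2 protocol was negotiated
    return resp.http_version == "HTTP/2"
```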

We have written only four scripts so far, but it is possible to write many more. For example: a script to measure TTFB (Time To First Byte); a script to detect render-blocking JavaScript, i.e. whether content is generated by JavaScript at runtime; or a keyword analysis tool built with libraries like nltk which, combined with Google Trends’ Python library, can yield useful insights about a website’s SEO.
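To illustrate the first of those ideas, TTFB can be roughly approximated with plain requests: with `stream=True` the response body is not downloaded, so the response’s `elapsed` attribute measures approximately the time until headers arrive. This is a sketch of that approximation, not a precise network timing, and the `timer` parameter is my own addition for testability.

```python
def time_to_first_byte(url, timer=None):
    """Roughly measure TTFB in seconds for `url`."""
    if timer is None:
        # stream=True keeps requests from reading the body, so `elapsed`
        # approximates time-to-first-byte rather than full download time
        import requests
        timer = lambda u: requests.get(u, stream=True, timeout=10).elapsed.total_seconds()
    return timer(url)
```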


About the Author ওয়াসী (Wasi)

Crawl, Scrape, Extract, Compare and automate scripts for Data Mining & Analysis, Web Scraping, Lead Generation etc. Contact me for more details about my services.
