A Basic Guide to SEO Audits

When I started at evolutia, I had never done an SEO audit. I understood the principles of what an SEO audit was, but I had never had the opportunity to do one myself. My first SEO audit was a long and tedious process, as I had to get to grips with all the tools available to me. Now, I am in no way insinuating it is an easy task, because regardless of how often you have done it before, it is not easy. But if you know what to check for on a website and, more importantly, which tools you can use, it will make your SEO audit a lot easier.

While search engine algorithms are ever changing, the basics of SEO have generally remained the same. Google provides plenty of free tools that are an excellent starting point and, more often than not, are sufficient for a decent SEO audit.

Crawl, Read and Index

First things first, you need to make sure Google can crawl, read and index your pages. Google makes this really easy for you. In Google Webmaster Tools > Site Configurations > Sitemaps you will see a list of sitemaps you have submitted for your site. If you don’t, you may not have submitted a sitemap yet.

Here in Sitemaps you will be able to see if your website has any indexing problems – compare the number of “URLs submitted” to the number of “URLs in web index”. Ideally these should be the same, but often they won’t be: pages listed in the sitemap that are blocked by the robots.txt will not make it into the index. Another reason can be duplicate URLs in the sitemap, as the index will only ever show unique URLs.
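
If you want to sanity-check the sitemap itself, a short script can compare the total number of URLs against the number of unique ones, which makes duplicates easy to spot. Below is a rough sketch in Python; it assumes a plain XML sitemap rather than a sitemap index, and the address is a placeholder for your own.

    # Rough sketch: compare total vs unique URLs in a sitemap to spot duplicates.
    # Assumes a plain XML sitemap (not a sitemap index); the address is a placeholder.
    import urllib.request
    import xml.etree.ElementTree as ET

    SITEMAP_URL = "http://www.yourwebsite.com/sitemap.xml"
    NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

    with urllib.request.urlopen(SITEMAP_URL) as response:
        tree = ET.parse(response)

    urls = [loc.text.strip() for loc in tree.iter(NS + "loc")]
    print("URLs submitted:", len(urls))
    print("Unique URLs:   ", len(set(urls)))  # duplicates inflate the first number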

Another thing to check is whether the pages are getting crawled, by simply looking at the “Cache” of a page. The easiest way to do this is to type “cache:http://www.yourwebsitehere.com” into Google. At the top it will show you when Google last crawled the page. You want the search engines to crawl your website frequently, so keep it updated.

While you are in the cached version of the website, check if the content is visible to the search engines – click on “Text-only version”, which strips out JavaScript, CSS and cookies. Compare the text version (how the search engines see the page) to the user version. If the website is showing different content to the search engines than it is to the users, that is called “cloaking” and it can result in penalties for the website.
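
Another rough way to spot-check for cloaking, beyond the cached copy, is to request the page once with a normal browser user-agent and once with a Googlebot-style user-agent and compare what comes back. The sketch below does just that; the URL is a placeholder, and bear in mind that dynamic elements like dates or session IDs can make the two responses differ harmlessly.

    # Rough spot-check for cloaking: fetch the page once with a browser-style
    # user-agent and once with a Googlebot-style user-agent and compare.
    # The URL is a placeholder; dynamic elements can cause harmless differences,
    # so treat a mismatch as a prompt to look closer, not as proof of cloaking.
    import urllib.request

    URL = "http://www.yourwebsitehere.com/"

    def fetch(user_agent):
        request = urllib.request.Request(URL, headers={"User-Agent": user_agent})
        with urllib.request.urlopen(request) as response:
            return response.read()

    browser_html = fetch("Mozilla/5.0 (Windows NT 6.1; rv:10.0) Gecko/20100101 Firefox/10.0")
    googlebot_html = fetch("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)")

    if browser_html == googlebot_html:
        print("Both user-agents received identical HTML.")
    else:
        print("The responses differ - worth checking for cloaking.")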

Robots.txt

A robots.txt file can help you block search engines from crawling certain pages or folders of a website. If used incorrectly, a robots.txt can cause all sorts of problems. If you look at the cache of a page and there is no cache, or it hasn’t been crawled for a long time, check the robots.txt to see if the page has been blocked.

Again, Webmaster Tools makes it really easy to see which pages or folders are being blocked. Select your website and go to Site Configurations > Crawler Access. It should be fairly easy to see if the robots.txt is blocking any pages or folders that shouldn’t be hidden from the search engines. Find out more about robots.txt here.
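
You can also test individual URLs against the live robots.txt yourself. The sketch below uses Python’s built-in robotparser to ask whether Googlebot is allowed to fetch a few example URLs; the addresses are placeholders for your own pages.

    # Quick check of whether specific URLs are blocked by the live robots.txt,
    # using Python's built-in robotparser. The URLs below are placeholders.
    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser("http://www.yourwebsite.com/robots.txt")
    parser.read()

    for url in [
        "http://www.yourwebsite.com/",
        "http://www.yourwebsite.com/blog/",
        "http://www.yourwebsite.com/category/widgets/",
    ]:
        if parser.can_fetch("Googlebot", url):
            print("Allowed:", url)
        else:
            print("Blocked by robots.txt:", url)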

Canonicalization

Canonicalization issues arise when multiple URLs serve the same content, which is essentially a form of duplicate content. Examples are:

  • http://www.yourwebsite.com
  • http://yourwebsite.com
  • http://www.yourwebsite.com/index.html
  • http://yourwebsite.com/index.html
  • http://www.yourwebsite.com/INDEX.html
  • https://www.yourwebsite.com/index.html

All of these URLs will most likely show the same content, and different external websites may link to any one of them.

This is a big deal: all the links that a single page would otherwise receive are spread across these different URLs, which dilutes their value and hurts the rankings. Checking your canonicalization is crucial; otherwise you might end up with identical pages competing with each other.

Identifying this problem is also pretty easy: just copy a unique sentence from the content and paste it into Google inside quotes. Ideally only one page should show up, the one you took the sentence from. If you see more results, however, then you might have duplicate content issues. In this case check the URLs and see if the duplication is caused by a canonicalization issue.
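
If you would rather check this directly against the server, a small script can request each URL variant and show which ones serve content straight away and which redirect. The sketch below uses the placeholder domains from the list above and HEAD requests to keep things quick.

    # Sketch: request each homepage variant (without following redirects) and see
    # whether everything points at one preferred version. Domains are placeholders.
    import http.client
    from urllib.parse import urlsplit

    VARIANTS = [
        "http://www.yourwebsite.com/",
        "http://yourwebsite.com/",
        "http://www.yourwebsite.com/index.html",
        "https://www.yourwebsite.com/index.html",
    ]

    for url in VARIANTS:
        parts = urlsplit(url)
        if parts.scheme == "https":
            connection = http.client.HTTPSConnection(parts.netloc, timeout=10)
        else:
            connection = http.client.HTTPConnection(parts.netloc, timeout=10)
        connection.request("HEAD", parts.path or "/")
        response = connection.getresponse()
        location = response.getheader("Location") or ""
        print(response.status, url, location)
        connection.close()

Ideally one variant answers with a 200 and every other variant answers with a 301 pointing at it.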

Check Redirects/Crawl Errors

Webmaster Tools can be helpful here too, by showing what errors Google runs into while crawling the website. In Diagnostics > Crawl Errors you will find a list of the problems Google finds when crawling your website. Expect to see 404s, not-followed URLs, redirect errors, timeouts, etc. You will also see if an external link is pointing to the wrong URL.

Check each redirected URL’s response with an HTTP headers check.
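
A quick way to do that check from a script is to follow the redirect chain hop by hop and print the status code of each response. The sketch below does this; the start URL is a placeholder.

    # Sketch of an HTTP headers check: follow a redirect chain hop by hop and
    # print each status code. The start URL is a placeholder.
    import http.client
    from urllib.parse import urljoin, urlsplit

    def check_redirects(url, max_hops=5):
        for _ in range(max_hops):
            parts = urlsplit(url)
            if parts.scheme == "https":
                connection = http.client.HTTPSConnection(parts.netloc, timeout=10)
            else:
                connection = http.client.HTTPConnection(parts.netloc, timeout=10)
            connection.request("HEAD", parts.path or "/")
            response = connection.getresponse()
            print(response.status, response.reason, url)
            location = response.getheader("Location")
            connection.close()
            if response.status in (301, 302, 303, 307, 308) and location:
                url = urljoin(url, location)  # the Location header may be relative
            else:
                return

    check_redirects("http://yourwebsite.com/old-page.html")

What you generally want to see is a single permanent 301 straight to the final page, rather than a chain of hops or a temporary 302.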

Title tags/H1 tags

Should there be any issues with duplicate, missing or short title tags, H1 heading tags or meta descriptions, then Google Webmaster Tools can quickly help you! Simply go to Diagnostics > HTML Suggestions and you will find a list of the problems that Google has found.

If there are duplicate title tags, that might lead to “keyword cannibalization”, which means that more than one page on your site is competing to rank for the same keyword. While you are checking for duplicate title tags, make sure the keyword or search term appears before the brand name.
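
To check titles across more than a handful of pages, a short script can pull the <title> from each page and flag duplicates or very short ones. The sketch below uses placeholder URLs and a simple regular expression; for a real audit you would feed in the full URL list from your sitemap and use a proper HTML parser.

    # Sketch: pull the <title> from a handful of pages and flag duplicates or
    # very short titles. The URLs are placeholders.
    import re
    import urllib.request
    from collections import defaultdict

    PAGES = [
        "http://www.yourwebsite.com/",
        "http://www.yourwebsite.com/services.html",
        "http://www.yourwebsite.com/contact.html",
    ]

    titles = defaultdict(list)
    for url in PAGES:
        with urllib.request.urlopen(url) as response:
            html = response.read().decode("utf-8", errors="replace")
        match = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
        title = match.group(1).strip() if match else ""
        titles[title].append(url)
        if len(title) < 30:
            print("Short or missing title (%d chars): %s" % (len(title), url))

    for title, urls in titles.items():
        if len(urls) > 1:
            print("Duplicate title '%s' on: %s" % (title, ", ".join(urls)))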
