03/08/17 // Written by Emma Phillips

Detecting Indexation Issues

So, you have a shiny new website and, full of enthusiasm, you wait anxiously for rankings and traffic to materialise, to no avail. The content is great: you have spent weeks, if not months, building it up and beta testing it within your organisation and with key stakeholders.

Progress, if any, is slow. Before panic sets in, best practice is to run some very simple tests to see whether the most common issues – crawling, caching and indexation – are stalling your progress.

More often than not, these simple tests were never run in the final stages of launch, and the fix can be as easy as turning off the measures that were put in place to control early indexation.

Check Crawlability

One of the simplest checks is to crawl the site with free or commercial software and see whether any restrictions are still switched on. Use Xenu, Screaming Frog or another crawler to check for controls that stop crawling. You will know within a matter of moments if the crawl ends abruptly.
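As a quick first sanity check before running a full crawl, a minimal Python sketch (assuming a hypothetical site at https://example.com) can confirm whether robots.txt even permits crawling:

    from urllib import robotparser

    # Fetch and parse the site's robots.txt (hypothetical URL)
    rp = robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()

    # Can a generic crawler fetch the homepage?
    if rp.can_fetch("*", "https://example.com/"):
        print("robots.txt allows crawling")
    else:
        print("robots.txt is blocking crawlers")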

What are you looking for?

When you launch a new site and progress is slow, the first thing you should look for is a sitewide blanket rule of <meta name="robots" content="noindex, nofollow">.

This, or any other combination containing NOINDEX, will stop Google indexing the site. NOFOLLOW, if implemented at the start point of your crawl (usually the homepage), will stop most crawlers in their tracks, as there are no links they are allowed to follow.
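If you would rather check this programmatically, here is a minimal sketch using Python's requests library and the standard-library html.parser, again assuming a hypothetical https://example.com:

    import requests
    from html.parser import HTMLParser

    class RobotsMetaParser(HTMLParser):
        """Collects the content of any <meta name="robots"> tag."""
        def __init__(self):
            super().__init__()
            self.robots_content = None

        def handle_starttag(self, tag, attrs):
            if tag == "meta":
                attrs = dict(attrs)
                if (attrs.get("name") or "").lower() == "robots":
                    self.robots_content = attrs.get("content") or ""

    resp = requests.get("https://example.com/")  # hypothetical URL
    parser = RobotsMetaParser()
    parser.feed(resp.text)

    if parser.robots_content and "noindex" in parser.robots_content.lower():
        print("Blocking meta robots tag found:", parser.robots_content)
    else:
        print("No blocking meta robots tag on this page")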

Remove any robots restrictions, set the tag to allow indexing and following (or remove it altogether), and re-run the crawl. The full crawl results can then be analysed at a more granular level to see whether any further restrictions are stalling progress.

X-Robots Checks

While many commercial crawlers will pick up server-side X-Robots-Tag directives, free-to-use software is not as thorough, and this check is often overlooked. Because the directive sits in the HTTP response headers rather than in the HTML of the page, it is hard to spot, yet it can stop search engines from indexing and caching your content.

Our tool of preference for checking any X-Robots-Tag issues is a Chrome and Firefox plugin by Chris Pederick called Web Developer. Full documentation is provided on the plugin's website; the process involves viewing the Response Headers from within the plugin's wide range of functionality.
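If you prefer to check without a browser, a minimal Python sketch with the requests library (hypothetical URL again) surfaces the same header:

    import requests

    resp = requests.get("https://example.com/")  # hypothetical URL
    x_robots = resp.headers.get("X-Robots-Tag")

    if x_robots:
        # e.g. "noindex, noarchive" would block indexing and caching
        print("X-Robots-Tag header found:", x_robots)
    else:
        print("No X-Robots-Tag header on this URL")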

WordPress Checks

Many of us work with WordPress for the blog content that helps with long-tail rankings. This is where many forget to untick the simplest of boxes, a step fundamental to launching your site, and one that can leave website owners pulling their hair out.

Remember when you started the website and you or your developers diligently ticked the box that says “Discourage search engines from indexing this site” (under Settings > Reading)? Untick this and your site is good to go.
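When that box is ticked, WordPress typically serves a virtual robots.txt with a blanket disallow (exact behaviour varies by version), so a quick check against the hypothetical https://example.com can confirm whether it is still active:

    import requests

    # Hypothetical URL; WordPress generates this file virtually
    resp = requests.get("https://example.com/robots.txt")

    blocked = any(line.strip() == "Disallow: /" for line in resp.text.splitlines())
    print("Blanket disallow found" if blocked else "No blanket disallow in robots.txt")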

If you are looking for further advice to manage your SEO campaigns, contact our experts today!