How do I prevent Site Auditor from crawling pages of my website?



There are two ways to prevent Site Auditor from crawling pages on your website: adding Disallow rules to your robots.txt file, or adding Path Exclusions in Site Auditor itself. We recommend Path Exclusions, as they give you the most flexible control over what is excluded.
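If you choose the robots.txt route, a minimal example might look like the following. This is a generic sketch: it blocks all crawlers that honor robots.txt (the `*` user-agent), since the specific user-agent token Site Auditor identifies itself with is not documented here. The folder and file names are placeholders.

```
# Block all compliant crawlers from an example folder and file
User-agent: *
Disallow: /private/
Disallow: /downloads/report.pdf
```

Keep in mind that robots.txt rules apply to every crawler that respects them, not just Site Auditor, which is another reason Path Exclusions are usually the better choice.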

There are two ways to add an exclusion to your crawl from within Site Auditor: through the Tool Options menu, or within the Content section.

Tool Options

The settings in Tool Options > Settings provide the most flexibility for adding path exclusions:

  1. From within SEO Research > Site Auditor, click the Tool Options button and select Customize Settings.
  2. On the Settings page click the Create New Exclusion button.
  3. Enter the URL you would like excluded from your crawl. You can exclude folders or specific files, and even use wildcards to exclude query strings and other more complicated URLs.
  4. Click the Create New Exclusion button to save your exclusion.

The exclusion is saved to the Path Exclusions list. To delete it later, click the gear icon next to the exclusion and choose Delete from the menu.
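To illustrate the kinds of exclusions the steps above allow, here are some example patterns. These are illustrative only: the paths are placeholders, and the `*` wildcard syntax is an assumption based on common crawler-exclusion conventions rather than documented Site Auditor behavior.

```
/category/           excludes everything under the /category/ folder
/downloads/file.pdf  excludes a single specific file
/*?sort=*            excludes any URL containing a "sort" query string
```

Start with the narrowest pattern that covers the pages you want skipped, then verify the results after your next crawl.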


Content

The second way to create exclusions for your crawl is from within the Content tab of Site Auditor.

  1. Navigate to SEO Research > Site Auditor > Content and click the gear icon for the URL to exclude.
  2. From the gear icon, select the Exclude URL from Future Crawls option.
  3. A message will appear at the top of your screen confirming the URL has been excluded.