How to Prevent Site Auditor Studio From Crawling Pages on a Website


There are two ways to prevent Site Auditor Studio from crawling pages on a website:

  1. Add path exclusions to the Site Auditor
  2. Add disallow rules to the robots.txt file

The path exclusion option is recommended as it offers the most flexibility.

Adding a Path Exclusion

1] In the upper right of the Site Auditor Studio page, click Settings.

Site Auditor Studio Settings.png

2] At the bottom of the Site Settings popup, click Show Advanced Options.

Site Settings Advanced Options.png

3] In the Block Pages From Crawler textbox, enter the URL you would like to exclude. You can exclude folders and specific files, and you can use wildcards to exclude query strings and other, more complicated URLs.

Block Pages From Crawler Rev 2.png
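Site Auditor Studio's exact pattern-matching rules are not documented here, but as a rough sketch, exclusion patterns with wildcards behave much like shell-style globbing. The snippet below uses Python's standard fnmatch module with hypothetical patterns (a folder, a specific file, and a query-string wildcard) to illustrate how such exclusions might be tested against URL paths:

```python
# Illustrative sketch only: the exact matching rules used by Site Auditor
# Studio are an assumption here. This models exclusions as shell-style
# wildcards, where "*" matches any run of characters.
from fnmatch import fnmatch

# Hypothetical exclusion patterns (not from the product's documentation):
exclusions = [
    "/private/*",              # everything under a folder
    "/drafts/old-page.html",   # one specific file
    "*sessionid=*",            # any URL containing a session-id query string
]

def is_excluded(path: str) -> bool:
    """Return True if the URL path matches any exclusion pattern."""
    return any(fnmatch(path, pattern) for pattern in exclusions)

print(is_excluded("/private/report.pdf"))            # True: excluded folder
print(is_excluded("/catalog/item?sessionid=abc123")) # True: wildcard match
print(is_excluded("/blog/post-1.html"))              # False: not excluded
```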

4] Click the Add button to add the excluded page(s). The excluded pages will appear below.

5] Repeat steps 3 and 4 to add more pages to exclude.

6] To delete an exclusion, click the Remove button to its right.

Remove Path Exclusion.png

7] Click the "X" in the upper right when you are finished.

Adding a Disallow Rule

1] Add a User-agent line for RavenCrawler to your robots.txt file, followed by one Disallow line for each path you want blocked. For example, to block everything under /example-folder/ (a placeholder path):

User-agent: RavenCrawler
Disallow: /example-folder/
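Compliant crawlers read robots.txt before fetching pages. As a sketch of how a disallow rule is interpreted, the snippet below uses Python's standard urllib.robotparser with a hypothetical rule blocking RavenCrawler from a /private/ folder (the path is an example, not from the original article):

```python
# Sketch: how a robots.txt disallow rule is honored by a compliant crawler.
# The /private/ path below is a hypothetical example.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: RavenCrawler
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# RavenCrawler may fetch public pages but not anything under /private/.
print(parser.can_fetch("RavenCrawler", "/index.html"))         # True
print(parser.can_fetch("RavenCrawler", "/private/report.pdf")) # False
```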