Why did I get a “page is not accessible” note for some of my pages?

This message appears if the On Page SEO Checker crawler was blocked or otherwise unable to crawl your page. Please check the robots.txt file on your website to make sure that it allows our user agents to crawl its pages.
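
For example, if your robots.txt restricts crawlers by default, an entry like the following explicitly allows the On Page SEO Checker user agent (a minimal sketch; adapt it to the rules already in your file):

User-agent: SemrushBot-SI
Allow: /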

If our bots are not blocked in robots.txt, fix the issue by whitelisting the following IP addresses and user agent with your hosting provider and with any plugins or services you manage your site with (e.g., Cloudflare, ModSecurity):

85.208.98.53
85.208.98.0/24
User-agent: SemrushBot-SI

To specify the port, use one of the following options:

Port 80: HTTP
Port 443: HTTPS

You should also whitelist the Site Audit bot, which crawls your pages from the following IP range:

85.208.98.128/25 (a subnet used by Site Audit only)
User-agent name: SiteAuditBot
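
If you want to verify how your live robots.txt treats these user agents before re-running the check, Python's standard urllib.robotparser module can simulate the lookup. This is a minimal sketch; the example.com URLs are placeholders for your own domain and landing page:

from urllib import robotparser

# Placeholders: replace with your own domain and the landing page in question.
ROBOTS_URL = "https://www.example.com/robots.txt"
PAGE_URL = "https://www.example.com/landing-page/"

# User agents used by On Page SEO Checker and Site Audit, as listed above.
USER_AGENTS = ["SemrushBot-SI", "SiteAuditBot"]

parser = robotparser.RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # downloads and parses the live robots.txt

for agent in USER_AGENTS:
    allowed = parser.can_fetch(agent, PAGE_URL)
    delay = parser.crawl_delay(agent)  # None if no crawl-delay is set for this agent
    print(f"{agent}: {'allowed' if allowed else 'blocked'}, crawl-delay={delay}")

Note that this only reflects your robots.txt rules; IP-level blocks from a firewall or security plugin have to be checked with your hosting provider or the service itself.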

If you receive the error message "SEMRushBot-Desktop couldn't crawl the page because it was blocked by robots.txt," the crawl-delay setting in your robots.txt does not comply with On Page SEO Checker's requirements.

On Page SEO Checker crawlers only accept a crawl-delay of 1 second. Any higher value makes the crawler ignore the page, which can trigger this message and leave the landing page with only a few optimization ideas.

To fix this issue, change the crawl-delay within your robots.txt to 1 second. 
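
For example, a compliant robots.txt entry for the On Page SEO Checker user agent could look like this (a minimal sketch; keep any other directives your file already contains):

User-agent: SemrushBot-SI
Crawl-delay: 1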

 
