If you encounter the 'page is not accessible' error in SEMrush, the On-Page SEO Checker crawler is either blocked from your page or unable to reach it. Here are some possible causes and the fixes you can try:
- Check your robots.txt file: Make sure that SEMrush's user agents are not disallowed from crawling your website's pages (see the robots.txt sketch after this list). If robots.txt is not blocking them, you will need to allow the following IP addresses and user agents with your hosting provider and with any plugins or services you manage your site through (such as Cloudflare or ModSecurity):
  - IP addresses: 85.208.98.53 and 85.208.98.0/24
  - User-agent: SemrushBot-SI
  - Ports: 80 (HTTP) and 443 (HTTPS)
  - Site Audit bot IP range: 85.208.98.128/25 (a subnet used by Site Audit only)
  - Site Audit user-agent: SiteAuditBot
- Check your crawl-delay settings: If you receive the error message 'SemrushBot-Desktop couldn't crawl the page because it was blocked by robots.txt,' your crawl-delay settings do not meet On-Page SEO Checker's requirements. The On-Page SEO Checker crawler only accepts a crawl-delay of 1 second; any higher value causes it to ignore the page. To fix this, set the crawl-delay in your robots.txt to 1 second, as shown in the sketch after this list.
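For reference, here is a minimal robots.txt sketch that permits both crawlers and sets a crawl-delay of 1 second. It assumes you want to allow these bots site-wide; adjust the Allow and Disallow rules to match your own site's layout:

```
# Permit the On-Page SEO Checker crawler
User-agent: SemrushBot-SI
Allow: /
Crawl-delay: 1

# Permit the Site Audit crawler
User-agent: SiteAuditBot
Allow: /
Crawl-delay: 1
```

Keep in mind that robots.txt only covers crawler directives; the IP addresses listed above still need to be allowed at the server or firewall level if your hosting provider, Cloudflare, or ModSecurity is blocking them.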
If these solutions do not work, check for other common crawlability problems such as nofollow links, redirect loops, poor site structure, or slow site speed. The SEMrush Site Audit tool can help you identify these issues. If you want to confirm that your updated robots.txt now permits the crawlers, a quick check is sketched below.
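As a quick sanity check, the Python sketch below uses the standard library's urllib.robotparser to verify that a page is fetchable by SemrushBot-SI and SiteAuditBot and that the crawl-delay does not exceed 1 second. The example.com URLs are placeholders; substitute your own robots.txt and page URLs:

```python
from urllib import robotparser

# Placeholder URLs; replace with your own domain and page.
ROBOTS_URL = "https://www.example.com/robots.txt"
PAGE_URL = "https://www.example.com/some-page/"

parser = robotparser.RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # fetch and parse robots.txt

for agent in ("SemrushBot-SI", "SiteAuditBot"):
    allowed = parser.can_fetch(agent, PAGE_URL)
    delay = parser.crawl_delay(agent)  # None if no crawl-delay applies
    print(f"{agent}: allowed={allowed}, crawl-delay={delay}")
    if not allowed:
        print("  -> blocked by robots.txt; remove or adjust the Disallow rule.")
    elif delay is not None and delay > 1:
        print(f"  -> crawl-delay is {delay}s; On-Page SEO Checker requires 1 second.")
```

This only checks the robots.txt rules; it cannot tell you whether the crawler's IP addresses are being blocked by a firewall or security plugin.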