Free SEO Tool
Check whether your robots.txt file is correctly configured and not accidentally blocking search engines from crawling your important pages.
This tool fetches and analyzes your robots.txt file for common configuration mistakes. It checks whether the file is accessible, whether Googlebot or other major crawlers are blocked, whether important pages like your homepage, blog, or product pages are inadvertently blocked, whether a sitemap is declared, and whether the file follows correct syntax. It also flags wildcard rules that may be too broad.
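The core of the "is this crawler blocked?" check can be sketched with Python's standard-library robots.txt parser. This is a minimal illustration of the idea, not the tool's actual implementation; the `check_pages` helper, the sample rules, and the example.com URLs are all hypothetical.

```python
from urllib import robotparser

# Hypothetical robots.txt with an overly broad rule --
# the kind of mistake this tool flags.
BAD_ROBOTS = """\
User-agent: *
Disallow: /
"""

def check_pages(robots_txt: str, agent: str, urls: list[str]) -> dict[str, bool]:
    """Return whether `agent` may crawl each URL under the given rules."""
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return {url: parser.can_fetch(agent, url) for url in urls}

results = check_pages(BAD_ROBOTS, "Googlebot",
                      ["https://example.com/", "https://example.com/blog/"])
print(results)  # every important page is blocked for Googlebot
```

In a real checker you would fetch the live file (for example with `RobotFileParser.set_url` and `.read()`) and test the site's actual homepage, blog, and product URLs against each major crawler's user agent.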
A misconfigured robots.txt file is one of the most damaging SEO mistakes you can make. If Googlebot is blocked from crawling your site, your pages won't appear in search results, no matter how good your content is. Robots.txt mistakes are surprisingly common after site migrations or CMS updates, and they can take weeks to recover from once identified.
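For illustration, here is the classic failure mode, typically a staging configuration deployed to production, alongside a narrowly scoped alternative (the example.com domain and paths are hypothetical):

```
# Blocks every crawler from the entire site:
User-agent: *
Disallow: /

# Scoped alternative: block only private paths and declare a sitemap:
User-agent: *
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml
```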
SEO is just one of 11 categories PageGrader audits. Run a full scan to see your scores across SEO, performance, accessibility, security, content quality, mobile, links, images, social sharing, AI readiness, and best practices.
Run a full website audit
200+ checks across 11 categories. Free, no signup required.