Free SEO Tool

Free Robots.txt Checker

Check if your robots.txt file is correctly configured and not accidentally blocking search engines from crawling your important pages.

Free, no signup · Results in 30 seconds · 200+ total checks

What this tool checks

This tool fetches and analyzes your robots.txt file for common configuration mistakes. It checks:

  • whether the file is accessible at all
  • whether Googlebot or other major crawlers are blocked
  • whether important pages like your homepage, blog, or product pages are inadvertently blocked
  • whether a sitemap is declared
  • whether the file follows correct syntax

It also flags wildcard rules that may be too broad.
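
For the curious, the core checks can be reproduced with Python's standard-library robots.txt parser. This is a minimal sketch, not PageGrader's implementation; the domain and list of important paths are placeholders, and site_maps() requires Python 3.8 or newer.

    from urllib.robotparser import RobotFileParser

    SITE = "https://example.com"                     # placeholder domain
    IMPORTANT_PATHS = ["/", "/blog/", "/products/"]  # placeholder pages

    parser = RobotFileParser()
    parser.set_url(f"{SITE}/robots.txt")
    parser.read()  # fetch and parse the live file

    # Flag any important page that Googlebot is not allowed to crawl
    for path in IMPORTANT_PATHS:
        if not parser.can_fetch("Googlebot", f"{SITE}{path}"):
            print(f"Googlebot is blocked from {path}")

    # Flag a missing sitemap declaration
    if not parser.site_maps():
        print("No Sitemap: directive found in robots.txt")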

Why it matters

A misconfigured robots.txt file is one of the most damaging SEO mistakes you can make. If Googlebot is blocked from crawling your site, your pages can vanish from search results, no matter how good your content is. Robots.txt mistakes are surprisingly common after site migrations or CMS updates, and even after the file is fixed, it can take weeks for search engines to re-crawl and re-index the affected pages.

Common issues we find

  • Disallow: / blocking all crawlers, removing the site from search entirely
  • Blocking an entire CMS path such as /wp-admin/ when front-end features depend on files inside it, like WordPress's admin-ajax.php (see the example after this list)
  • No sitemap declaration, making it harder for search engines to discover all pages
  • Blocking CSS and JavaScript files, preventing Google from rendering pages correctly
  • Overly broad wildcard rules matching URLs that should be indexed
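
For contrast, here is a robots.txt that avoids these pitfalls. It is illustrative only: the Disallow/Allow pair mirrors WordPress's default (blocking the admin area while keeping admin-ajax.php crawlable), and the sitemap URL is a placeholder.

    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    Sitemap: https://example.com/sitemap.xml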

Get the full picture

SEO is just one of 11 categories PageGrader audits. Run a full scan to see your scores across SEO, performance, accessibility, security, content quality, mobile, links, images, social sharing, AI readiness, and best practices.

Run a full website audit

200+ checks across 11 categories. Free, no signup required.