Robots.txt Generator
Build a spec-correct robots.txt with allow / disallow rules per user-agent and sitemap.
Robots.txt Generator builds a perfectly formatted robots.txt file you can drop into the root of any website to tell search-engine crawlers what to crawl and what to skip. Pick a default policy (Allow all or Disallow all), add Disallow and Allow paths, declare one or more Sitemap URLs, and — if you need finer control — append custom User-agent blocks for crawlers such as Googlebot, Bingbot, or AhrefsBot. The tool emits the exact syntax of the Robots Exclusion Protocol, so you avoid the typos that silently break crawl rules. Copy the output to your clipboard or download it as a .txt file ready for upload.
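For illustration, output from the settings described above might look like this (the paths and sitemap URL here are placeholders, not defaults the tool ships with):

```
User-agent: *
Disallow: /admin/
Allow: /admin/public/

User-agent: AhrefsBot
Disallow: /

Sitemap: https://www.example.com/sitemap.xml
```

Each `User-agent` line opens a block of rules for that crawler; `*` matches any crawler without a more specific block, and `Sitemap` lines apply file-wide regardless of which block they appear in.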
Upload the generated file to your domain root (/robots.txt) and verify it in Google Search Console's robots.txt report before going live.
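You can also sanity-check the generated rules locally before uploading. A minimal sketch using Python's standard-library `urllib.robotparser` (the rules shown are hypothetical sample output, not something the tool produces by default):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical generated robots.txt content to validate.
rules = """\
User-agent: *
Disallow: /admin/

User-agent: AhrefsBot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check which paths a given crawler may fetch under these rules.
print(parser.can_fetch("*", "/admin/settings"))   # blocked by Disallow: /admin/
print(parser.can_fetch("*", "/blog/post"))        # allowed (no matching rule)
print(parser.can_fetch("AhrefsBot", "/blog/post"))  # blocked by Disallow: /
```

Note that `urllib.robotparser` uses first-match semantics when evaluating rule lines, whereas Google applies longest-match precedence, so results for overlapping `Allow`/`Disallow` pairs can differ between the two.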