About the Robots.txt Generator

Robots.txt Generator builds a correctly formatted robots.txt file you can drop into the root of any website to tell search-engine crawlers which parts of the site they may crawl. Pick a default policy (Allow all or Disallow all), add Disallow and Allow paths, declare one or more Sitemap URLs, and, if you need finer control, append custom User-agent blocks for crawlers such as Googlebot, Bingbot, or AhrefsBot. The tool emits the exact syntax of the Robots Exclusion Protocol (RFC 9309), so you avoid the typos that silently break crawl rules. Copy the output to your clipboard or download it as a .txt file ready for upload. Note that robots.txt governs crawling, not indexing: a disallowed URL can still appear in search results if other pages link to it.
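For example, a typical generated file might look like this (the paths and sitemap URL are placeholders, not defaults the tool ships with):

```
User-agent: *
Disallow: /private/
Allow: /private/press-kit/

Sitemap: https://example.com/sitemap.xml
```

The blank line before Sitemap is cosmetic; Sitemap lines are independent of any User-agent group.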

How to use

  1. Choose a default policy — Allow all or Disallow all.
  2. Add Disallow paths and (optionally) Allow paths, one per line.
  3. List your Sitemap URL(s) and any custom User-agent blocks.
  4. Click Generate, then copy or download robots.txt.
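The steps above can be sketched as a small script. This is a minimal illustration of how such a generator could assemble the file, not the tool's actual code; the function and option names are invented for the example.

```python
# Sketch of a robots.txt builder: default policy, Disallow/Allow
# paths, and Sitemap URLs, joined in spec order.
def build_robots_txt(default_allow, disallow, allow, sitemaps):
    lines = ["User-agent: *"]
    if not default_allow:
        lines.append("Disallow: /")  # "Disallow all" default policy
    for path in disallow:
        lines.append(f"Disallow: {path}")
    for path in allow:
        lines.append(f"Allow: {path}")
    lines.append("")  # blank line before the sitemap section
    for url in sitemaps:
        lines.append(f"Sitemap: {url}")
    return "\n".join(lines) + "\n"


print(build_robots_txt(
    default_allow=True,
    disallow=["/admin/", "/tmp/"],
    allow=["/admin/public/"],
    sitemaps=["https://example.com/sitemap.xml"],
))
```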

Benefits & key features

  • Produces spec-correct robots.txt every time — no hand-written typos.
  • Handles custom user-agents, crawl-delay and multiple sitemaps.
  • Copy-to-clipboard and direct download supported.
  • Everything stays in your browser — no upload, no account.
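A file combining those features, per-bot overrides, Crawl-delay, and multiple sitemaps, might look like the following (all paths and URLs are placeholders; note that Crawl-delay is a de facto extension that Googlebot ignores, though Bing and other crawlers honor it):

```
User-agent: AhrefsBot
Crawl-delay: 10
Disallow: /

User-agent: *
Disallow: /search/

Sitemap: https://example.com/sitemap-pages.xml
Sitemap: https://example.com/sitemap-posts.xml
```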

Pro tip

Upload the generated file to your domain root (/robots.txt) and verify it with Google Search Console's robots.txt report (the successor to the retired robots.txt tester) before going live.
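You can also sanity-check the rules locally before uploading. This sketch uses Python's standard urllib.robotparser (the rules and URLs are placeholders; be aware that Python's parser applies rules in file order, unlike Google's longest-match resolution, which is why the Allow line comes first here):

```python
# Parse a robots.txt string and check which URLs a generic
# crawler ("*") would be permitted to fetch.
from urllib.robotparser import RobotFileParser

robots = """\
User-agent: *
Allow: /admin/public/
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(robots.splitlines())

print(parser.can_fetch("*", "https://example.com/admin/secret"))       # blocked path
print(parser.can_fetch("*", "https://example.com/admin/public/page"))  # allowed path
```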

Open Robots.txt Generator now