Robots.txt Generator

Generate a valid robots.txt file instantly. Configure User-agent rules, Allow/Disallow paths, and Sitemap URL — no sign-up required.


How to Use

  1. Choose a User-agent from the dropdown (* targets all bots) or type a custom bot name.
  2. Click + Allow or + Disallow to add path rules.
  3. Optionally add more User-agent blocks for per-bot configuration.
  4. Enter your Sitemap URL if you have one.
  5. The robots.txt output updates in real time — click Copy to copy it.
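For example, the steps above might produce a file like this (the paths and sitemap URL are placeholders):

```
User-agent: *
Disallow: /admin/
Allow: /admin/public/

Sitemap: https://yourdomain.com/sitemap.xml
```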

Common robots.txt Patterns

  • Block all bots: User-agent: * + Disallow: /
  • Allow all bots: User-agent: * + Disallow: (empty value)
  • Block a specific directory: Disallow: /admin/
  • Allow Googlebot only: Block all bots with User-agent: * + Disallow: /, then add a separate User-agent: Googlebot block with an empty Disallow:
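The "Allow Googlebot only" pattern written out in full — Googlebot matches its own block and ignores the * block:

```
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
```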

After generating your file, upload it to your web server's root directory so it is accessible at https://yourdomain.com/robots.txt.
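If you want to sanity-check your rules before uploading, Python's standard-library urllib.robotparser can evaluate them locally. As a sketch, this tests the "Allow Googlebot only" pattern against a hypothetical URL:

```python
from urllib import robotparser

# Rules for the "Allow Googlebot only" pattern (illustrative)
rules = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Googlebot matches its own block (empty Disallow = allow everything)
print(rp.can_fetch("Googlebot", "https://example.com/page"))      # True
# Any other bot falls through to the * block, which blocks the site
print(rp.can_fetch("SomeOtherBot", "https://example.com/page"))   # False
```

The same parser is what many Python crawlers use internally, so if it agrees with your intent, compliant bots should too.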

FAQ

What is a robots.txt file?

robots.txt is a plain-text file placed at the root of a website (e.g. https://example.com/robots.txt) that tells web crawlers which pages or directories they are allowed or not allowed to access. It follows the Robots Exclusion Protocol.

Does robots.txt block all bots?

No. robots.txt is a convention, not a security measure. Well-behaved crawlers like Googlebot and Bingbot respect it, but malicious bots may ignore it entirely. Use server-level access controls to truly restrict content.

What does 'Disallow: /' mean?

It tells the specified User-agent not to crawl any page on the site. Using 'User-agent: *' combined with 'Disallow: /' blocks all compliant bots from the entire site.

Can I have multiple User-agent blocks?

Yes. You can define separate rule sets for different crawlers. Click '+ Add User-agent Block' to add more blocks. Each block can have its own Allow/Disallow rules.
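For instance, a file with per-bot blocks might look like this (the bot names and paths are illustrative):

```
User-agent: Googlebot
Disallow: /search/
Allow: /search/about/

User-agent: Bingbot
Disallow: /private/

User-agent: *
Disallow: /
```

Each crawler obeys the most specific block that matches its name, falling back to the * block if none does.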

Where should I put the Sitemap directive?

The Sitemap directive is independent of User-agent blocks and can technically appear anywhere in the file, though by convention it is placed at the end. It tells search engines where to find your XML sitemap, which helps them discover all pages you want indexed.
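Crawlers can read the directive programmatically; as a sketch, Python's standard-library urllib.robotparser exposes it via site_maps() (Python 3.8+), shown here with a placeholder URL:

```python
from urllib import robotparser

lines = [
    "User-agent: *",
    "Disallow: /admin/",
    "",
    "Sitemap: https://yourdomain.com/sitemap.xml",  # placeholder URL
]

rp = robotparser.RobotFileParser()
rp.parse(lines)
print(rp.site_maps())  # ['https://yourdomain.com/sitemap.xml']
```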