MagicTools
Robots.txt Generator

Generate robots.txt rules with Allow, Disallow, Host, and Sitemap directives.

How to use

Set the user-agent, then add Allow and Disallow rules line by line. Optionally add a sitemap URL and host value, then copy the generated robots.txt output.
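For reference, a generated file typically looks like the sketch below (the domain, paths, and sitemap URL are placeholders, not output from the tool):

```
User-agent: *
Allow: /blog/
Disallow: /admin/
Disallow: /tmp/
Sitemap: https://example.com/sitemap.xml
Host: example.com
```

Note that Host is a non-standard directive recognized mainly by Yandex; Google ignores it, so the Sitemap line is usually the more valuable addition.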

Why use this tool

A valid robots.txt file helps search engines understand which paths they should or should not crawl. This tool speeds up SEO setup and technical audits.

FAQ

What does robots.txt do?
robots.txt gives crawling instructions to bots such as Googlebot. It is advisory: well-behaved crawlers follow it, but it does not guarantee privacy, and a blocked URL can still appear in search results if other sites link to it.
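You can check how a crawler would interpret a rule set with Python's standard-library parser. This is a minimal sketch with hypothetical rules and URLs; note that `urllib.robotparser` applies the first matching rule, so the more specific Allow line is placed before the Disallow it carves an exception out of:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rule set for illustration.
rules = """\
User-agent: *
Allow: /admin/public/
Disallow: /admin/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("*", "https://example.com/admin/secret"))       # False
print(parser.can_fetch("*", "https://example.com/admin/public/page"))  # True
print(parser.can_fetch("*", "https://example.com/blog/post"))          # True
```

Paths that match no rule default to allowed, which is why the last URL is crawlable even though it appears nowhere in the file.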
Should I add a sitemap URL?
Yes. Adding your sitemap URL is a common best practice because it helps search engines discover pages faster.