Robots.txt Generator
Runs in browser
Generate and validate your robots.txt file with a form builder, CMS presets, and instant error checking.
Last updated 02 Apr 2026
Build a complete robots.txt file using our free generator. Add User-agent rules, Allow and Disallow directives, Sitemap URLs, and Crawl-delay settings. Use CMS presets for WordPress, Shopify, or Next.js. Validate any existing robots.txt for syntax errors, conflicting rules, and missing sitemaps. Everything runs instantly in your browser.
CMS Presets
Rules
Note: Googlebot ignores Crawl-delay. Use Google Search Console instead.
Sitemap URLs
Preview
How to use
1. Add crawl rules
Choose a User-agent (or use * for all bots) and add Allow or Disallow directives for specific paths like /admin/ or /checkout/.
2. Apply a CMS preset
Click a preset button for WordPress, Shopify, or Next.js to auto-fill common crawl rules for that platform, including admin paths and duplicate content patterns.
3. Add your sitemap URL
Enter your sitemap URL (e.g. https://example.com/sitemap.xml) so search engines can find and index your content efficiently.
4. Review the live preview
The generated robots.txt file updates in real time as you make changes. Check it carefully before copying.
5. Copy or download
Click Copy to clipboard or Download as robots.txt and upload the file to the root of your domain. A sample of the finished file appears below.
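To make the steps concrete, here is the kind of file the generator produces. The paths, the Bingbot group, and the sitemap URL are illustrative placeholders, not the tool's fixed output:

    User-agent: *
    Disallow: /admin/
    Disallow: /checkout/
    Allow: /checkout/thank-you/

    # Bingbot honours Crawl-delay; Googlebot ignores it
    User-agent: Bingbot
    Crawl-delay: 5

    Sitemap: https://example.com/sitemap.xml

Because the Allow rule is more specific (longer) than the Disallow above it, crawlers that support Allow will still fetch the thank-you page while the rest of /checkout/ stays blocked.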
Frequently asked questions
What is a robots.txt file?
Which User-agent should I use?
Does Disallow remove pages from Google's index?
What is Crawl-delay?
Should I block my admin and login pages?
What does the validator check for?
What is the difference between Allow and Disallow?
Is my robots.txt file sent to a server?
How do I block AI training bots?
Your robots.txt file is the first thing search engine crawlers read when they
visit your site. Getting it wrong can block your entire site from Google or
waste crawl budget on pages that should never be crawled.
This tool gives you a visual form builder for robots.txt — no need to memorise
directives or worry about syntax. Add rules for any search engine crawler (or
use * for all bots), set Allow and Disallow path patterns, declare your Sitemap
URL, and apply Crawl-delay settings. One-click CMS presets auto-fill sensible
defaults for WordPress, Shopify, and Next.js, covering admin paths, login pages,
duplicate content patterns, and bot-specific rules for GPTBot and other AI
crawlers.
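For instance, the WordPress preset fills in rules along these lines (an illustrative sketch rather than the tool's exact output): the standard WordPress pattern keeps /wp-admin/ blocked while leaving the AJAX endpoint reachable, and a separate group opts out of OpenAI's GPTBot.

    User-agent: *
    Disallow: /wp-admin/
    Allow: /wp-admin/admin-ajax.php

    # Opt out of OpenAI's GPTBot crawler entirely
    User-agent: GPTBot
    Disallow: /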
The Validate mode lets you paste an existing robots.txt and check it for syntax
errors, conflicting Allow/Disallow rules, missing Sitemap directives, overly
aggressive Crawl-delay values, and unknown directives. All checking runs in your
browser with no file upload required.
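As an illustration, pasting a file like the one below would trigger several of those checks: the equal-length Allow and Disallow rules conflict for everything under /blog/, Crawl-delay: 60 throttles compliant crawlers to one request per minute, Noindex is not a supported robots.txt directive, and no Sitemap line is declared.

    User-agent: *
    Allow: /blog/        # conflicts with the Disallow below
    Disallow: /blog/
    Crawl-delay: 60      # one request per minute is overly aggressive
    Noindex: /private/   # unknown directive, not part of the robots.txt standard

Google resolves the Allow/Disallow tie in favour of the least restrictive rule, but other crawlers may not, which is why the validator flags the conflict rather than silently picking a winner.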
Who is this for? SEO specialists auditing crawl settings, developers deploying
new sites, e-commerce managers protecting admin and checkout paths, and content
teams managing crawl budget on large sites.
Related tools
Meta Tag Previewer
Preview how your page looks in Google, Facebook, Twitter, and LinkedIn before you publish. Generate ready-to-paste meta tags.
Schema Markup Generator
Generate and validate JSON-LD structured data for 10 schema.org types. Unlock rich results in Google search — free, browser-based.
XML Sitemap Generator
Generate sitemap.xml from a URL list or validate your existing sitemap — checks priority, changefreq, lastmod, and duplicates.
Regex Tester
Test regular expressions live with colour-coded match highlighting, capture groups, replace mode, and common presets.