Robots.txt Tester

Test whether URLs are allowed or blocked by robots.txt rules.

Detected sitemaps (1)

https://example.com/sitemap.xml

Test a URL

Allowed

Tested path: /admin/settings
Bot: Googlebot
Matched rule: No matching rule — allowed by default
Under block: User-agent: Googlebot

Parsed rule blocks (2)

User-agent: *
Allow: /admin/public/
Disallow: /admin/
Disallow: /private/

User-agent: Googlebot
Disallow: /no-google/

Paste your robots.txt content, enter a URL or path, and instantly see whether a specific crawler is allowed or blocked. The tool visualizes the parsed rule blocks and shows exactly which rule triggered the result.

Key Features

Real parser
Parses User-agent blocks and Allow/Disallow directives, applying the longest-match (most-specific-path-wins) rule defined in RFC 9309 and used by Googlebot; see the sketch after this list.
Bot selector
Test against common bots (Googlebot, Bingbot, Baiduspider, DuckDuckBot, *) or enter any custom user-agent string.
Rule explanation
When a URL is blocked or allowed, the tool shows exactly which rule matched and which User-agent block it came from.
Sitemap detection
Automatically detects and lists any Sitemap: directives found in the robots.txt content.
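
The parse-and-match behavior described above can be sketched in a few dozen lines of TypeScript. This is an illustrative reimplementation, not the tool's actual source: the names parseRobots, selectBlock, and testPath are hypothetical, and path wildcards (* and $) are omitted for brevity.

// Hypothetical sketch of RFC 9309-style robots.txt parsing and matching.
type Rule = { type: "allow" | "disallow"; path: string };
type RuleBlock = { userAgents: string[]; rules: Rule[] };
type ParseResult = { blocks: RuleBlock[]; sitemaps: string[] };

function parseRobots(content: string): ParseResult {
  const blocks: RuleBlock[] = [];
  const sitemaps: string[] = [];
  let current: RuleBlock | null = null;

  for (const rawLine of content.split(/\r?\n/)) {
    const line = rawLine.replace(/#.*$/, "").trim(); // strip comments
    if (!line) continue;
    const sep = line.indexOf(":");
    if (sep < 0) continue;
    const field = line.slice(0, sep).trim().toLowerCase();
    const value = line.slice(sep + 1).trim();

    if (field === "sitemap") {
      sitemaps.push(value); // Sitemap: lines are global, not block-scoped
    } else if (field === "user-agent") {
      // Consecutive User-agent lines share one block; a directive closes the run.
      if (current && current.rules.length > 0) {
        blocks.push(current);
        current = null;
      }
      if (!current) current = { userAgents: [], rules: [] };
      current.userAgents.push(value.toLowerCase());
    } else if (field === "allow" || field === "disallow") {
      if (!current) continue; // ignore directives before any User-agent line
      current.rules.push({ type: field, path: value });
    }
  }
  if (current) blocks.push(current);
  return { blocks, sitemaps };
}

// Pick the block whose User-agent token best matches the bot; "*" is the fallback.
function selectBlock(blocks: RuleBlock[], bot: string): RuleBlock | undefined {
  const name = bot.toLowerCase();
  let best: RuleBlock | undefined;
  let bestLen = -1;
  for (const block of blocks) {
    for (const ua of block.userAgents) {
      if (ua !== "*" && name.includes(ua) && ua.length > bestLen) {
        best = block;
        bestLen = ua.length;
      }
    }
  }
  return best ?? blocks.find((b) => b.userAgents.includes("*"));
}

// Longest (most specific) matching path wins; no matching rule means allowed.
function testPath(blocks: RuleBlock[], bot: string, path: string) {
  const block = selectBlock(blocks, bot);
  if (!block) return { allowed: true, matched: null };
  let winner: Rule | null = null;
  for (const rule of block.rules) {
    if (rule.path && path.startsWith(rule.path)) {
      if (
        !winner ||
        rule.path.length > winner.path.length ||
        (rule.path.length === winner.path.length && rule.type === "allow")
      ) {
        winner = rule;
      }
    }
  }
  return { allowed: !winner || winner.type === "allow", matched: winner };
}

Note that a tie between an Allow and a Disallow of equal path length goes to Allow, which matches the tie-breaking rule in RFC 9309.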

How to Use

  1. Paste your robots.txt content into the left panel (or click Load sample).
  2. Enter the URL or path you want to test.
  3. Select the crawling bot from the quick buttons.
  4. The result panel shows Allowed/Blocked with the matching rule.
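
Running the sketch from the previous section against the sample rules reproduces the result panel shown earlier (again assuming the hypothetical parseRobots and testPath):

// /admin/settings tested as Googlebot falls under the "User-agent: Googlebot"
// block, which has no matching rule, so the path is allowed by default.
const sample = `
User-agent: *
Allow: /admin/public/
Disallow: /admin/
Disallow: /private/

User-agent: Googlebot
Disallow: /no-google/

Sitemap: https://example.com/sitemap.xml
`;

const { blocks, sitemaps } = parseRobots(sample);
console.log(sitemaps); // ["https://example.com/sitemap.xml"]
console.log(testPath(blocks, "Googlebot", "/admin/settings"));
// { allowed: true, matched: null } — no rule in the Googlebot block matches
console.log(testPath(blocks, "SomeOtherBot", "/admin/settings"));
// { allowed: false, matched: { type: "disallow", path: "/admin/" } }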

Frequently Asked Questions