Robots.txt Multiline Validator v0.3 beta



This tool is a beta version. Be careful and double-check everything. Use its output at your own risk.


Robot directives stats


Directives overview

An overview of all directives will be displayed after validation.

Triggered

A count of how many times each directive was triggered will be displayed here after validation.

Disallowed


URLs disallowed by one or more directives.

Allowed


URLs allowed as exceptions to a Disallow directive.

Allowed only


Specifically allowed URLs that wouldn't be blocked by any Disallow anyway.

None


URLs to which no Disallow or Allow directive applies.
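The four groups above follow from how robots.txt precedence works. The tool's exact matching code isn't shown here, but a minimal sketch of the classification, assuming the standard longest-match rule (the most specific matching rule wins, and Allow wins ties, per RFC 9309) and plain prefix matching without `*` or `$` wildcards, could look like this:

```python
def classify(url_path, rules):
    """Classify a URL path into one of the four groups.

    rules: list of ("Allow" | "Disallow", path_prefix) pairs.
    Returns "Disallowed", "Allowed", "Allowed only", or "None".
    Hypothetical sketch; ignores * and $ wildcard syntax.
    """
    matches = [(kind, p) for kind, p in rules if url_path.startswith(p)]
    if not matches:
        return "None"  # no directive applies at all
    # Longest path wins; on equal length, Allow beats Disallow.
    kind, _ = max(matches, key=lambda m: (len(m[1]), m[0] == "Allow"))
    has_disallow = any(k == "Disallow" for k, _ in matches)
    if kind == "Allow":
        # "Allowed" = an exception overriding a Disallow;
        # "Allowed only" = allowed with no Disallow in play.
        return "Allowed" if has_disallow else "Allowed only"
    return "Disallowed"
```

For example, with `Disallow: /private/` and `Allow: /private/public/`, the path `/private/secret` lands in "Disallowed" while `/private/public/page` lands in "Allowed", because the longer Allow rule takes effect.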

How does it work


  • Input robots directives and URLs

    Input directives from your robots.txt file. Only rows beginning with Allow or Disallow are valid input. Next, input a list of URLs to validate against those directives. If you need to clean up or unify your URLs (remove protocols, parameters, etc.), use URL Builder.

  • Validate robots.txt

    Push the Validate button and wait for the process to finish. It can take a while, depending on how many directives and URLs are in the input.

  • Check stats and use your validated robots.txt directives

    You'll see stats for your directives and URLs: a counter of how many times each directive was triggered, and your URLs divided into four groups based on which directive is the final one to take effect.
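Since only Allow and Disallow rows are valid input, the first step amounts to filtering the pasted robots.txt down to those rules. A minimal sketch of that filtering, assuming the tool simply skips everything else (User-agent, Sitemap, comments, blank lines):

```python
def parse_directives(robots_txt):
    """Extract (kind, path) pairs from pasted robots.txt text.

    Keeps only Allow/Disallow rows, as the tool expects;
    other lines and # comments are ignored. Hypothetical sketch.
    """
    rules = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop trailing comments
        for kind in ("Allow", "Disallow"):
            if line.lower().startswith(kind.lower() + ":"):
                path = line.split(":", 1)[1].strip()
                if path:
                    rules.append((kind, path))
                break
    return rules
```

A `User-agent: *` or `Sitemap:` line would simply produce no rule, which matches the tool's statement that only Allow/Disallow rows are valid input.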

FAQ & Contact


Robots.txt Multiline Validator is a tool for validating robots.txt directives against multiple URLs. It helps you test and properly validate a new robots.txt file before deployment. It's just a simple web browser utility, not a crawler, so it won't create redundant traffic on your website or clutter your access log data.

READ. SHARE. REPEAT.