Robots directives input
Testing URLs input
This tool is a beta version. Be careful and double-check everything. Use the outputs at your own risk.
Robot directives stats
How many times each directive was triggered will be displayed here after validation.
How does it work
Input robots directives and URLs
Enter the directives from your robots.txt; only rows beginning with Allow or Disallow are valid input. Next, enter the list of URLs to validate against those directives. If you need to clean up or unify your URLs (remove protocols, parameters, etc.), use URL Builder.
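For illustration, the two inputs might look like this (the paths and URLs are made-up examples, not values from the tool):

```
# "Robots directives" input — only Allow/Disallow rows:
Disallow: /admin/
Allow: /admin/help

# "Testing URLs" input — one URL per line:
https://example.com/admin/settings
https://example.com/admin/help
https://example.com/contact
```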
Push the Validate button and wait for the process to finish. It can take a while, depending on how many directives and URLs are in the input.
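Conceptually, the validation step checks each URL against the Allow/Disallow rules, much like Python's standard-library robots.txt parser does. A minimal sketch (an illustration only, not this tool's actual implementation; the directives and URLs are invented):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical inputs: directive rows and URLs to test.
directives = [
    "Allow: /private/public-page",
    "Disallow: /private/",
]
urls = [
    "https://example.com/private/secret",
    "https://example.com/private/public-page",
    "https://example.com/about",
]

parser = RobotFileParser()
# robotparser expects full robots.txt lines, so prepend a User-agent group.
parser.parse(["User-agent: *"] + directives)

for url in urls:
    verdict = "allowed" if parser.can_fetch("*", url) else "disallowed"
    print(url, "->", verdict)
```

Note that Python's parser applies rules in file order, whereas Google's matching picks the longest matching rule; the example above orders the more specific Allow rule first so both interpretations agree.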
Check stats and use your validated robots.txt directives
You'll see stats for your directives and URLs: a counter of how many times each directive was triggered, plus the URLs divided into four groups based on which directive finally takes effect.
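The trigger counter can be sketched as follows, assuming Google-style matching: the longest matching rule wins, and Allow beats Disallow on a tie. This simplified sketch ignores wildcards (`*`, `$`) that real robots.txt rules may contain, and all directives and URLs are hypothetical:

```python
from collections import Counter

directives = [
    ("Allow", "/blog/"),
    ("Disallow", "/blog/drafts/"),
    ("Disallow", "/tmp/"),
]
urls = [
    "/blog/post-1",
    "/blog/drafts/wip",
    "/blog/drafts/wip-2",
    "/contact",
]

counts = Counter()
for url in urls:
    matches = [(kind, path) for kind, path in directives if url.startswith(path)]
    if not matches:
        continue  # no directive triggered; the URL is allowed by default
    # Longest path wins; on equal length, Allow ranks above Disallow.
    kind, path = max(matches, key=lambda m: (len(m[1]), m[0] == "Allow"))
    counts[f"{kind}: {path}"] += 1

for directive, n in counts.items():
    print(directive, "triggered", n, "times")
```

Here `/blog/drafts/wip` and `/blog/drafts/wip-2` both trigger the more specific Disallow rule, while `/blog/post-1` falls to the broader Allow rule and `/contact` matches nothing.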
FAQ & Contact
Robots.txt Multiline Validator is a tool for validating robots.txt directives against multiple URLs. It helps you test and properly validate a new robots.txt file before deployment. It is a simple in-browser utility, not a crawler, so it creates no redundant traffic on your website and won't clutter your access log data.