Create a simple or complex robots.txt file to manage web crawler access.
User-agent: the crawler a rule set applies to (e.g. * for all crawlers, or a specific bot such as Googlebot).
Allow / Disallow: whether the given path may be crawled (Allow) or should be skipped (Disallow).
+ Add Rule: append another Allow/Disallow rule to the file.
Sitemap URL: optional; adds a Sitemap directive pointing crawlers to your XML sitemap.
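For example, a single rule set that keeps every crawler out of an admin area would appear in the finished file as (the path is illustrative):

    User-agent: *
    Disallow: /admin/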
Generated robots.txt: the assembled file, ready to copy to your clipboard and upload to the root of your site (e.g. https://example.com/robots.txt).
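A more complex file can combine several rule sets, an Allow exception, and a sitemap reference; the generated output would look along these lines (all paths and the sitemap URL are placeholders):

    User-agent: *
    Disallow: /private/
    Allow: /private/public-report.html

    User-agent: Googlebot
    Disallow: /tmp/

    Sitemap: https://example.com/sitemap.xml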