robots.txt Online Generator
Operating Instructions
Generates a robots.txt file; only the crawlers you explicitly configure appear in the output.
What is robots.txt file
- robots.txt (all lowercase) is a text file placed in the root directory of a website that tells search engine crawlers which pages may be crawled and which may not.
- Because URLs are case-sensitive on some systems, the filename should always be written in lowercase: robots.txt.
- If you want to define crawler behavior for a subdirectory separately, merge those settings into the robots.txt in the root directory; crawlers only read the file at the site root.
- The robots.txt protocol is a convention rather than an enforced rule, so it offers no guarantee of privacy: compliance is voluntary, and not all crawlers honor it.
- The Robots protocol is a widely accepted ethical convention in the international internet community, allowing site owners and crawlers to cooperate without overloading or exposing sites.
robots.txt file content
- Whether search engine spiders may access and crawl the site (User-agent rules).
- Which directories or files spiders may or may not access (Allow and Disallow rules).
- The path to the website's sitemap (Sitemap directive).
- A crawl-rate limit for search engine spiders (Crawl-delay directive).
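As an illustration, a generated robots.txt combining the directives above might look like this (the crawler names, paths, and domain are placeholder values):

```
# Rules for a specific crawler
User-agent: Googlebot
Disallow: /admin/
Allow: /

# Rules for all other crawlers
User-agent: *
Crawl-delay: 10
Disallow: /private/

# Sitemap location (absolute URL)
Sitemap: https://www.example.com/sitemap.xml
```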
About robots.txt file generator
- Configure the options in the web interface, then click the Generate button; the robots.txt content appears in the text box below.
- Create a blank text file named "robots.txt" and paste the generated content into it.
- Upload "robots.txt" to the root directory of your website, then request it in a browser to confirm it is reachable by visitors (such as search engine crawlers).
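After deploying the file, you can sanity-check the rules with Python's standard-library urllib.robotparser; this is a minimal sketch, and the rules string and URLs below are illustrative placeholders:

```python
from urllib.robotparser import RobotFileParser

# Sample generated rules; in practice, fetch them from your site
# with rp.set_url("https://your-domain/robots.txt") and rp.read().
rules = """\
User-agent: *
Disallow: /private/
Sitemap: https://www.example.com/sitemap.xml
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A disallowed path is blocked; everything else is allowed by default.
print(rp.can_fetch("*", "https://www.example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://www.example.com/index.html"))         # True
```

This lets you confirm that the file you generated actually blocks and allows the paths you intended before relying on crawler behavior.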