Robots.txt Generator

Tell search engine robots how to crawl and index the pages on a website.

Default - All Robots are:

Crawl-Delay:

Sitemap: (leave blank if you don't have one)

Search Robots & Crawlers:
  Google
  Google Image
  Bing
  DuckDuckGo
  Applebot
  Baidu
  Yandex
  Naver
Restricted Directories: Each path is relative to the root and must end with a trailing slash "/".
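As a sketch of the output, restricted directories become Disallow rules in the generated file; the domain and directory names below are hypothetical examples:

```
User-agent: *
Crawl-delay: 10
Disallow: /cgi-bin/
Disallow: /private/
Sitemap: https://example.com/sitemap.xml
```

Note the trailing slash on each directory: without it, `Disallow: /private` would also match files such as `/private-notes.html`.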

Advanced (Optional)
Useful for exceptions (for example, allowing a specific path or file while other paths are blocked).
Create extra groups with custom User-agent and optional Allow/Disallow rules.
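For instance, an extra group could let one crawler fetch a single file inside a directory that is blocked for everyone else; the user-agent and paths below are hypothetical examples:

```
User-agent: *
Disallow: /downloads/

User-agent: Googlebot
Allow: /downloads/press-kit.pdf
Disallow: /downloads/
```

Each group starts with its own User-agent line, and a crawler follows the most specific group that matches it.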



Robots.txt file content

Now create a file named 'robots.txt' in your site's root directory, then copy the generated text above and paste it into that file.
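Before uploading, you can sanity-check the generated rules with Python's standard-library parser. This is a minimal sketch: the rules and URLs are hypothetical examples, not output from this tool.

```python
# Verify robots.txt rules with the standard-library parser.
from urllib.robotparser import RobotFileParser

# Hypothetical generated rules (paste your own content here).
rules = """\
User-agent: *
Crawl-delay: 10
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check whether a given user agent may fetch specific URLs.
print(parser.can_fetch("*", "https://example.com/admin/login"))  # False: blocked
print(parser.can_fetch("*", "https://example.com/index.html"))   # True: allowed
print(parser.crawl_delay("*"))                                   # 10
```

The same `RobotFileParser` can also load a live file via `set_url(...)` and `read()` once your robots.txt is deployed.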

Let us change our traditional attitude to the construction of programs: Instead of imagining that our main task is to instruct a computer what to do, let us concentrate rather on explaining to human beings what we want a computer to do.

Donald E. Knuth
