Now, create a file named 'robots.txt' in the root directory of your site, and paste the generated text into it.
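For reference, a minimal robots.txt might look like the following (the `/private/` path is just a placeholder for whatever directory you want to keep crawlers out of):

```
User-agent: *
Disallow: /private/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```

Here `User-agent: *` applies the rules to all crawlers, `Disallow` blocks a directory, and the optional `Sitemap` line points crawlers at your sitemap.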
A Robots.txt Generator produces a file that is roughly the opposite of a sitemap: a sitemap lists the pages you want included, while robots.txt tells crawlers which files and directories to exclude. Getting the robots.txt syntax right is therefore important for any website. Whenever a search engine crawls a site, it first looks for a robots.txt file at the domain root. If the file is found, the crawler reads it to determine which files and directories are blocked from crawling.
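You can check how a crawler would interpret your rules using Python's standard-library `urllib.robotparser`. This is a minimal sketch: the rules and the `example.com` URLs are placeholders, not part of any real site.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules; parsed from a string for illustration.
rules = """\
User-agent: *
Disallow: /private/
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A well-behaved crawler asks can_fetch() before requesting a URL.
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))         # True
```

In production a crawler would call `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()` instead of parsing a string, but the rule-matching behavior is the same.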