robots.txt is a file placed in the root folder of your website to help
search engines index your site more appropriately. Search engines such
as Google use website crawlers, or robots, that review all the content
on your website. There may be parts of your website that you do not
want them to crawl and include in user search results, such as an
admin page. You can list those pages in the file so that they are
explicitly ignored. Robots.txt files follow the Robots Exclusion
Protocol. This website will generate the file for you from a list of
pages to be excluded.
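For example, a minimal robots.txt that asks every crawler to skip a
hypothetical /admin/ directory (the path here is only an illustration)
could look like this:

    User-agent: *
    Disallow: /admin/

The User-agent line names which crawlers the rule applies to (* matches
all of them), and each Disallow line gives a path they are asked not to
crawl.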