Robots.txt

Robots.txt is a file that tells search engine crawlers which pages on your site to crawl and which pages not to crawl.
You don’t want your server to be overwhelmed by Google’s crawler, or to waste crawl budget on unimportant or near-duplicate pages on your site.
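As an illustration, a minimal robots.txt placed at the root of a site (e.g. https://example.com/robots.txt) might look like the sketch below. The paths and sitemap URL are hypothetical placeholders, not recommendations for any particular site:

    # Hypothetical example: block crawlers from an admin area and
    # internal search results, allow everything else.
    User-agent: *
    Disallow: /admin/
    Disallow: /search
    Allow: /

    Sitemap: https://example.com/sitemap.xml

Here "User-agent: *" applies the rules to all crawlers, "Disallow" lists paths that should not be crawled, and the optional "Sitemap" line points crawlers to the site's sitemap. Note that robots.txt is a crawling directive, not an access control or indexing guarantee.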