A plain-text file that tells search engine crawlers which URLs on your site they may crawl. It matters because it guides how search engines crawl your site and discover content to serve users searching for it. Before crawling a site, search engines check for a robots.txt file to see whether any instructions are present. Note, however, that robots.txt controls crawling, not indexing: even if it instructs a search engine not to crawl a page, that page can still be indexed (for example, if other sites link to it). To keep a page out of the index, use a noindex directive instead, delivered via a robots meta tag or an X-Robots-Tag HTTP header; related directives such as nofollow, nosnippet, noimageindex, and noarchive control other aspects of how a page is crawled and displayed.
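As an illustration, a minimal robots.txt might look like the sketch below (the paths and domain are hypothetical examples, not required values):

```txt
# Apply these rules to all crawlers
User-agent: *
# Keep crawlers out of a private directory (example path)
Disallow: /private/
# Everything else remains crawlable
Allow: /

# Optionally point crawlers at your sitemap
Sitemap: https://www.example.com/sitemap.xml
```

To actually prevent a page from being indexed, serve a robots meta tag such as `<meta name="robots" content="noindex">` in the page's `<head>` (or the equivalent `X-Robots-Tag: noindex` HTTP header). Keep in mind that the page must remain crawlable for the crawler to see that directive, so don't also block it in robots.txt.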