This example robots.txt blocks most of the known bots that you might not want to crawl your website.

Bots, spiders, and other crawlers hit your website to copy or download your content, or even the entire site.

They tend to send too many requests within a very short time, which can cause heavy resource (memory and CPU) usage and slow down your site.

A robots.txt file sits at the root of your website. You can specify user agents in the robots.txt to ask certain bots not to crawl certain pages, as shown in the sketch below.
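
For example, a minimal robots.txt might look like the following. The bot names here are just illustrations; match them against the actual User-agent strings you see in your server logs, and note that Crawl-delay is a non-standard directive that only some crawlers honor.

    # Ask one specific crawler to stay away from the entire site
    User-agent: MJ12bot
    Disallow: /

    # Ask another crawler to skip one directory only
    # (the /private/ path is just an example)
    User-agent: AhrefsBot
    Disallow: /private/

    # All other bots: allow everything, but request a pause
    # between requests (Crawl-delay is not honored by every bot)
    User-agent: *
    Disallow:
    Crawl-delay: 10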

Limitations of robots.txt

Not all crawlers are kind enough to obey robots.txt. Any bot can ignore your robots.txt and shamelessly scan your website anyway.

That doesn't necessarily mean adding a robots.txt won't help you at all. Many site scanners and copiers do obey robots.txt, and I think it makes sense to reduce this bad traffic as much as possible.

Credit: Wikipedia Robots.txt