Google, Bing, and other search engines use bots to crawl our sites, but if SEOs use a bot to create content or exchange links, it's considered a form of spam.
Some users report that Google's crawler has brought down websites by opening multiple concurrent connections and overwhelming a shared host's ability to respond to requests.
Solution? Block the bots.
For years, webmasters have wanted to get their websites in front of as many people as possible and to raise their rankings in the search engines. But what if a new trend is developing: a private internet, where reverse IP bans are common and webmasters block search engines from crawling their sites?
Google and the other major search engines do offer a legitimate option: use a robots.txt file to tell their crawlers how to crawl (or not crawl) your site.
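As a sketch of what that looks like, here is a minimal robots.txt placed at the root of a site. The first example blocks every well-behaved crawler; the second allows only Googlebot while blocking the rest (the choice of which user agents to allow is, of course, up to you):

```
# Block all compliant crawlers from the entire site
User-agent: *
Disallow: /

# --- or, allow only Googlebot ---
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
```

Note that robots.txt is purely advisory: polite bots honor it, but it does nothing against crawlers that ignore the protocol, which is why some webmasters resort to IP-level bans instead.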
Some resources on robots.txt: