You can try using a robots.txt file in the root directory of the site. It can be used either to keep bots out of certain parts of the site (use Disallow) or to slow them down while they crawl (use Crawl-delay and/or Request-rate).

You can also restrict crawling to specific times of day with Visit-time. Note that Crawl-delay, Request-rate, and Visit-time are non-standard extensions, so support varies from crawler to crawler.
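Here's a sketch of what such a file might look like. The path is a hypothetical example, and the non-standard directives (Crawl-delay, Request-rate, Visit-time) are only honored by some crawlers:

```
User-agent: *
# Keep all compliant bots out of this (hypothetical) path
Disallow: /private/

# Non-standard: wait 10 seconds between requests
Crawl-delay: 10

# Non-standard: fetch at most 1 page every 5 seconds
Request-rate: 1/5

# Non-standard: only crawl between 01:00 and 04:45 UTC
Visit-time: 0100-0445
```

Lines beginning with # are comments, and the file must be served as plain text at the site root (e.g. example.com/robots.txt) for crawlers to find it.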

Not all bots respect robots.txt, and some obey only certain commands in it, but it's worth a shot if you're being hit by one of the bigger search engine bots; they are usually quite well behaved.

Handy robots.txt file generator: