Modifying httpd.conf Settings

chaddro

New Member
Yesterday when my server hit 100% (again) I checked top through ssh and I had about a dozen httpd processes running, each using over 12% of memory.

I checked to see what was going on and it turns out google, yahoo, msn and another bot all decided to crawl my site at the same time :eek:!

I waited till it looked like they were done, then rebooted the server (is there a better way to clear things?) and that put the server back at 75% memory usage with httpd threads using about 7% memory.

So.... I am wondering if anyone has experience in optimizing settings in httpd.conf. I am making headway in tuning mysql using the tuning-primer.sh script, so I figure working on httpd.conf is next.
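For anyone in the same boat, the usual place to start is the prefork MPM section of httpd.conf. The values below are only illustrative starting points for a low-memory box, not recommendations for your server; the rough rule of thumb is MaxClients ≈ (RAM available to Apache) / (average httpd process size as shown in top):

```apache
# Illustrative prefork settings for a memory-constrained server.
# Tune MaxClients to your RAM divided by the per-process size from top,
# so a bot swarm can't spawn more httpd processes than memory allows.
<IfModule prefork.c>
    StartServers          5
    MinSpareServers       5
    MaxSpareServers      10
    MaxClients           40
    MaxRequestsPerChild 4000
</IfModule>
```

Capping MaxClients means extra requests queue instead of forcing the box into swap, and MaxRequestsPerChild recycles processes so memory leaks don't accumulate.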

-cj
 

Steve B

New Member
Since the problem seems to be the number of bots spidering your sites at the same time, why not slow them down a little to ease the load? I believe this can be done through the robots.txt file by adding something like...

Crawl-delay: xx

Where xx is the delay in seconds between requests.
Be aware that not all spiders obey this directive, but the likes of Google, Yahoo and MSN should.
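One detail worth noting: Crawl-delay applies to the User-agent group it sits under, so a complete robots.txt entry looks something like this (10 seconds is just an illustrative value):

```
# Ask Yahoo's and MSN's crawlers to wait 10 seconds between requests
User-agent: Slurp
Crawl-delay: 10

User-agent: msnbot
Crawl-delay: 10
```

The file goes in your site's document root as /robots.txt. Since support for the directive varies by crawler, check each bot's documentation for how (or whether) it honors it.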
 