Reduce server load by blocking bad bots (and help stop spammers and sploggers)
If you're like me, you're probably on a shared hosting account because:
- You don't have a static IP or a spare computer to serve your sites from home
- You can't afford a dedicated server.
On my second hosting account, at HostGator, I noticed that I was using many more files than I actually have in my website folders on my computer. I realized the extra files were mostly my website cache: a cache file is created when a dynamic page loads, so the next time the page is requested the cache is served instead of querying the database for everything.
What is a Bad Bot?
I had read in several forums that it is a good idea to block "bad bots". Bad bots are internet spiders that read and copy your content for splogs (spam blogs), or that harvest e-mail addresses to spam. Neither is good, so I ...
- Added a robots.txt file
- A few days later, added a trap page at a path blocked in my robots.txt
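For reference, a minimal robots.txt along these lines might look like the following; the `/bot-trap/` path is a hypothetical example, not the path I actually used:

```
# Well-behaved crawlers honor this and never visit the trap
User-agent: *
Disallow: /bot-trap/
```

Bad bots that ignore the Disallow line are exactly the ones that will follow the hidden link into the trap.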
How does a Bad Bot Trap work?
You add a hidden link on your pages pointing to a page that is blocked in robots.txt, and you hide that link from human visitors. When a "bad bot" visits your website, it ignores your robots.txt instructions and follows the link. When it lands on the forbidden page, your script registers its IP address in your .htaccess file to deny it further access to your website.
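As a minimal sketch of the trap page's server-side logic, assuming the trap runs as a Python script (the original setup was likely PHP, and the `.htaccess` path and function names here are hypothetical):

```python
import datetime
import os

HTACCESS = ".htaccess"  # hypothetical path; adjust to your site root

def block_ip(ip, htaccess_path=HTACCESS):
    """Append a Deny rule for a trapped IP, skipping duplicates."""
    rule = "Deny from %s" % ip
    if os.path.exists(htaccess_path):
        with open(htaccess_path) as f:
            if rule in f.read():
                return False  # already blocked
    with open(htaccess_path, "a") as f:
        # note when the bot was trapped, then write the actual deny rule
        f.write("# bot trapped %s\n%s\n" % (datetime.date.today(), rule))
    return True
```

The hidden link itself can be something humans never see, e.g. an anchor styled with `display:none`, and the resulting `.htaccess` entries are plain lines like `Deny from 203.0.113.9`.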
Does a Bad Bot Trap block all Splog and Spam Bots?
While a bad bot trap does not block every spammer's or splogger's bot, it does get rid of many. Spammers and sploggers aren't very ethical, so most never bother to make their bots check your robots.txt file to see what is allowed.
Because I had added some code to e-mail me whenever a bad bot was blocked, I got regular notices. Many days brought at least one e-mail about a bad bot being blocked!
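The notification step can be sketched like this, assuming a Python setup; the address and SMTP host are hypothetical placeholders:

```python
import smtplib
from email.message import EmailMessage

ADMIN = "you@example.com"  # hypothetical address; use your own

def build_notice(ip, admin=ADMIN):
    """Build the notification e-mail for a newly blocked IP."""
    msg = EmailMessage()
    msg["Subject"] = "Bad bot blocked: %s" % ip
    msg["From"] = admin
    msg["To"] = admin
    msg.set_content("The bot trap just added %s to .htaccess." % ip)
    return msg

def send_notice(ip, host="localhost"):
    # hypothetical SMTP host; shared hosts usually run a local mail server
    with smtplib.SMTP(host) as server:
        server.send_message(build_notice(ip))
```

Calling `send_notice(ip)` right after the IP is written to `.htaccess` gives you a running log of what the trap is catching.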
About a week after adding my bot trap, I went to my HostGator control panel to open phpMyAdmin to make some changes to a database, and I noticed the sidebar showing a huge drop in my file count.
Although I had been trying to eliminate unnecessary files, and had combined a few include files into one file of functions, that wouldn't explain the huge drop I saw.
I don't have exact numbers, but according to HostGator my file count went down by about a third. The only explanation I can think of is that blocking bad bots kept many pages from being loaded into the cache. That saves plenty of space and significantly reduces server load without hurting real visitors or removing features.
Before you upgrade your hosting account, try adding a bot trap; it may be all you need.