I want to identify and block all non-human visitors to my websites (crawlers, clickbots, spambots, spybots, etc.).
Preferably a script (all my sites run PHP) that checks the user agent, or whatever else needs to be checked, and redirects non-human traffic to hell :)
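For what a script like that might look like, here is a minimal PHP sketch that denies requests whose User-Agent matches known bot signatures. The pattern list is purely illustrative, and note the caveat: this only catches bots that identify themselves honestly, so malicious clickbots that spoof a browser User-Agent will need additional detection (behavioral checks, honeypot links, etc.).

```php
<?php
// Minimal sketch: deny requests whose User-Agent matches common bot signatures.
// The pattern list below is illustrative, not exhaustive.
$botPattern = '/bot|crawl|spider|slurp|curl|wget|scrapy/i';

$userAgent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

// A missing or empty User-Agent is also suspicious; treat it as a bot.
if ($userAgent === '' || preg_match($botPattern, $userAgent)) {
    header('HTTP/1.1 403 Forbidden');
    exit('Access denied.');
}
// ...normal page rendering continues here for human visitors...
```

Dropping this at the top of a shared include (or invoking it via `auto_prepend_file`) would apply the check site-wide without touching every page.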
If you are bidding on this project, please explain how you plan to make sure that my sites only receive human traffic.
You must have experience with clickbots - the PPC ads on my sites are being clicked by bots, and the advertisers are pretty mad about it, so I want to make sure this doesn't happen again.
5 freelancers are bidding on average $153 for this job
I would use a small script to keep known crawlers and bots in a database, then use .htaccess to divert those bots away from the content you are trying to protect. Mike [url removed, login to view]
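The .htaccess side of what Mike describes could be sketched as the Apache mod_rewrite fragment below. The bot names are placeholder examples; in his setup the condition line would presumably be regenerated from the database of known bots.

```apache
# Example .htaccess rules (Apache mod_rewrite) - bot names are illustrative.
RewriteEngine On
# Match any of the listed substrings in the User-Agent, case-insensitively.
RewriteCond %{HTTP_USER_AGENT} (badbot|evilcrawler|spambot) [NC]
# Return 403 Forbidden and stop processing further rules.
RewriteRule .* - [F,L]
```

This blocks at the web-server layer, before PHP runs, which keeps the overhead per request low.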