I want to crawl a single site without the site realizing how many requests we are sending to it. We have a single server, and I would like to run multiple crawlers (e.g., 5 or more) with a different IP address for each, AND have the IP addresses the site sees change every minute. Further, each IP address used should have a clean history (i.e., no prior activity associated with it).
I am open to using a commercial software package (e.g., [url removed, login to view]), or to a strategy that involves buying many IP addresses and rotating through them.
We have a single server, but to the site it must appear that requests are coming from 5 or more different computers, each of which stops after 1 minute, while in reality the crawlers run continuously.
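To illustrate the kind of rotation I have in mind, here is a minimal sketch in Python. It assumes a pre-purchased pool of proxy IPs; the `ProxyRotator` class and the proxy addresses are illustrative names, not part of any specific product. Each of the 5 crawlers would hold its own rotator over a disjoint slice of the pool, so the concurrent identities never share an IP and every identity switches addresses once per minute.

```python
import time


class ProxyRotator:
    """Cycle through a pool of proxy addresses, switching every `interval` seconds.

    The crawler calls current() before each request; the address it gets back
    advances through the pool once per interval and wraps around at the end.
    """

    def __init__(self, proxies, interval=60, clock=time.monotonic):
        self._proxies = list(proxies)
        self._interval = interval
        self._clock = clock          # injectable clock, useful for testing
        self._start = clock()

    def current(self):
        # Index advances once per interval, wrapping around the pool,
        # so the site sees a new IP from this crawler every minute.
        elapsed = self._clock() - self._start
        idx = int(elapsed // self._interval) % len(self._proxies)
        return self._proxies[idx]


# Hypothetical pool of purchased IPs, split so no two crawlers share one.
pool = [f"10.0.0.{i}" for i in range(1, 16)]
crawler_rotators = [ProxyRotator(pool[i::5]) for i in range(5)]
```

The actual HTTP fetching would then route each request through `rotator.current()` as its proxy; that part depends on whichever crawling library is chosen.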