Run multiple crawlers with different IP addresses from a single server

I want to crawl a single site without the site realizing how many requests we are sending it. We have a single server, and I would like to run multiple crawlers (e.g., 5+) with a different IP address for each, AND have the IP addresses the site sees change every minute. Further, each IP address should start with a clean search history.

I am open to using a commercial software package (e.g., [url removed, login to view]) or to a strategy involving buying many IP addresses and rotating them.

We have a single server, but it would have to appear to the target site that the requests come from 5+ different computers, each of which stops after 1 minute — while in reality the crawlers run constantly.
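For reference, the strategy above could be sketched roughly as follows in Python. This is only an illustration under stated assumptions: the proxy addresses are placeholders for IPs you would buy or rent, and the pool size, worker count, and request pacing are arbitrary choices, not part of the original spec.

```python
import threading
import time
import requests

# Hypothetical proxy pool -- stand-ins for purchased IP addresses;
# none of these endpoints are real.
PROXIES = [
    "http://203.0.113.1:8080",
    "http://203.0.113.2:8080",
    "http://203.0.113.3:8080",
    "http://203.0.113.4:8080",
    "http://203.0.113.5:8080",
]

ROTATION_SECONDS = 60  # switch the exit IP every minute
NUM_CRAWLERS = 5       # assumed worker count ("5+ crawlers")

def proxy_for(worker: int, slot: int) -> str:
    """Deterministically assign each worker a proxy for a given time slot,
    so no two workers share an IP within the same minute (pool permitting)."""
    return PROXIES[(worker + slot) % len(PROXIES)]

def fresh_session(proxy: str) -> requests.Session:
    """A new Session has an empty cookie jar, so every rotation starts
    with a clean history as far as the target site can tell."""
    s = requests.Session()
    s.proxies = {"http": proxy, "https": proxy}
    return s

def crawler(worker: int, url: str) -> None:
    while True:
        slot = int(time.time() // ROTATION_SECONDS)
        session = fresh_session(proxy_for(worker, slot))
        # Reuse this session only until the minute rolls over.
        while int(time.time() // ROTATION_SECONDS) == slot:
            session.get(url, timeout=10)
            time.sleep(1)  # pace requests to avoid an obvious burst

def main(url: str) -> None:
    # Each crawler runs constantly, but the site sees a given IP
    # for at most one minute before it rotates away.
    for w in range(NUM_CRAWLERS):
        threading.Thread(target=crawler, args=(w, url), daemon=True).start()
```

Because the proxy assignment is a pure function of worker index and time slot, rotation needs no shared state between the threads; each crawler independently discards its session (cookies included) on the minute boundary.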

Skills: Linux, Python


About the Employer:
( 46 reviews ) San Francisco, United States

Project ID: #196478

5 freelancers are bidding on average $230 for this job


Hi, please check PM.

$249 USD in 3 days
(5 Reviews)

We have a very intelligent & reliable solution for this. Please check PM.

$200 USD in 1 day
(7 Reviews)

We are experienced crawler coders. Please check our previous projects and portfolio.

$250 USD in 7 days
(3 Reviews)

Hello. I already have a multi-threading spider. Please read PM.

$150 USD in 1 day
(2 Reviews)

Hi, check PMs.

$300 USD in 10 days
(0 Reviews)