Need this script made in under 24 hours. Please message me if you are proficient with these technologies.
Freelancers, I have a PHP script that I'm using to extract product details from Amazon. Now I want to add proxy support using third-party proxy providers. [url removed, login to view] or [url removed, login to view] Only apply if you have previous experience with this.
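The posting asks for PHP/cURL, but the core idea, routing every request through an authenticated third-party proxy, can be sketched briefly. This is a minimal Python sketch; the proxy host, port, and credentials below are placeholders you would replace with values from the proxy provider's dashboard.

```python
import urllib.request

# Placeholder proxy URL -- substitute the host, port, and credentials
# supplied by your third-party proxy provider.
PROXY = "http://user:pass@proxy.example.com:8080"

# Route both HTTP and HTTPS traffic through the proxy.
handler = urllib.request.ProxyHandler({"http": PROXY, "https": PROXY})
opener = urllib.request.build_opener(handler)

# opener.open("https://example.com/product/123") would now fetch the
# product page via the proxy instead of the server's own IP.
```

In PHP the equivalent knobs are `CURLOPT_PROXY` and `CURLOPT_PROXYUSERPWD` on the existing cURL handle, so the change to the scraper itself is small.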
I will share a WSDL file. You have to create a PHP file named [url removed, login to view] in which you make only one request (GetStudent) and fetch the response. After that, you provide that [url removed, login to view] file; once I have tested it, I will release the payment. 1 hour of quick work. The WSDL file is SOAP 1.2 and uses wsHttpBinding with authentication. one
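Since the WSDL itself isn't shared in the posting, here is a hedged sketch of the wire-level shape of such a call: a SOAP 1.2 request is a plain HTTP POST of an XML envelope with content type `application/soap+xml`. The service namespace, parameter name, and endpoint URL below are all placeholders; the real ones come from the WSDL, and the wsHttpBinding credentials would go in a WS-Security header, omitted here.

```python
import urllib.request
import xml.etree.ElementTree as ET

SOAP12_NS = "http://www.w3.org/2003/05/soap-envelope"
# Placeholder -- take the real value from the WSDL's targetNamespace.
SERVICE_NS = "http://example.com/studentservice"

def build_get_student_envelope(student_id: str) -> bytes:
    """Build a SOAP 1.2 envelope for the GetStudent operation."""
    envelope = ET.Element(f"{{{SOAP12_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP12_NS}}}Body")
    op = ET.SubElement(body, f"{{{SERVICE_NS}}}GetStudent")
    # "StudentId" is a hypothetical parameter name.
    ET.SubElement(op, f"{{{SERVICE_NS}}}StudentId").text = student_id
    return ET.tostring(envelope, encoding="utf-8")

payload = build_get_student_envelope("42")
request = urllib.request.Request(
    "https://example.com/StudentService.svc",  # placeholder endpoint
    data=payload,
    headers={"Content-Type": "application/soap+xml; charset=utf-8"},
)
# urllib.request.urlopen(request) would send it and return the response.
```

In PHP the built-in `SoapClient` with `'soap_version' => SOAP_1_2` hides most of this envelope construction.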
I have one PHP-based website ([url removed, login to view]); here are the website resource details: Operating System: Linux; Web Server: Apache; PHP Version: 5.4+; PHP allow_url_fopen; PHP GD Library; PHP Multibyte String; PHP cURL; MySQL: 5.0+. My website has bulk product upload in XML format, but I want ...
...a simple conversion, and you might be wasting your time if you don't have the required skills. The HTML I need converted is for a login to a third-party site (doesn't use cURL) and contains hidden fields that are essential for the login to work correctly. Here is the HTML: Client Login
I have a server with 250 IPs, but I need a script that can proxify all of those IPs and have each one listen on its own port. [url removed, login to view] [url removed, login to view] [url removed, login to view] [url removed, login to view] I sell proxies, so I will then need to create a user for each customer I have. So let's say a customer named "johnappleseed" orders 10...
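The posting doesn't name a proxy daemon, so here is only the bookkeeping part sketched in Python: assign each external IP its own listening port, then carve per-customer slices out of the pool. The starting IP, base port, and allocation scheme are all assumptions; the actual daemon (3proxy, squid, etc.) would consume these pairs in its own config syntax.

```python
from ipaddress import ip_address

def build_bindings(first_ip: str, count: int, base_port: int = 10000):
    """Give each external IP its own listening port.

    Returns a list of (listen_port, external_ip) pairs, one per IP.
    """
    start = ip_address(first_ip)
    return [(base_port + i, str(start + i)) for i in range(count)]

def allocate_order(bindings, start_index: int, quantity: int):
    """Hand a customer a contiguous slice of the pool,
    e.g. 10 proxies for "johnappleseed"."""
    return bindings[start_index:start_index + quantity]

# 198.51.100.1 is a documentation-range placeholder for the real first IP.
pool = build_bindings("198.51.100.1", 250)
order = allocate_order(pool, 0, 10)
```

Per-customer authentication would then be layered on top, either one proxy user per customer or an ACL restricting each port to that customer's source IPs.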
I want someone to create a script and many proxy servers that automatically query a search engine of my choosing using a list of keyword inputs that I give at a set rate. This project is simply for research purposes and should be easy if you have done something like this before.
The project is for 3 forms on 3 different websites. Normally, users fill in their info and submit the forms. What I want is a program in PHP that automatically fills in the info (the PHP code, not a human), like first name, last name, email, phone number, "agree with conditions", etc. (the info is different on each form), submits the form, and
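Mechanically, an automated form submission is just a POST of URL-encoded fields to the form's action URL. The sketch below is in Python rather than the requested PHP, and the field names and endpoint are placeholders read from each form's HTML; real forms often also carry hidden fields (CSRF tokens and the like) that must be scraped from the page first.

```python
import urllib.parse
import urllib.request

# Field names are placeholders -- each of the 3 forms will use its own.
fields = {
    "firstname": "Jane",
    "lastname": "Doe",
    "email": "jane@example.com",
    "phonenumber": "5550100",
    "agree": "on",  # checkbox value for "agree with conditions"
}

# Encode exactly as a browser would for a standard POST form.
body = urllib.parse.urlencode(fields).encode("ascii")
request = urllib.request.Request(
    "https://example.com/signup",  # placeholder form action URL
    data=body,
    method="POST",
)
# urllib.request.urlopen(request) would submit the form.
```

The PHP equivalent is `curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($fields))` against the same action URL.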
I am writing a web scraper using Java and NodeJS. I am at a point where I want to scrape Google Search, and it can only be done behind a proxy. I am looking for sample code written in Java and NodeJS that uses a proxy (e.g. SquidProxies) to start making requests.
Hi. ...they could be made available to the end user but via another path. I could do the project myself, but I am in a time crunch. I would suggest Java or Python, built with web services or Python APIs. Please let me know what you can come up with. If required, we could have a Skype call to clarify the concepts. Thank you
... We require a stratum mining proxy; concurrent/simultaneous connections to multiple pools will be open. It can be done in C++ or Python; the main goal is a stable, working stratum proxy capable of relaying high volumes of hashing power. Here are our functionality requirements for the stratum proxy: - The stratum proxy does not require getwork
I need someone to teach me how to authorize multiple user/password pairs or multiple IPs (ACL) on my squid3 proxy server on my Ubuntu 14.04 server with an attached /24. I know how to set up squid3, but it won't accept multiple users or IP authorization.
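For reference, both requirements from this posting can live side by side in `squid.conf`. The fragment below is a sketch, assuming the helper paths of the Ubuntu 14.04 `squid3` package and a documentation-range placeholder for the attached /24; multiple users go in an htpasswd file, and the subnet gets a `src` ACL.

```
# /etc/squid3/squid.conf -- multiple users via basic auth, plus an IP ACL.
# Create the password file with:
#   htpasswd -c /etc/squid3/passwords user1   (first user)
#   htpasswd /etc/squid3/passwords user2      (each additional user)
auth_param basic program /usr/lib/squid3/basic_ncsa_auth /etc/squid3/passwords
auth_param basic realm proxy
acl authed proxy_auth REQUIRED

# The attached /24, allowed by source address (placeholder range).
acl customer_net src 203.0.113.0/24

http_access allow authed
http_access allow customer_net
http_access deny all
```

Order matters in squid: `http_access` rules are evaluated top to bottom, so the `deny all` must come last.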
...(I will handle that part.) The script you build will have 3 functions. 1st function: the script will cycle through the database, logging in to each user using cURL (not the API), access the account settings, and copy the email address, bio, and location to the database, associating them with the currently logged-in user. Now our database
I need a PHP file that can submit via cURL to an endpoint. So 2 files: 1. [url removed, login to view] - this file will allow a user to populate fields and use cURL to send them via POST to an endpoint ([url removed, login to view]) 2. [url removed, login to view] - this file will need to extract all of the data array fields in the [url removed, login to view] cURL POST an...