I want a Google SERP crawler in Node.js. The database is MySQL for now.
Basically the workflow is
1. Input data - Keyword | target ([url removed, login to view], [url removed, login to view], ...) | Datacenter | Search parameters | localization | last check | interval
2. Get the SERP
3. Extract data
Input data comes from a MySQL table, or possibly from a queueing system like beanstalkd.
I provide a development environment with test data. You get full root access to it.
The basic work to fetch the SERP is already done in a GitHub project:
[url removed, login to view]
What needs to be done:
Automate it - get the input data, crawl it, parse it, and write the results.
Proxy support needs to be implemented.
Error handling needs to be improved.
I have made a workflow cheat sheet. Ask me and I will send it to you as a PM.
The due date is 7 days after the project is assigned to you.
Please: only place a bid if you already have time available and you want experience in Node.js.
The budget is low at the start, but it can be increased.
The project needs to be maintained and updated after the start, so an extra budget is available for monthly maintenance!
After this Node.js project, I have three more, bigger Node.js projects.
If you do well on this project, then you get the follow-up projects and the monthly maintenance budget.