We scrape thousands of data points per second in Node.js and POST them to a Node.js server.
I want a better architecture for my server. It needs to handle several hundred concurrent requests properly and serve that data to mobile clients on demand.
1. The server receives those curl/POST requests via the API and inserts the data into MongoDB or MySQL at runtime using Node.js scripts. Would Redis be a better option here?
We receive more than a thousand requests a second. Which mechanism and architecture handles that load best? Let me know your thoughts; I am open to any other solutions.
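One common way to survive a thousand inserts a second is to stop doing one database round-trip per request and batch writes instead. Below is a minimal sketch of that pattern; the class name `BatchWriter`, the `flushSize` parameter, and the mock `insertMany` callback are all illustrative assumptions, not from the original post. In production the in-memory queue could be replaced by a Redis list (`LPUSH` from the API handler, a worker draining it), with the real MongoDB `collection.insertMany` as the sink.

```javascript
// Sketch: buffer incoming documents and flush them to the database
// in bulk, so 1000 req/s becomes ~10 bulk inserts/s at flushSize=100.
class BatchWriter {
  constructor(insertMany, flushSize = 100) {
    this.insertMany = insertMany; // stand-in for a real driver call
    this.flushSize = flushSize;
    this.queue = [];
  }

  // Called once per incoming POST; O(1), no DB round-trip.
  push(doc) {
    this.queue.push(doc);
    if (this.queue.length >= this.flushSize) this.flush();
  }

  // One bulk insert instead of flushSize single inserts.
  flush() {
    if (this.queue.length === 0) return;
    const batch = this.queue;
    this.queue = [];
    this.insertMany(batch);
  }
}

// Demo with a mock insertMany that just records each batch.
const batches = [];
const writer = new BatchWriter((docs) => batches.push(docs), 100);
for (let i = 0; i < 250; i++) writer.push({ seq: i });
writer.flush(); // drain the partial tail batch (50 docs)
```

With 250 pushes and a flush size of 100, this produces two full batches of 100 and one final batch of 50. The same shape works whether the sink is MongoDB, MySQL, or a Redis-backed queue consumed by a separate worker.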
2. [url removed, login to view] is required on the DataGrabber app: when a new row is inserted, we emit specific data (say, the most recent 50 rows) to some other client app.