I need to parse data from about 30 websites and import it into a spreadsheet (Excel, LibreOffice Calc, etc.) for further use. The websites are mostly plain HTML (often, presumably, rendered by PHP). Some sites require an input first to generate the page that needs to be parsed.
Roughly 5,000 data records have to be parsed from these 30 websites (no copyright infringement is involved).
I don't care which programming language is used; it just has to be a program or script that goes to the specific domains www . something . com, grabs the data I want, and saves it. The saved records eventually have to be sorted, or their format adjusted, so that all records follow the same structure.
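For bidders sizing up the work, the scrape-and-normalize step might look roughly like the sketch below. Everything here is an assumption for illustration: the table markup, the class name "records", and the column names (id, name, price) are made up, and each real site would need its own parsing rules (plus an HTTP fetch, e.g. with urllib.request, and a POST for the sites that require input first).

```python
import csv
import io
from html.parser import HTMLParser

# Hypothetical page fragment standing in for one of the 30 sites.
SAMPLE_HTML = """
<table class="records">
  <tr><td>SKU-001</td><td>Widget</td><td>4.99</td></tr>
  <tr><td>SKU-002</td><td>Gadget</td><td>12.50</td></tr>
</table>
"""

class TableParser(HTMLParser):
    """Collects the text of every <td> cell, grouped by <tr> row."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_td = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_td = True

    def handle_endtag(self, tag):
        if tag == "td":
            self._in_td = False
        elif tag == "tr" and self._row:
            self.rows.append(self._row)

    def handle_data(self, data):
        if self._in_td:
            self._row.append(data.strip())

def records_to_csv(html: str) -> str:
    """Parse rows out of the HTML and return them as normalized CSV text."""
    parser = TableParser()
    parser.feed(html)
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["id", "name", "price"])  # assumed common header
    writer.writerows(parser.rows)
    return buf.getvalue()

print(records_to_csv(SAMPLE_HTML))
```

CSV is used here only because every spreadsheet program mentioned above can open it; the same rows could just as well be written to .xlsx or .ods with a suitable library.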
To be clear: the project should deliver the final spreadsheet with all the data records, not the program or script that does the job.
I would prefer this small project to be done by someone with a broader understanding of database creation, storage management, order processing, barcodes, EDI (856/ASN), and logistics & shipping. The idea is that whoever gets the job and does good work could be in line for more sophisticated projects (Linux/Ubuntu).
For more information or any questions, please send a PM. I'll try to answer ASAP so that you get a better picture of what this is all about and a good basis for making a reasonable bid!
16 freelancers are bidding on average $377 for this job
Consider it done..... the right developer at your door... I hope to earn another positive feedback from you !!! Please check your inbox. "Always willing to walk extra miles to achieve Excellence"
I have built my own scraping framework for parsing metadata from websites into an XML database, with Excel export among other features. This can be done fast, and the data will be high quality.