We have a huge dataset (around 10GB after unzipping). We want to host it on an MS SQL server.
- We need help setting it up to begin with. Our vendor uploads the data to a secure site. We need a robust process that checks for updates, downloads the file, unzips it, and then updates the SQL table. The data is delivered multiple times a month. The process should continuously watch for a complete file posted on the vendor's site and then trigger the rest of the pipeline.
- The data needs to be indexed for faster querying.
- Provide us with an MS SQL Server Management Studio interface to this server.
- We want standard techniques used throughout the process so that we can manage things on our own once it is set up.
- We need support to debug any issues we might face, though we don't expect any.
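The watch-download-unzip-load workflow described above can be sketched as a simple polling script. Everything specific here is an assumption: the vendor URL, working directory, polling interval, server name, and the `MyDb.dbo.VendorData` table are placeholders, and the load step uses the `bcp` command-line utility that ships with SQL Server (SSIS or `BULK INSERT` would be equivalent choices). "Complete file" is detected by waiting for the remote file's size to stop growing between polls, which is one common heuristic when the vendor provides no done-marker.

```python
# Sketch of the polling ETL loop. All URLs, paths, server and table
# names below are placeholders -- adjust to the real environment.
import os
import time
import zipfile
import subprocess
import urllib.request

VENDOR_URL = "https://vendor.example.com/data/latest.zip"  # placeholder
WORK_DIR = "/data/etl"                                     # placeholder
POLL_SECONDS = 3600                                        # check hourly

def remote_size(url: str) -> int:
    """Return the Content-Length of the remote file via a HEAD request."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return int(resp.headers["Content-Length"])

def is_complete(prev_size: int, curr_size: int) -> bool:
    """Treat the upload as complete once its size has stopped growing."""
    return prev_size == curr_size and curr_size > 0

def download(url: str, dest: str) -> None:
    urllib.request.urlretrieve(url, dest)

def unzip(archive: str, out_dir: str) -> list:
    """Extract the archive and return the names of the extracted files."""
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(out_dir)
        return zf.namelist()

def bulk_load(flat_file: str) -> None:
    """Load a flat file into SQL Server with bcp (table name assumed)."""
    subprocess.run(
        ["bcp", "MyDb.dbo.VendorData", "in", flat_file,
         "-S", "sqlserver-host", "-T", "-c", "-t,"],
        check=True,
    )

def poll_loop() -> None:
    prev = -1
    while True:
        curr = remote_size(VENDOR_URL)
        if is_complete(prev, curr):
            archive = os.path.join(WORK_DIR, "latest.zip")
            download(VENDOR_URL, archive)
            for name in unzip(archive, WORK_DIR):
                bulk_load(os.path.join(WORK_DIR, name))
            prev = -1  # reset and wait for the next delivery
        else:
            prev = curr
        time.sleep(POLL_SECONDS)

if __name__ == "__main__":
    poll_loop()
```

In production this would typically run as a Windows scheduled task or SQL Server Agent job rather than a long-lived loop, and the indexes from the requirement above would be created once on the target table (and possibly disabled/rebuilt around very large loads).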
26 freelancers are bidding on average $602 for this job
I am interested in the project and I have handled large datasets in the past. I need some clarifications and will send you a message with my queries. Thanks for looking!
I have extensive experience that meets the qualifications, and would be happy to complete this work for you. We can speak via telephone, Skype, Yahoo IM, AIM, MSN Messenger, or IRC chat. Thank you, Sean Brady
Hi npradeep, I have developed something similar to this, though at a smaller scale, and I am confident I can get your project fulfilled. Sincerely yours, John
I have over 20 years' experience working with Microsoft SQL Server, based in the UK, New Zealand and Australia. I have worked with databases ranging in size from 10 GB through to 4 TB.
ETL (Extract, Transform and Load) process expert: more than 5 years developing this kind of solution (fully automated) with MS SQL Server. Project completion guaranteed.