Bash Scripting - Copy Files from Amazon S3 and Server

This project was successfully completed by freelance4hire80 for $166 USD in 5 days.

Project Budget: $30 - $250 USD
Completed In: 5 days

Project Description

I need two bash scripts written:

1. We store many large video files on Amazon S3. I want a script that copies a batch of files (possibly 100+) from S3 to a folder on our Linux server (which happens to be an Amazon EC2 instance). I have used s3cmd successfully before but am open to alternative methods. The file list would be supplied in a separate .txt file (.csv if needed). The source S3 bucket and the destination folder should be easy-to-change variables. I am not sure how many files should be copied concurrently or what would be most efficient; the freelancer would need to advise on whether, and how, to limit the number of concurrent transfers. Some of the files could be 15 GB each, so the copy method must support multipart transfer (s3cmd offers this). The script also needs to generate a report at the end listing successful transfers and any files that were not located.
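One way the first script could be structured is sketched below, under stated assumptions: the transfer command is kept pluggable (defaulting to `s3cmd get`, so the batching logic can be tested without AWS credentials), `xargs -P` caps the number of concurrent transfers, and each result is appended to a report of successes and missing files. All names here (`batch_copy`, `copy_one`, `S3_GET`) are illustrative, not from the posting.

```shell
#!/usr/bin/env bash
# Sketch of a batch S3 download with a concurrency cap and a report.
# S3_GET is a pluggable transfer command (assumption): it defaults to
# s3cmd's single-object download, but can be overridden (e.g. with cp)
# to exercise the logic locally.
S3_GET="${S3_GET:-s3cmd get --force}"

copy_one() {
    # $1 = source prefix (e.g. s3://bucket), $2 = key,
    # $3 = destination dir, $4 = report file
    if $S3_GET "$1/$2" "$3/" >/dev/null 2>&1; then
        echo "OK $2" >> "$4"
    else
        echo "MISSING $2" >> "$4"
    fi
}

batch_copy() {
    # $1 = file list (one key per line), $2 = source prefix,
    # $3 = destination dir, $4 = max concurrent transfers, $5 = report file
    local list="$1" src="$2" dest="$3" jobs="$4" report="$5"
    mkdir -p "$dest"
    : > "$report"                      # start a fresh report
    export -f copy_one
    export S3_GET
    # GNU xargs: -a reads keys from the list file, -P caps how many
    # transfers run at once, -I{} substitutes one key per invocation.
    xargs -a "$list" -P "$jobs" -I{} \
        bash -c 'copy_one "$1" "$2" "$3" "$4"' _ "$src" {} "$dest" "$report"
    # Summarise the report: counts of successful and missing files.
    printf '%s copied, %s missing\n' \
        "$(grep -c '^OK' "$report")" "$(grep -c '^MISSING' "$report")"
}
```

A call might look like `batch_copy files.txt s3://my-bucket /data/videos 4 report.txt`; whether a cap of 4 concurrent transfers is optimal depends on the instance's bandwidth, which is exactly the tuning question the posting raises. Note that s3cmd's multipart support is chiefly documented for uploads, so the multipart requirement for 15 GB downloads is worth confirming with the tool's own documentation.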

2. The second script is similar to #1 above but would copy from a folder on the same server running the script. These would be mostly image files, so not as large as the files on S3. It should use the same variables as above (source and destination) and produce a final report listing successes and files that could not be found.
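The local variant can follow the same interface with plain `cp`; since the files are smaller, a simple sequential loop may suffice and avoids the concurrency question entirely. Again, the function and variable names are illustrative assumptions, not part of the posting.

```shell
#!/usr/bin/env bash
# Sketch of the local copy script: same list/source/destination/report
# interface as the S3 version, but using cp sequentially.

local_batch_copy() {
    # $1 = file list (one filename per line), $2 = source dir,
    # $3 = destination dir, $4 = report file
    local list="$1" src="$2" dest="$3" report="$4" f
    mkdir -p "$dest"
    : > "$report"                      # start a fresh report
    while IFS= read -r f; do
        [ -z "$f" ] && continue        # skip blank lines in the list
        if cp "$src/$f" "$dest/" 2>/dev/null; then
            echo "OK $f" >> "$report"
        else
            echo "MISSING $f" >> "$report"
        fi
    done < "$list"
    printf '%s copied, %s missing\n' \
        "$(grep -c '^OK' "$report")" "$(grep -c '^MISSING' "$report")"
}
```

Invocation mirrors the first script, e.g. `local_batch_copy images.txt /srv/uploads /data/images report.txt`, and the report format stays identical so both scripts can share any downstream processing.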

I am not sure what budget this work requires.
