I run a high-traffic website with roughly 1000 GB of user content (mostly images, music, and videos), currently hosted on a single server.
I need a way to distribute all this content across several FTP accounts so that:
1) they all end up mounted under a single local directory;
2) files are stored with replicas, so that if one account goes down, nothing is lost or becomes temporarily unavailable;
3) the existing website code can edit and delete files as if everything were local;
4) we can link to the content so that HTTP users download it directly from the hosting sites (over HTTP) instead of from our current server.
In other words, I need a distributed FTP file system.
I have been playing with curlftpfs and FUSE but wasn't able to get a working setup.
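For reference, the kind of setup I was attempting looked roughly like this. This is only a sketch: the hostnames, credentials, and mount paths are placeholders, and mhddfs is just one possible union layer (unionfs-fuse would be another).

```shell
# Mount each FTP account as a local directory via FUSE (curlftpfs).
# Hosts, users, and passwords below are placeholders.
mkdir -p /mnt/ftp1 /mnt/ftp2
curlftpfs -o allow_other ftp://user1:pass1@ftp1.example.com/ /mnt/ftp1
curlftpfs -o allow_other ftp://user2:pass2@ftp2.example.com/ /mnt/ftp2

# Combine the FTP mounts into one directory tree with a union
# filesystem so the website code sees a single local path.
mkdir -p /mnt/storage
mhddfs /mnt/ftp1,/mnt/ftp2 /mnt/storage -o allow_other
```

Note that this only gives a single mount point; it does not replicate files across accounts, which is requirement 2 and the main part I need help with.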
Please no junk bids. Budget is $100. Please include your linux / distributed computing / distributed storage / similar experience in your bid.