I have a number of large (100GB-400GB) files stored on various EBS volumes in AWS. I need local copies of these files for offline use. I am wary of attempting to scp files of this size down from AWS. I've considered splitting the files into smaller pieces and reassembling them once they have all arrived successfully. But I wonder if there is a better way. Any thoughts?
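For reference, the split-and-reassemble approach you describe is only a few lines of shell. The sketch below uses a small stand-in file so it can be run as-is; substitute your real file name and a larger chunk size. Checksumming the original before splitting lets you verify the reassembled copy:

```shell
# Demo with a 10 MiB stand-in file; substitute your real file name.
head -c 10485760 /dev/urandom > bigfile

sha256sum bigfile > bigfile.sha256      # record a checksum of the original
split -b 1M bigfile bigfile.part.       # use a larger chunk size (e.g. -b 1G) for real files

# ...transfer the pieces one at a time, then reassemble on the receiving side:
cat bigfile.part.* > bigfile.rebuilt    # split's suffixes sort lexically, so order is preserved
cmp bigfile bigfile.rebuilt && echo "reassembled copy matches"
```

The main drawback is that you have to babysit the individual transfers yourself, which is why the options below are usually less painful.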
There are multiple ways; here are some:
- Copy your files to S3 and download them from there. S3 is built for large transfers: objects can be fetched in byte ranges, failed requests can be retried, and tools like the AWS CLI handle multipart transfers for you.
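If you go the S3 route, the AWS CLI does the multipart work automatically; the bucket name and paths below are placeholders:

```shell
# On the EC2 instance: push the file to S3. The CLI switches to
# multipart upload automatically for large files.
# "my-transfer-bucket" is a placeholder bucket name.
aws s3 cp /data/bigfile s3://my-transfer-bucket/bigfile

# On your local machine: pull it down. If the transfer dies partway,
# simply re-run the command.
aws s3 cp s3://my-transfer-bucket/bigfile ./bigfile
```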
- Use rsync instead of scp. rsync is more reliable than scp and, with --partial (included in -P), can resume an interrupted download where it left off:
rsync -azvP remote-ec2-machine:/dir/iwant/to/copy /dir/where/iwant/to/put/the/files
- Create a private torrent for your files. If you're using Linux, mktorrent is a good utility for this: http://mktorrent.sourceforge.net/
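A minimal mktorrent invocation might look like this; the tracker URL and file name are placeholders, and the private flag keeps clients from announcing the torrent over DHT or public trackers:

```shell
# -p: set the private flag; -a: announce URL of your (private) tracker;
# -o: output .torrent file. All names here are placeholders.
mktorrent -p -a http://tracker.example.com:6969/announce -o bigfile.torrent bigfile
```

You would then seed from the EC2 instance and download with any BitTorrent client locally; the protocol's piece-level checksums give you resume and integrity verification for free.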