I have a development server on which I want to keep certain data directories in sync with the live server. Any files in these directories which exist on the live server, but not on the dev server, I need to copy over.
Because I don't care about syncing from dev to live, and I'm not worried about overwriting data on dev, I could just copy the entire directory over with SCP. The problem is that this folder is HUGE (several GB) and I want to sync it nightly; copying the whole thing would never finish in time.
What's the simplest/best way (via SSH) to sync/copy missing files from the remote directory to the local directory?
Note: I'd prefer not to install any new/custom tools if I can help it; I'm also looking for something I can set up and run pretty quickly & easily, without a lot of fussing or scripty-ness! :)
Put rsync in, coach. It was made for this game.
It should do pretty much exactly what you need: for big files it only transfers the pieces that changed, and by default it skips files whose size and modification time already match on both ends (pass --checksum if you want it to compare file hashes instead). It runs over SSH out of the box, so there's no tunnel to script around. It should already be installed on most sane Linux systems, so you shouldn't need to install anything extra.
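A minimal sketch of the nightly pull, assuming the data lives under /var/data on both boxes and you can SSH to the live server as deploy (host, user, and paths are all placeholders):

    # pull anything on live that's new or changed down to dev;
    # -a preserves permissions/ownership/times, -z compresses in transit, -v is verbose
    rsync -avz deploy@live.example.com:/var/data/ /var/data/

The trailing slashes matter: they mean "the contents of this directory" rather than the directory itself. If you really only want files that are missing on dev and never want to re-send changed ones, add --ignore-existing. Drop that one line into a nightly cron job and you're done.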
I have to agree with Shane: rsync. Depending on how long a sync takes and how much storage space you have available, you could keep both foldername and foldername.new, and sync into foldername.new several times a day. When it is time to go "live", you can then just copy the files from the new folder into the live folder, with either cp or a local rsync; see the sketch below.
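A rough sketch of that staging approach, again with made-up host and paths:

    # run from cron several times a day: sync from live into the staging copy
    rsync -avz deploy@live.example.com:/var/data/ /var/data.new/

    # at cutover, fold the staging copy into the "live" dev directory locally
    rsync -av /var/data.new/ /var/data/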
Look into the options on rsync, as it can be tuned to handle large files faster (for example --whole-file to skip the delta algorithm on a fast link, or --partial so an interrupted transfer can be resumed). Also, if you are moving many small files, you may want to roll them into a .tar/.tar.gz before you move them.
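For the many-small-files case, one way to do the initial copy (hosts and paths are placeholders) is to stream a tarball over SSH instead of copying file by file:

    # pack the remote directory into a single compressed stream and unpack it locally
    ssh deploy@live.example.com 'tar czf - -C /var/data .' | tar xzf - -C /var/data

That avoids the per-file overhead for the first big transfer; the nightly rsync then only has to deal with the differences.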