I have a rather big directory on one server (over 4000 files) which I'd like to copy to another server (which holds a previous version of this directory). rsync is the first option, but while it runs it leaves the destination folder in a half-updated state for a rather long time (more than a minute).
I'd like to do it a bit differently:
1. gzip the source folder
2. scp the archive to the destination server
3. gunzip the file there
4. delete the archive at the source and the destination

What is the best way to accomplish all this?
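The four steps can be sketched locally like this (all paths are stand-ins, and a plain cp stands in for the scp hop between the servers; note that gzip on its own cannot archive a directory, so tar has to do the bundling):

```shell
set -e
# Local sketch of the four steps; paths are stand-ins, and cp
# stands in for the scp hop between servers.
mkdir -p /tmp/demo/src_host /tmp/demo/dst_host
echo "hello" > /tmp/demo/src_host/file.txt

# 1. archive + compress (gzip alone cannot bundle a directory; tar does)
tar -C /tmp/demo -czf /tmp/demo/src_host.tar.gz src_host
# 2. copy the archive to the other server (scp in real life)
cp /tmp/demo/src_host.tar.gz /tmp/demo/dst_host/
# 3. unpack it there
tar -C /tmp/demo/dst_host -xzf /tmp/demo/dst_host/src_host.tar.gz
# 4. delete the archive on both sides
rm /tmp/demo/src_host.tar.gz /tmp/demo/dst_host/src_host.tar.gz
```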
rsync has --delay-updates, which seems to be what you need:

«…»
The fastest way, if you have the space, is to rsync twice: keep two copies of the files on the destination machine. First do a remote rsync to update the inactive copy, then do a local rsync to update the active copy from the inactive copy.

Alternatively, rsync to a cold copy, then just change a symlink and delete the former active copy.
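The symlink variant might look like this (the layout and names are made up). Building a fresh link and renaming it over the old one with mv -T makes the switch a single atomic rename:

```shell
set -e
# Hypothetical layout: copy_a and copy_b are the two trees, and
# "current" is the symlink the web server actually serves from.
mkdir -p /tmp/site/copy_a /tmp/site/copy_b
ln -sfn copy_a /tmp/site/current
echo "new release" > /tmp/site/copy_b/index.html   # the remote rsync lands here

# Swap: build a new symlink, then rename it over the old one.
# mv -T (GNU coreutils) makes the switch one atomic rename(2).
ln -sfn copy_b /tmp/site/current.new
mv -T /tmp/site/current.new /tmp/site/current
```

After the swap, the former active copy (copy_a here) can be deleted or reused as the inactive copy for the next deploy.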
Stop the web service on the target server for 10 minutes, do the update in any way you feel is sensible (rsync is fine), then start the web service again.
1-2-3:

tar -cf - <source folder> | gzip -c | ssh <destination server> "gzip -d | tar -xf -"

(tar -c -O is unreliable here; -f - explicitly writes the archive to stdout.) Everything is streamed, so no archive file is ever written on either side. For step 4, if you also want to delete the source folder (directories need the -r flag):

rm -r <source folder>