I am currently setting up a website with a CMS that holds millions of assets, mostly images of various sizes. I would like to back these files up on a regular basis, for example weekly. I have an FTP storage mounted on my machine that I can copy to. Yesterday I thought about using rsync to simply copy only the new files, but rsync seems to take very long to crawl through all the files — essentially it takes ages because the FTP storage is painfully slow. So I ended up with the idea of finding only the recently modified (last 24 h) files in my assets directory and copying just those to the FTP, to minimize the load on it. I am new to bash scripting; Google has already helped me, and I came up with the following parts:
```shell
#!/bin/bash
Source="/my/source/folder"
Destination="/my/slow/ftp/"
ls find $Source -mtime -1 -ls
do
cp -a $Source $Destination
done
```
What am I missing? Can you help me finish it?
If my idea of backing up only the delta is not optimal, feel free to suggest something else.
You can do this in one go with `find`. For tasks like this where time precision is necessary, use the `-mmin` option of `find` to express the time constraint in minutes, instead of `-mtime`, which expresses it in days. This will copy files modified within the last 24 hours (1440 minutes), counting from now, from `/source` to `/destination`.
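The command the answer refers to appears to have been lost in formatting; it was presumably along these lines — a sketch using GNU `find` and `cp`, where the temporary directories below stand in for `/source` and `/destination`:

```shell
# Placeholder directories standing in for the real source and FTP mount.
src=$(mktemp -d)
dst=$(mktemp -d)
touch "$src/fresh.jpg"                  # modified just now
touch -d '2 days ago' "$src/stale.jpg"  # modified 2 days ago

# Copy only regular files modified less than 1440 minutes (24 h) ago,
# preserving attributes; "-exec ... +" batches many files per cp call.
find "$src" -type f -mmin -1440 -exec cp -a -t "$dst" {} +

ls "$dst"   # only fresh.jpg was copied
```

Note that `cp -t DEST` (destination first) is a GNU coreutils extension; on BSD systems you would use `-exec cp -a {} "$dst" \;` instead.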
To copy to a remote server, use `rsync`, as it will resume any partial transfer (unlike `scp`):
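The `rsync` invocation itself is missing from the answer as rendered; one way it could look, as a sketch with placeholder paths (for a remote target you would replace the destination with something like `user@host:/path/`), is to feed `rsync` only the recently changed files via `--files-from`, so it never has to crawl the whole tree:

```shell
# Placeholder directories standing in for the assets tree and the target.
src=$(mktemp -d)
dst=$(mktemp -d)
touch "$src/fresh.jpg"                  # modified just now
touch -d '2 days ago' "$src/stale.jpg"  # modified 2 days ago

cd "$src"
# -printf '%P\n' emits paths relative to the starting point, which is
# exactly the form --files-from expects; "-" reads the list from stdin.
find . -type f -mmin -1440 -printf '%P\n' |
    rsync -a --files-from=- . "$dst"
```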