I've written a backup script in Python that uses tar (via subprocess) to do incremental backups of my files. Some of the full backups are rather big (e.g. my picture folder) and take multiple hours to finish (over the network to my NAS), so I'm worried about what happens when the server/PC is shut down or rebooted during the backup.
I guess the script will get a TERM signal on shutdown/reboot. Does tar handle the interruption of an incremental backup gracefully, so that the next call to tar will successfully add all missing files without retransmitting all the previously archived ones? If it does not, what does this mean for my tar file and for the snapshot file used by tar (with the --listed-incremental option)?
The exact command I use is:
tar -vcpzf <target_file> --no-check-device -g <target_snapshot_file> <dir_to_backup>
Note: On my PC I plan to run the script via anacron; the script itself checks whether a backup is due when it is executed.
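For what it's worth, the script can observe the TERM signal itself and forward it to tar, so that tar is not left orphaned at shutdown. This is only a sketch; the function name and structure are my own, not taken from the actual script:

```python
import signal
import subprocess
import sys

def run_backup(cmd):
    """Run the given tar command line and forward SIGTERM to it.

    'cmd' would be the tar invocation from the question, e.g.
    ["tar", "-vcpzf", target_file, "--no-check-device",
     "-g", snapshot_file, dir_to_backup].
    """
    proc = subprocess.Popen(cmd)

    def on_term(signum, frame):
        proc.terminate()  # pass the TERM signal on to tar
        proc.wait()
        sys.exit(143)     # conventional exit status for "killed by SIGTERM"

    signal.signal(signal.SIGTERM, on_term)
    return proc.wait()
```

Forwarding the signal doesn't make the interruption safe by itself, but it gives tar the chance to stop at a well-defined point instead of being killed later in the shutdown sequence.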
As tar doesn't include an index, your output tar file will be truncated ("premature end of archive") but still usable. However, since the snapshot file is only generated at the end of the archival process, it will be empty, so your next increment will actually be a complete backup.