As per this answer to the question "Can I continue working when backup is in progress?," the default backup software in Ubuntu, Deja Dup, is prone to consistency issues. That is, if the user changes files on a writable filesystem while the backup is running, the backup may not end up in a sensible state: restoring it in full could restore a broken system, and even restoring individual files might restore broken files.
The advice not to do any work while the backup runs is a workaround, but not a practical one: a user "not working" doesn't mean files aren't being modified behind the scenes.
The answer I linked to says this about snapshots:
> This can be achieved using LVM or a newer generation filesystem such as BtrFS. This will snapshot the whole volume as if you were taking a picture of it. Changes/writes are still possible, but the backup process is being run from the read-only snapshot taken earlier.
For example, on Windows systems most backup software runs on a "shadow copy," a semantically consistent snapshot of the filesystem maintained by the OS. The user can continue working, and further modifications will not be part of the currently running backup.
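To make the mechanism concrete, here is a rough sketch of a snapshot-based backup on LVM, run as root. The volume group and volume names (`vg0`, `home`), the snapshot size, and the duplicity target are assumptions to be replaced with your own; this illustrates the technique the quoted answer describes, not anything Deja Dup does for you.

```shell
# Create a copy-on-write snapshot of the live volume. The 5G is space
# reserved for blocks that change while the snapshot exists.
lvcreate --snapshot --size 5G --name home-snap /dev/vg0/home

# Mount the frozen view read-only and back it up from there;
# users can keep writing to the live /home the whole time.
mkdir -p /mnt/home-snap
mount -o ro /dev/vg0/home-snap /mnt/home-snap
duplicity /mnt/home-snap file:///backup/home

# Discard the snapshot once the backup has finished.
umount /mnt/home-snap
lvremove -f /dev/vg0/home-snap
```

The backup tool itself (duplicity here, but rsync or tar work the same way) never sees the live, changing filesystem, only the consistent snapshot.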
However, I don't know whether Deja Dup on Ubuntu takes advantage of any kind of snapshotting, even when it's available.
My question is: is there any automated backup software for Ubuntu that:
- is easy to use, via a GUI, suitable for users unwilling to use the command line or write their own scripts
- ensures consistency of backups, whether by using snapshots, by making the backup source read-only for the duration (heavy-handed but perfectly valid), or by any other approach that avoids inconsistency?
Unless you are a 24/7 shop, generally speaking you have the server sign everyone off at, say, 2 a.m. and run backups for two hours or however long it takes. Usually this is done in conjunction with "End of Day Processing," which rolls detail records up into master-file records, closes month-ends, year-ends, etc.
The basic problem is that some files will "change shape" while someone is adding or deleting records. Take, for example, ISAM (Indexed Sequential Access Method) files, where there is a raw data file and then a separate index file for each key (e.g. customer number, phone number). If you were to back up the key file for customer numbers first, then a user added a new customer, and you then backed up the raw data file, you would have a data-integrity error.
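This race is easy to reproduce with two plain files standing in for the raw data file and its key file (the filenames and record layout here are invented for illustration):

```shell
#!/bin/bash
live=$(mktemp -d)
backup=$(mktemp -d)

# A toy "ISAM" store: raw records plus a separate key file.
printf 'C001,Alice\nC002,Bob\n' > "$live/customers.dat"
cut -d, -f1 "$live/customers.dat" > "$live/customers.idx"

# Naive backup, step 1: copy the key file.
cp "$live/customers.idx" "$backup/"

# Meanwhile a user adds a new customer...
printf 'C003,Carol\n' >> "$live/customers.dat"
cut -d, -f1 "$live/customers.dat" > "$live/customers.idx"

# Naive backup, step 2: copy the raw data file (now newer than the
# key file we already copied).
cp "$live/customers.dat" "$backup/"

# The backup is internally inconsistent: a record exists in the data
# file that the backed-up key file has never heard of.
comm -13 "$backup/customers.idx" <(cut -d, -f1 "$backup/customers.dat")
```

The final command prints `C003`: a customer present in the backed-up data but missing from the backed-up index, exactly the integrity error described above.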
SQL databases are the popular choice these days, in which case an SQL dump is performed; the indexes need not be backed up, since they can be rebuilt from the data. Learning SQL (pronounced "sequel," or spelled out as S-Q-L) has been on my to-do list for 30+ years.
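For the common open-source databases, the standard dump tools can already produce a consistent snapshot without signing everyone off; for example (the database name `mydb` and connection details are placeholders):

```shell
# MySQL/MariaDB with InnoDB tables: --single-transaction dumps a
# consistent snapshot while other sessions keep reading and writing.
mysqldump --single-transaction mydb > mydb.sql

# PostgreSQL: pg_dump runs inside a transaction snapshot by default,
# so the dump is consistent even on a busy server.
pg_dump mydb > mydb.sql
```

This is the database-level analogue of a filesystem snapshot: the dump reflects one moment in time, regardless of concurrent activity.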
Whatever your environment you have to study backup requirements carefully and test them periodically by restoring them to a test database.
Backing up programs is generally overkill, because they can be reinstalled. The exception is scripts you have developed yourself.
To reiterate: the safest backups are those taken when all users are logged out of the system. When that can't be guaranteed, professional help should be sought.