What is the best way to do Subversion backups (on a Debian-based server)?
Is it to use svnadmin?
svnadmin dump /path/to/reponame > reponame.dump
Or maybe just to tar the dir where the repositories are?
tar -cvzf svn.backup.tar.gz /var/subversion/
What are the pros and cons of the above?
Thanks Johan
Update: This is a small server with only a handful of repos, so incremental backups are probably not needed; I think it is better to focus on keeping it simple.
Update: I used packs' wrapper script (which in turn wraps svn-hot-backup) to do a full backup and then did a full recovery on another clean computer. However, I removed the "SVN_HOTBACKUP_NUM_BACKUPS=10" part since it was not working for me.
Please note that I found it fairly simple, and the result was very close to just tarring the dir. But as Manni pointed out here, using svn-hot-backup / "svnadmin hotcopy" is a more reliable method, since tar could create a corrupt backup from time to time if you are unlucky.
Look for the svn-hot-backup script. It ships with Subversion and contains all the logic to do what you want, plus automagic rolling out of old backups. I have written a wrapper script along these lines (slightly modified here to be more general) that uses svn-hot-backup and runs as a nightly cronjob to back up a single server with multiple repositories.
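The original script is not reproduced here; what follows is only a minimal sketch of the idea, assuming the repositories live under /var/subversion and the backups go to /var/backups/svn (both paths and the repository check are assumptions, not part of the original script):

#!/bin/sh
# Rough sketch of a nightly cron wrapper around svn-hot-backup (paths assumed).
REPOS_ROOT=/var/subversion
BACKUP_DIR=/var/backups/svn

for repo in "$REPOS_ROOT"/*; do
    # crude check that the directory actually is a Subversion repository
    if [ -f "$repo/format" ]; then
        svn-hot-backup "$repo" "$BACKUP_DIR"
    fi
done

On some systems the script is installed as hot-backup.py rather than svn-hot-backup.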
Have you seen the documentation on this?
Basically, you have two options:
svnadmin dump
svnadmin hotcopy
Simply making a copy of the directory is not an option because your repository might change while the copy is being made.
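For example, a hotcopy of one of the repositories above might look like this (the destination path is an assumption):

svnadmin hotcopy /var/subversion/reponame /var/backups/svn/reponame

Unlike a plain cp or tar of a live repository, hotcopy produces a consistent snapshot even if commits arrive while it runs.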
Whether you are into incremental or full backups depends on your amount of paranoia, the size of your repository, your needs and your infrastructure.
I recommend SVNBackup because it can do incremental backups.
Why is this important? Well, if you have a large development team, back up Subversion only once a day, and your system fails 12 hours after the last backup, the whole working day is lost.
If you do full backups (which svnadmin hotcopy produces) many times a day, you put unnecessary load on your repository machine, which will irritate impatient developers.
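If you would rather stay with the stock tools, svnadmin dump can also run incrementally; a rough sketch, with the starting revision assumed to be wherever your last backup ended:

svnadmin dump /var/subversion/reponame --incremental -r 1001:HEAD > reponame-incr.dump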
As a bonus, I also recommend BackupPC as a backup solution. It can do incremental remote backups and can save a lot of space if you are backing up identical files on different systems.
I use svnsync to back up to an otherwise read-only repository, which is itself backed up with aged copies (day, week, month).
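A rough sketch of setting up such a mirror with svnsync, with all paths assumed; svnsync must be allowed to set revision properties on the mirror, hence the permissive pre-revprop-change hook:

svnadmin create /var/backups/svn-mirror
printf '#!/bin/sh\nexit 0\n' > /var/backups/svn-mirror/hooks/pre-revprop-change
chmod +x /var/backups/svn-mirror/hooks/pre-revprop-change
svnsync initialize file:///var/backups/svn-mirror file:///var/subversion/reponame
svnsync synchronize file:///var/backups/svn-mirror

After the one-time initialize, only the synchronize step needs to run from cron.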
You can make incremental backups with svnadmin if you wish. If you go the tar route, you should run hot-backup.py before making your tar archive.
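A sketch of that combination, with paths assumed:

hot-backup.py /var/subversion/reponame /var/backups/svn
tar -czf svn.backup.tar.gz -C /var/backups/svn .

The tar then archives the consistent hotcopy rather than the live repository.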
Here's an article about backing up svn repos. Anyway, reading the SVN book is a good starting point, as mentioned before.
I back up several 100 GB+ svn repositories with plain old rsync.
svnadmin dump and svnadmin hotcopy would take days on repositories that size. One other thing to watch: svnadmin dump doesn't back up locks or hook scripts.
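As a sketch, with the backup host and paths assumed (note that rsyncing a live repository can catch it mid-commit, so rsync a hotcopy, or run it when the repository is quiet, if you need a guaranteed-consistent copy):

rsync -a --delete /var/subversion/ backuphost:/var/backups/svn/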
Here's what I do with my repositories: use a folder backup service like Dropbox (here's a link to their Linux version). You simply make the Dropbox folder the root of your repository (or even above it) and it gets backed up every time a file changes. Not only will it be available across computers, but you'll also be able to access it online and keep past versions of it.
There are several such online backup services - most are free up to 2GB.