You can use pg_dump. First, set up the .pgpass file, which contains the passwords to be used if the connection requires one.
This file should have lines of the following format:
hostname:port:database:username:password
Each of the first four fields may be a literal value or *, which matches anything. For example: *:*:*:postgres:pg_password.
The .pgpass file must reside in your home directory (~/), and its permissions must disallow any access to world or group; achieve this with the command
chmod 0600 ~/.pgpass
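Putting it together, a minimal sketch (the hostname, database name, and password here are placeholders, not values from your setup):

    echo 'localhost:5432:mydb:postgres:pg_password' >> ~/.pgpass
    chmod 0600 ~/.pgpass
    # pg_dump now reads the password from ~/.pgpass instead of prompting
    pg_dump -h localhost -p 5432 -U postgres mydb > mydb.sql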
Try AutoPostgreSQLBackup. It is a single script file, can easily be configured to your needs, does daily, weekly, and monthly scheduling, and logs via email, a log file, or stdout.
If it's a reasonably small database and the requirements are as low as one backup a day, just run pg_dump from cron to dump to a local file, then use whatever you already use to back up files on that machine to archive the dump away.
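A minimal sketch of that approach (database name, user, and path are assumptions; adjust to your setup):

    # crontab entry: nightly dump at 02:00 in compressed custom format
    0 2 * * * pg_dump -U postgres -Fc mydb > /var/backups/mydb.dump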
Try astrails-safe. It knows how to back up MySQL (mysqldump), PostgreSQL (pg_dump), or plain files (tar), with encryption (GnuPG) and upload to S3/SFTP.
Run pg_dumpall from cron.
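For example (user, schedule, and path are placeholders):

    # crontab entry: dump the whole cluster nightly, compressed and date-stamped
    30 1 * * * pg_dumpall -U postgres | gzip > /var/backups/pg_all_$(date +\%F).sql.gz

Note that % must be escaped as \% inside a crontab entry.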
pg_rman is a new tool that offers incremental backups; it works with PostgreSQL 8.4 or newer.
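A rough sketch of the workflow (the backup catalog path is an assumption; check the pg_rman documentation for the exact options in your version):

    pg_rman init -B /var/backup/pg_rman          # one-time catalog initialization
    pg_rman backup --backup-mode=full -B /var/backup/pg_rman
    pg_rman backup --backup-mode=incremental -B /var/backup/pg_rman
    pg_rman validate -B /var/backup/pg_rman      # backups must be validated before restore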
pg_dump is a nice solution, but if you are trying to back up a lot of data, continuous archiving may help:
http://www.postgresql.org/docs/8.1/static/backup-online.html
It is in fact a kind of 'raw' WAL logging, but it can be useful as an incremental backup method...
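The basic ingredients, per those docs (archive and data directory paths are placeholders):

    # postgresql.conf: ship each completed WAL segment to an archive directory
    archive_mode = on              # 8.3+; on 8.1, setting archive_command alone enables archiving
    archive_command = 'cp %p /mnt/server/archivedir/%f'

Then take a periodic base backup that the archived WAL can replay forward from:

    psql -c "SELECT pg_start_backup('weekly');"
    tar czf /backup/base.tar.gz /var/lib/postgresql/data   # your data directory may differ
    psql -c "SELECT pg_stop_backup();"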
This is a script that will back up each database individually, as well as the often-forgotten but important PostgreSQL globals and user login info.
The point is to take advantage of compression, which pg_dumpall does not provide, and to capture the data that pg_dump ignores.
This requires a .pgpass or similar setup, as described here: http://wiki.postgresql.org/wiki/Pgpass
This is set up for OS X, but simply change the program paths and it will work fine.
It backs up to /sqlbackups. The script reports the backup directory's size and contents, and has breakpoints that return a non-zero status if any step fails. I used it in combination with pgAgent to do daily backups.
Script redacted, sorry about that :(
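Since the script itself is gone, here is a minimal sketch of what the answer describes (paths, user, and error handling are assumptions, not the redacted original):

    #!/bin/bash
    # Illustrative reconstruction, not the redacted original.
    BACKUPDIR=/sqlbackups
    # Globals (roles, tablespaces) that per-database dumps omit:
    pg_dumpall -U postgres --globals-only | gzip > "$BACKUPDIR/globals.sql.gz" || exit 1
    # Each database individually, in compressed custom format:
    for db in $(psql -U postgres -At -c "SELECT datname FROM pg_database WHERE NOT datistemplate"); do
        pg_dump -U postgres -Fc "$db" > "$BACKUPDIR/$db.dump" || exit 1
    done
    du -sh "$BACKUPDIR"   # report the backup directory size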
Why settle for a daily backup when you can easily have point-in-time recovery with Barman?
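The day-to-day commands look roughly like this (assuming a server named main is already configured in barman.conf; see the Barman docs):

    barman backup main                 # take a base backup; WAL is archived continuously
    barman list-backup main            # show available backups
    # restore the latest backup, replaying WAL up to a chosen moment:
    barman recover --target-time '2015-03-01 12:00:00' main latest /path/to/restore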
As others have said: pg_dumpall.
Also, take a look at log shipping. Then you can get point-in-time backups which you can play back: http://www.postgresql.org/docs/8.3/static/runtime-config-wal.html
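Playing the shipped logs back means restoring a base backup and pointing the server at the archive, roughly like this (pre-9.0 style, matching the 8.3 docs above; paths and timestamp are placeholders):

    # recovery.conf in the restored data directory
    restore_command = 'cp /mnt/server/archivedir/%f %p'
    recovery_target_time = '2015-03-01 12:00:00'   # stop replay at this point in time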
Or how about the section on backups in the manual:
http://www.postgresql.org/docs/8.3/static/backup.html