As a measure to mitigate catastrophes such as a malicious entity gaining full access to our AWS account and deleting everything, I am in the process of setting up offsite backups for our production database. We use PostgreSQL on AWS RDS, with daily RDS backups enabled.
I came up with the following two options:
1) (daily) Download the database snapshot generated by RDS (though I'm not sure this is scriptable, or even possible)
2) (daily) Spin up a read replica and use pg_dump on it to back up the database
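For what it's worth, option 2 can be scripted with the AWS CLI. Here is a rough sketch, assuming the instance identifier, database name, and backup user are placeholders you'd replace with your own:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of option 2: create a temporary read replica,
# pg_dump it, then delete it. All identifiers below are placeholders.
set -euo pipefail

# Pure helper: date-stamped dump file name (no AWS calls).
dump_name() {
  echo "$1-$(date +%Y-%m-%d).dump"
}

offsite_dump() {
  local replica_id="mydb-offsite-replica"   # placeholder identifier

  # 1. Create a read replica of the production instance.
  aws rds create-db-instance-read-replica \
    --db-instance-identifier "$replica_id" \
    --source-db-instance-identifier "mydb-production"

  # 2. Wait until the replica is reachable.
  aws rds wait db-instance-available --db-instance-identifier "$replica_id"

  # 3. Look up the replica endpoint and dump it in pg_dump's custom format.
  local host
  host=$(aws rds describe-db-instances \
    --db-instance-identifier "$replica_id" \
    --query 'DBInstances[0].Endpoint.Address' --output text)
  pg_dump --host="$host" --username=backup_user --format=custom \
    --file="$(dump_name mydb)" mydb

  # 4. Delete the replica so it only exists for the duration of the dump.
  aws rds delete-db-instance --db-instance-identifier "$replica_id" \
    --skip-final-snapshot
}
```

You'd call `offsite_dump` from a daily cron job and then ship the resulting file off-account. Dumping from a throwaway replica keeps pg_dump's load off the production instance.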
What would be the best way to achieve this?
If it's any help, we already have a tool that does daily backups of our S3 buckets.
One option you have is to copy your RDS snapshots to another AWS account. This is a commonly-used practice.
The process is described here:
https://aws.amazon.com/blogs/aws/amazon-rds-update-cross-account-snapshot-sharing/
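In CLI terms, the sharing step looks roughly like the sketch below. The account ID and snapshot names are placeholders; note that automated snapshots can't be shared directly, so you copy one to a manual snapshot first:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of sharing an RDS snapshot with a second AWS account.
# Account ID and snapshot identifiers are placeholders.
set -euo pipefail

# Pure helper: manual snapshot name for a given date (no AWS calls).
manual_snapshot_name() {
  echo "mydb-offsite-$1"
}

share_latest_snapshot() {
  local target_account="123456789012"   # placeholder backup-account ID
  local today manual
  today=$(date +%Y-%m-%d)
  manual=$(manual_snapshot_name "$today")

  # Copy the automated snapshot to a manual one (automated snapshots
  # cannot be shared directly; the source name here is a placeholder).
  aws rds copy-db-snapshot \
    --source-db-snapshot-identifier "rds:mydb-production-${today}" \
    --target-db-snapshot-identifier "$manual"
  aws rds wait db-snapshot-available --db-snapshot-identifier "$manual"

  # Grant the backup account permission to restore from the snapshot.
  aws rds modify-db-snapshot-attribute \
    --db-snapshot-identifier "$manual" \
    --attribute-name restore \
    --values-to-add "$target_account"
}
```

From the second account you would then run `aws rds copy-db-snapshot` on the shared snapshot so you own an independent copy that survives the first account being compromised.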
I also wanted an offsite backup of my AWS RDS database, along with the code, logs, etc. I mean to document the whole process some time, but it's really pretty simple to work out.
2017 Update
I stopped using Attic since it's no longer under active development or support. I now use Borg Backup, a fork of Attic, for deduplicated backups.
The Borg backups are stored on my file system, and I sync those files to AWS S3 using S3sync. If I were setting this up now I'd consider using S3FS to write the backups directly to S3, though S3FS isn't a full file system, so it might not work so well for this case.