I need help creating a backup script for Rackspace Cloud Sites.
The sites need daily backups of both the website and their MySQL databases (some sites have as many as 4 databases); the backups then need to be FTP'd off-site.
It appears that I can only achieve this via cron jobs. Rackspace shows an example backup script, but it only shows how to push files to Cloud Files storage (see below). What is the FTP syntax to push the created file to a remote server?
EDIT: Got it working. The full script is below.
webroot is /mnt/stor1-wc1-dfw1/381384/****.*******.com
#!/bin/sh
webroot="YOUR WEBROOT"
db_host="YOUR DB HOST"
db_user="YOUR DB USERNAME"
db_password="YOUR DB PASSWORD"
db_name="YOUR DB NAME"
ftphost="YOUR FTP HOST"
user="YOUR FTP USERNAME"
password="YOUR FTP PASSWORD"
# Set the date and name for the backup files
date=$(date '+%F-%H-%M')
backupname="backup.$date.tar.gz"
# Dump the MySQL database
mysqldump -h "$db_host" -u "$db_user" --password="$db_password" "$db_name" > "$webroot/db_backup.sql"
# Back up the site
tar -czpvf "$webroot/sitebackup.tar.gz" "$webroot/web/content/"
# Compress the DB and site backups into one file
tar --exclude 'sitebackup' --remove-files -czpvf "$webroot/$backupname" "$webroot/sitebackup.tar.gz" "$webroot/db_backup.sql"
# Upload your files via FTP
ftp -n "$ftphost" <<END
user $user $password
bin
lcd $webroot
put $backupname
quit
END
# After your backup has been uploaded, remove the tarball from the filesystem.
rm "$webroot/$backupname"
You could use lftp.
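A minimal sketch of an lftp upload, reusing the variable names from the script above (the host, credentials, and paths shown here are placeholder values):

```shell
#!/bin/sh
# Hypothetical stand-in values; in the real script these come from the
# variables defined at the top.
ftphost="ftp.example.com"
user="backupuser"
password="secret"
webroot="/var/www/site"
backupname="backup.2024-01-01.tar.gz"

# One lftp invocation replaces the whole ftp heredoc. Building and
# echoing the command makes this a dry run -- eval "$cmd" would run it.
cmd="lftp -u $user,$password $ftphost -e \"put $webroot/$backupname; bye\""
echo "$cmd"
```

Unlike the stock ftp client, lftp takes its commands on the command line via -e, which avoids the heredoc entirely.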
Although you may be safer using lftp with sftp.
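A sketch of the sftp variant (host and user names are placeholders; no password appears because authentication is handled by an SSH key pair):

```shell
#!/bin/sh
# Hypothetical stand-in values.
sftphost="backup.example.com"
user="backupuser"
webroot="/var/www/site"
backupname="backup.2024-01-01.tar.gz"

# lftp's sftp backend authenticates with the invoking user's SSH key,
# so no credentials appear in the script. Echoing the command makes
# this a dry run -- eval "$cmd" would run it.
cmd="lftp -e \"put $webroot/$backupname; bye\" sftp://$user@$sftphost"
echo "$cmd"
```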
This will use passphraseless SSH keys to authenticate and transfer the file over sftp. Either way, you want to ensure that the user you use for backups has as little privilege as possible to do the required job. With a suitably modern sshd, you can lock the user down on the remote server by adding a Match block to the /etc/ssh/sshd_config file and restarting sshd. This only allows the named user to use sftp within the /somedir tree.
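The sshd_config stanza in question might look like this (a sketch; "backupuser" and /somedir are placeholder names):

```
Match User backupuser
    ChrootDirectory /somedir
    ForceCommand internal-sftp
    AllowTcpForwarding no
    X11Forwarding no
```

Note that sshd requires the ChrootDirectory itself to be owned by root and not writable by any other user or group.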
The script in the question should be saved as a .sh file; then, in the Cloud Sites features pane, add a cron job referencing that file name.
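For comparison, if you were managing cron directly on a server rather than through the Cloud Sites pane, the equivalent crontab entry might look like this (the path and schedule are placeholders):

```
# Run the backup script every day at 02:00
0 2 * * * /bin/sh /path/to/backup.sh >/dev/null 2>&1
```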