I'm planning to back up multiple websites to Glacier at midnight each day.
My current system is to create a tar.gz of the whole of /var/www (i.e. all the sites), and to copy that to a different server.
I'm assuming I would use these backups if the whole server failed, though perhaps I'd occasionally want to retrieve individual files.
Should I continue in the same way (one tar.gz of everything per day, in a single vault), switch to a separate tar.gz per site in the same vault, or a separate tar.gz per site, each in its own vault?
The "best of both worlds" approach would be the following: upload your nightly archives to S3, and add a lifecycle rule that transitions objects to Glacier once they are a few weeks old.
By doing it this way, you keep a readily accessible backup in S3 for the most recent weeks, with much better copy tooling (sorry, Glacier), while older archives benefit from Glacier's lower storage costs. You also retrieve those older items through the S3 interface, which is much friendlier than Glacier's native one.
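For illustration, this is a minimal sketch of such a lifecycle rule; the `backups/` prefix, the 30-day transition window, and the one-year expiration are all assumptions you'd adjust to your retention policy:

```json
{
  "Rules": [
    {
      "ID": "archive-old-backups",
      "Filter": { "Prefix": "backups/" },
      "Status": "Enabled",
      "Transitions": [
        { "Days": 30, "StorageClass": "GLACIER" }
      ],
      "Expiration": { "Days": 365 }
    }
  ]
}
```

You can apply it with `aws s3api put-bucket-lifecycle-configuration --bucket your-bucket --lifecycle-configuration file://lifecycle.json` (bucket name is a placeholder), and from then on every nightly upload ages into Glacier automatically, with no vault management on your side.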