I have a 150+ GB FILESTREAM-enabled database that is growing steadily at 10+ GB per week, and I am curious whether anyone has recommendations or experience regarding compression ratios for database backups.
I tried Hyperbac, third-party software I have worked with before, but was greatly disappointed in the compression ratio it achieved for the FILESTREAM data, which makes up most of the backup.
Sincerely, Sean Fitzgerald
Compression ratios depend heavily on the data contents. If you're storing things like compressed audio (MP3 or similar), video, or images, chances are you'll get pretty poor compression ratios in your backups - the data is already compressed "natively".
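You can see this effect with a quick test - a minimal sketch, assuming SQL Server 2016 or later for the built-in COMPRESS() function (which uses GZIP): repetitive text shrinks dramatically on the first pass, but feeding the already-compressed bytes through COMPRESS() again gains nothing.

```sql
-- Highly repetitive text: compresses very well on the first pass.
DECLARE @raw VARBINARY(MAX) =
    CAST(REPLICATE(CAST('The quick brown fox jumps over the lazy dog. ' AS VARCHAR(MAX)), 10000) AS VARBINARY(MAX));

DECLARE @once  VARBINARY(MAX) = COMPRESS(@raw);   -- first pass (GZIP)
DECLARE @twice VARBINARY(MAX) = COMPRESS(@once);  -- second pass over already-compressed bytes

SELECT
    DATALENGTH(@raw)   AS raw_bytes,
    DATALENGTH(@once)  AS compressed_once_bytes,   -- large reduction
    DATALENGTH(@twice) AS compressed_twice_bytes;  -- no further gain, often slightly larger
```

Your FILESTREAM blobs are in the same position as @twice if the files themselves are already stored in a compressed format.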
"Normal" database backups often have relatively high compression ratios (comparing datafile size to backup file size) because the database's storage engine is optimized for speed rather than space. That means there's a lot of "wasted" space in the datafiles, so it compresses well.
If you had a database containing only blobs of already-compressed data, the backup compression ratio would also be very poor.
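To see what ratio you're actually getting, it's worth comparing the two sizes SQL Server records for each backup. A minimal sketch using native backup compression rather than Hyperbac - it assumes an edition that supports it (2008 Enterprise, or Standard from 2008 R2 onwards), and the database name and backup path are placeholders:

```sql
-- Take a natively compressed backup; FILESTREAM data is included in the backup set.
BACKUP DATABASE YourFilestreamDb
TO DISK = N'X:\Backups\YourFilestreamDb.bak'
WITH COMPRESSION, STATS = 10;

-- Compare uncompressed vs. compressed sizes recorded for recent backups.
SELECT TOP (10)
    bs.database_name,
    bs.backup_finish_date,
    bs.backup_size / 1048576.0            AS backup_mb,
    bs.compressed_backup_size / 1048576.0 AS compressed_mb,
    bs.backup_size * 1.0 / NULLIF(bs.compressed_backup_size, 0) AS compression_ratio
FROM msdb.dbo.backupset AS bs
WHERE bs.database_name = N'YourFilestreamDb'
ORDER BY bs.backup_finish_date DESC;
```

If the ratio stays close to 1.0, the FILESTREAM payload itself is already compressed, and no backup tool - native or third-party - is going to squeeze much more out of it.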