I've got a server with a LOT of small files -- many millions of files, and over 1.5 TB of data. I need a decent backup strategy.
Any filesystem-based backup takes too long -- just enumerating which files need to be copied takes a day.
Acronis can do a disk image in 24 hours, but fails when it tries to do a differential backup the next day.
DFS-R won't replicate a volume with this many files.
I'm starting to look at Double-Take, which seems to be able to do continuous replication. Are there other solutions that can do continuous replication at a block or sector level -- not file-by-file -- over a WAN?
Some details:
The files are split up into about 75,000 directories.
99% of the daily change comes from adding new directories; existing files are rarely changed.
There's some other relevant discussion here.
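Since 99% of the daily change is new directories, one way to sidestep the day-long file enumeration is to track changes at the directory level instead of the file level. This is a minimal sketch of that idea (the paths and the pickle-based manifest are illustrative, not any particular product's mechanism):

```python
import os
import pickle
from pathlib import Path

def new_directories(root: Path, state_file: Path) -> set[str]:
    """Return names of directories under root not seen on the previous run.

    Listing ~75,000 directory names is cheap compared with stat-ing
    millions of files, so this avoids the full-tree enumeration.
    """
    # Manifest of directory names from the last run (empty on first run).
    known = pickle.loads(state_file.read_bytes()) if state_file.exists() else set()

    # One scandir of the top level; no descent into unchanged directories.
    current = {entry.name for entry in os.scandir(root) if entry.is_dir()}

    # Save the new manifest for the next run.
    state_file.write_bytes(pickle.dumps(current))

    # Only these directories need to be handed to the copy tool.
    return current - known
```

You'd feed the returned names to whatever actually copies the data (robocopy, rsync, etc.); the point is that the change detection itself stays proportional to the directory count, not the file count.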
1.5 TB would take 30 minutes with http://www.exdupe.com/ :p
... given that your disks are fast enough (exdupe is so fast that it's IO bound, not CPU bound).
And I haven't reached any limits on file count yet. Had millions too.
Edit: Ah, you need a partition/sector-based backup and not file-system level? It can't do that... Maybe http://www.drivesnapshot.de/en/ is worth a shot. It does differential backups and shadow volume copies too (no reboot).
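For reference, the core of a block-level differential backup like the one described above is just hashing fixed-size blocks of the volume and comparing against the previous run's hashes. This is a simplified sketch of that technique, not drivesnapshot's actual implementation (a real tool reads the raw device under a VSS snapshot; here `device` is just any file-like path):

```python
import hashlib
from pathlib import Path

def changed_blocks(device: Path, old_hashes: list[bytes],
                   block: int = 4 * 1024 * 1024):
    """Return (changed_offsets, new_hashes) for a device image.

    Reads the image sequentially in fixed-size blocks, so the scan is
    IO-bound regardless of how many files live on the volume -- which
    is why sector-level tools dodge the per-file enumeration cost.
    """
    new_hashes, changed = [], []
    with device.open("rb") as f:
        while data := f.read(block):
            digest = hashlib.sha256(data).digest()
            i = len(new_hashes)
            # A block is "changed" if it's new or its hash differs.
            if i >= len(old_hashes) or old_hashes[i] != digest:
                changed.append(i * block)
            new_hashes.append(digest)
    return changed, new_hashes
```

The first run (empty `old_hashes`) marks every block changed, i.e. a full image; subsequent runs only flag blocks whose hashes moved, which is the differential.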
Check out ShadowProtect. It's not continuous, but it can be set to do an incremental every 15 minutes. It's pretty awesome software. Add on the ImageManager Enterprise component and it also gives you some great replication abilities for offsite backups.
I've had great luck with Double-Take, despite the price. Their "Move" product might fit the budget though...
See my answer to a similar question here.