I'm talking about a dataset of hundreds of DBF-format data files in a shared location, some or all of which may be open or locked by multiple network users, with Microsoft's Data Protection Manager replicating the share every two hours.
Can the 'snapshots' in this situation be relied upon?
Personally, I wouldn't count on your snapshots being good. Depending on the application's access pattern with respect to the data files you might get lucky, but I wouldn't bet the farm on it either.
Any time you're taking snapshots of data from an application that isn't aware of the snapshotting (which would be every "shared file database" application) you run the risk of getting an inconsistent copy.
Sure, you'll get exactly what's on the disk at the time of the snapshot, but you have no idea whether one or more users' instances of the application were midway through updating the data. Since there's no server-side database engine, there is no mechanism to instruct all the clients to bring the files they have open to a consistent state. Certainly, the underlying OS on the file server will quiesce I/O to the filesystem before taking the snapshot, but you have no idea what kind of stupidity the applications themselves are doing (holding unwritten data in memory on the client, etc.).
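If you do keep the snapshots, one cheap sanity check you can run against a copy is comparing each DBF file's actual size against the size its own header claims. This is a minimal sketch, assuming the standard dBASE/FoxPro header layout (record count at bytes 4-7, header and record lengths at bytes 8-11); the folder path is just an argument you'd point at the snapshot copy:

```python
# Spot-check DBF files for obvious truncation or half-written appends.
# Assumes the standard dBASE/FoxPro header layout; the folder is hypothetical.
import os
import struct
import sys

def dbf_looks_consistent(path):
    """Compare the size implied by the DBF header against the actual file size."""
    with open(path, "rb") as f:
        header = f.read(12)
    if len(header) < 12:
        return False
    # Bytes 4-7: record count, bytes 8-9: header length, bytes 10-11: record length
    num_records, header_len, record_len = struct.unpack("<IHH", header[4:12])
    expected = header_len + num_records * record_len
    # Many writers append a 0x1A end-of-file marker, so allow one extra byte.
    return os.path.getsize(path) in (expected, expected + 1)

if __name__ == "__main__":
    folder = sys.argv[1] if len(sys.argv) > 1 else "."
    for name in sorted(os.listdir(folder)):
        if name.lower().endswith(".dbf"):
            path = os.path.join(folder, name)
            print(("ok      " if dbf_looks_consistent(path) else "SUSPECT ") + name)
```

That only catches gross truncation or a partially written append; it can't tell you whether related index or memo files are in sync with the tables, which is exactly the kind of inconsistency you can't see from the file server.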
If you really want to be safe, regularly take backups of your DBF files when they're not in use. Take snapshots too, if you want, and you may get lucky, but at least you'll have your "not in use" backups to fall back on if the snapshots turn out to be crap.
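One way to automate that "not in use" test, assuming the backup job runs on the Windows file server itself, is to attempt an exclusive open of each file and only copy it when nothing else has it open. The paths, extensions, and helper name below are hypothetical, and this is a sketch rather than anything battle-tested:

```python
# Copy DBF-family files only when no other process has them open.
# Windows-only: uses CreateFileW with share mode 0 as the "in use" test.
import ctypes
import os
import shutil
from ctypes import wintypes

kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)
kernel32.CreateFileW.restype = wintypes.HANDLE
kernel32.CreateFileW.argtypes = (
    wintypes.LPCWSTR, wintypes.DWORD, wintypes.DWORD, wintypes.LPVOID,
    wintypes.DWORD, wintypes.DWORD, wintypes.HANDLE,
)
kernel32.CloseHandle.argtypes = (wintypes.HANDLE,)

GENERIC_READ = 0x80000000
OPEN_EXISTING = 3
INVALID_HANDLE = ctypes.c_void_p(-1).value

def appears_unlocked(path):
    """Try an exclusive open (share mode 0); it fails if anyone else has the file open."""
    handle = kernel32.CreateFileW(path, GENERIC_READ, 0, None, OPEN_EXISTING, 0, None)
    if handle == INVALID_HANDLE:
        return False
    kernel32.CloseHandle(handle)
    return True

SOURCE = r"D:\data\dbf"    # hypothetical location of the shared DBF files
DEST = r"D:\backups\dbf"   # hypothetical backup target

os.makedirs(DEST, exist_ok=True)
for name in os.listdir(SOURCE):
    if not name.lower().endswith((".dbf", ".cdx", ".fpt")):
        continue
    src = os.path.join(SOURCE, name)
    if appears_unlocked(src):
        shutil.copy2(src, os.path.join(DEST, name))
    else:
        print("skipped (in use): " + name)
```

There's still a race between the check and the copy, and it says nothing about whether related table, index, and memo files are consistent with one another, so an out-of-hours backup window remains the safer option.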
For what it's worth, we use DoubleTake for real-time (or close to real-time) backups of shared flat files.
It's hideously expensive, but it was the only reliable option we found. We had about 20 GB of Pervasive Btrieve files.
Since we made the move to SQL Server, we have very few flat files left, and the ones that remain are unlocked most of the time; we found that simple DFS replication was sufficient to keep them backed up regularly.