I have 2 sets of data in separate sites that are mirrors of each other; the 2nd set is a clone of the 1st. However, the 1st set will be the live data and will therefore be changing going forward. We need to set up DFS between the two, with the second set acting as receive-only. This is fine, we can do that. My query is: when I initiate DFS, is it going to try and re-copy everything over again, or will it realise that most of the data already exists and is the same, and therefore doesn't need to be copied or changed? Thanks
If your data truly IS identical, it will not. If there are slight differences, it will. DFS Replication calculates a hash of each file to determine whether it needs to be replicated.
You can use the dfsrdiag tool to check a few files on either side and confirm they produce the same hash. If they do, DFS will not try to re-copy that data. Keep in mind that the hash covers more than just the file contents: NTFS security permissions (ACLs) and alternate data streams are part of the calculation, so a difference in permissions between the two copies will cause a mismatch (and a re-copy) even when the file data itself is identical.
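For example, you could run dfsrdiag filehash against the same file on both servers and compare the results. The paths and hash value below are made up for illustration, and the exact switch name can vary between Windows versions, so run dfsrdiag filehash /? on your server to confirm the syntax:

    REM On the source server (live data)
    C:\> dfsrdiag filehash /filepath:"D:\Data\Reports\Q1.xlsx"
    File Hash: 0CBC6611-F5540BD0-809A388D-C95A615B
    Operation Succeeded

    REM On the target server (receive-only copy)
    C:\> dfsrdiag filehash /filepath:"E:\Data\Reports\Q1.xlsx"
    File Hash: 0CBC6611-F5540BD0-809A388D-C95A615B
    Operation Succeeded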
In this example both files came out with the same hash, so they are truly identical and DFS will not re-copy them.