I have a large folder that I am replicating via DFS, and I want to check that all files have been replicated correctly.
Currently I am running the following script at both ends.
cd e:\data\shared\
dir /a:-h /b /s > e:\data\shared\result.txt
and then using a text editor to tidy the file before using a diff tool to compare them.
Does anyone know a better way of doing this? Failing that, does anyone know how to adapt my script to ignore all the files in the DfsrPrivate folders?
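One way to skip the DfsrPrivate folders without hand-editing the listings is to filter and compare the two result.txt files in a short script. Below is a minimal sketch in Python; the function names and the root-path arguments are my own placeholders, and it assumes DFSR's working data lives under a folder literally named DfsrPrivate inside each replicated folder:

```python
# Compare two "dir /a:-h /b /s" listings, ignoring anything under DfsrPrivate.

def normalize(lines, root):
    """Strip the local root prefix and drop DfsrPrivate entries,
    so listings taken on different servers become comparable."""
    cleaned = set()
    for line in lines:
        path = line.strip()
        if not path:
            continue
        low = path.lower()
        # Skip DFSR's private working folders and their contents.
        if "\\dfsrprivate\\" in low or low.endswith("\\dfsrprivate"):
            continue
        if low.startswith(root.lower()):
            path = path[len(root):]
        cleaned.add(path.lower())  # NTFS path comparison is case-insensitive
    return cleaned

def compare_listings(lines_a, root_a, lines_b, root_b):
    """Return (only_on_a, only_on_b) as sorted lists of relative paths."""
    a = normalize(lines_a, root_a)
    b = normalize(lines_b, root_b)
    return sorted(a - b), sorted(b - a)
```

Feed it the contents of each server's result.txt; any output means a file exists on only one side. Alternatively, piping the listing through `findstr /v /i DfsrPrivate` on each server avoids the manual tidy-up step entirely.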
Why not use the built-in DFS diagnostic reporting?
I've set up scheduled email reports using a script similar to the one here in the past.
edit: Since Brent mentioned the PowerShell angle, there is also this script out there. I haven't used it, and I don't have DFS in my test environment to see what it does, but it may be worth looking at.
Is there actually a concern that the files are not replicating?
When we set up DFS to replicate about 30 GB across a trans-Atlantic link, we didn't go through and check each file; instead we sampled the content through our tests to ensure that the system behaved the way we wanted it to.
Essentially, we took a single folder of around 1 GB of content (mostly PDFs) and replicated it. Once we confirmed that our smaller subset of data had replicated, we were comfortable replicating the rest without any major issue.
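That sampling approach can also be automated rather than eyeballed: pick a random subset of files on one replica and compare content hashes against the other. A hedged sketch, assuming both replicas are reachable from one machine (e.g. via UNC paths); the function name, sample size, and paths are illustrative placeholders:

```python
import hashlib
import os
import random

def sha256_of(path, chunk=1 << 20):
    """Hash a file in chunks so large files aren't loaded into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk)
            if not block:
                break
            h.update(block)
    return h.hexdigest()

def sample_and_verify(root_a, root_b, sample_size=100, seed=None):
    """Sample files under root_a and compare their hashes with root_b.
    Returns a list of relative paths that are missing or differ."""
    all_files = []
    for dirpath, _, names in os.walk(root_a):
        for name in names:
            rel = os.path.relpath(os.path.join(dirpath, name), root_a)
            all_files.append(rel)
    rng = random.Random(seed)
    picked = rng.sample(all_files, min(sample_size, len(all_files)))
    mismatches = []
    for rel in picked:
        other = os.path.join(root_b, rel)
        if not os.path.exists(other) or \
                sha256_of(os.path.join(root_a, rel)) != sha256_of(other):
            mismatches.append(rel)
    return mismatches
```

An empty result from a reasonably sized sample gives decent confidence without hashing all 5 million objects; hashing everything would take roughly as long as re-copying the data.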
It would have taken forever to diff the contents, let alone run the listing command in the first place, against close to 5 million objects.