You could create MD5 sums of every single file, order these checksums alphabetically, and hash them (with or without newlines). Since MD5 is cryptographic, it should work just fine with hashes of hashes.
There has to be a consistent order to things, otherwise you will get different results for equal dirs.
And you should consider that adding some file to one dir will completely change the result, even if it was just a .directory or .DS_Store file.
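The recipe above can be sketched as a shell pipeline (a minimal sketch, assuming GNU coreutils and that only regular-file contents matter):

```shell
# hash every file, keep only the checksum column, sort it for a stable
# order, and hash the sorted list; the result is independent of file names
find . -type f -exec md5sum {} + | awk '{print $1}' | sort | md5sum
```

Because only the sorted checksums are hashed, renaming a file does not change the result, but adding or modifying any file does.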
As a specific case, let's say you want to copy some files from directory1 to directory2 and then you want to verify a successful copy using an md5 comparison.
First, cd to directory1 and type:
find . -type f -exec md5sum "{}" \; > ~/Desktop/md5sum.txt
which will create a reference file containing an md5 sum for each file in directory1. Once this is done, all you have to do is cd to directory2 and type:
md5sum -c ~/Desktop/md5sum.txt
The program md5sum fetches each path from the md5sum.txt file, computes the md5sum of that file in the destination folder and then compares it with the sum it has stored in the file.
After the process is complete, you will get a summary along the lines of 'N computed checksums did NOT match'.
I've had a need to verify the integrity of backups/mirrors which contain a large number of files and ended up writing a command-line program called MassHash. It's written in Python. A GTK+ launcher is also available. You may want to check it out: http://code.google.com/p/masshash/
This one-liner lists all files and directories and gets the md5sum for each, then gets the md5sum of everything.
The tricky bit solved here is that md5sum is not capable of computing the sum for a directory, but it does tell us so: md5sum: dir/sub_dir: Is a directory. We just move this message to standard output, so it becomes part of the hashed input.
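The original one-liner is not shown here; a sketch of the idea (sorting for a stable order, and 2>&1 to fold the "Is a directory" messages into the hashed stream) might look like this, assuming GNU find and xargs:

```shell
# hash every file and directory entry; the per-directory error messages
# are redirected into the stream, so they are hashed along with the sums
find . | sort | xargs -d '\n' md5sum 2>&1 | md5sum
```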
Sure -
md5sum directory/*
If you need something a little more flexible (say, for directory recursion or hash comparison), try md5deep.
To compare a directory structure, you can give it a list of hashes to compare against:
This will output all of the files in directory2 that do not match directory1.
This will not show files that have been removed from directory1 or files that have been added to directory2.
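The exact commands aren't shown above; a sketch using md5deep's recursive (-r) and negative-matching (-x) modes, per its manual, could look like this (the file name dir1hashes.txt is an assumption):

```shell
# record the hashes of everything under directory1
md5deep -r directory1 > dir1hashes.txt
# list files under directory2 whose hashes do NOT appear in that list
md5deep -r -x dir1hashes.txt directory2
```

Since -x matches on hashes alone, removed files and identical added copies go unreported, as noted above.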
If you'd like to see what's different (if anything) between two directories, rsync would be a good fit.
This will list any files that are different.
I think I answered this one before with this answer:
gives:
b1a5b654afee985d5daccd42d41e19b2877d66b1
The idea is you hash all the files, cut out the hashes (one per line), sort them, and hash that, yielding a single hash. This doesn't depend on the names of the files.
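The command itself isn't shown above; judging by the 40-character output it used sha1sum. A pipeline in that spirit (a sketch only, not guaranteed to be the original command or to reproduce the hash above) would be:

```shell
# hash each file, keep only the hash column, sort, and hash the result
find . -type f -print0 | xargs -0 sha1sum | cut -d ' ' -f 1 | sort | sha1sum
```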
The cfv application is quite useful: not only can it check and create MD5 checksums, it can also do CRC32, sha1, torrent, par, and par2.
to create a CRC32 checksum file for all files in current directory:
to create a MD5 checksum file for all files in current directory:
To create a separate checksum file for each sub directory:
To create a "super" checksum file containing files in all sub directories:
I used hashdeep, as explained in this askubuntu answer: Check the correctness of copied files:
To calculate the checksums:
To verify and list the differences:
This has an advantage over md5deep in that it will show renamed (moved), added, and removed files, as well as avoiding the problem with 0-length files pointed out at the bottom of http://www.meridiandiscovery.com/how-to/validating-copy-results-using-md5deep.
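The hashdeep commands aren't reproduced above; following the pattern of the linked askubuntu answer, they might look like this (treat the exact flags as assumptions: -r recurse, -l use relative paths, -a audit, -k known-hashes file, -vv list each discrepancy):

```shell
# in directory1: record hashes with relative paths
hashdeep -r -l . > ~/hashes.txt
# in directory2: audit the copy against the recorded set
hashdeep -r -l -a -vv -k ~/hashes.txt .
```

In audit mode hashdeep exits nonzero when the trees differ, which makes it easy to script.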
This worked for me: (run it while in the directory you are interested in)