I'm trying to tar up a directory that has about 3 million tiny files in it. Tar is chugging along, but I'm thinking it's going to take longer than I can wait.
I'm wondering if telling tar not to store metadata (owner, group, permissions) would reduce the churn of reading and re-reading this huge directory and maybe speed things up, and whether there is a tar switch that does this.
My initial perusal of the man page only gets me something like --no-xattrs, which looks like a start, but I was hoping someone had some specific knowledge.
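For reference, this is the kind of invocation I've been experimenting with (GNU tar assumed; --no-xattrs, --no-acls, --owner, --group and --mode are all real GNU tar options, and hugedir/ is just a placeholder for my directory — I haven't measured whether dropping this metadata actually helps the read side):

    # skip xattrs/ACLs and force constant owner/group/mode instead of per-file values
    tar --no-xattrs --no-acls --owner=0 --group=0 --mode='a+rwX' -cf output.tar hugedir/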
I've got to think that the performance of the underlying filesystem probably has a lot to do with what you're getting out of tar.
What is the filesystem you're reading from, and how tiny are the files?
You might try adding mbuffer between your tar and your output file. I just learned about it, and it did wonders for my transfer.
e.g. tar -cf - dir | mbuffer > output.tar
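A slightly fuller sketch of what I mean, with an explicit buffer size (the 1G value is just illustrative, size it to the RAM you can spare; mbuffer's -m option sets the total buffer size):

    tar -cf - hugedir/ | mbuffer -m 1G > output.tar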
Might not help if the slowness really is a result of disk or filesystem limitations.