I have a big folder with hundreds of thousands of files; the total size is 10 GB.
The idea is that I want to archive this folder so I can transfer it to another server faster than file by file.
I tried tar cfj archive.tar.bz2 dir-to-be-archived/
but it still seems to be slow.
I don't care about the final size, because the network connection can transfer at up to 100 Mbit/s.
I heard something about lzop,
but I didn't understand exactly how to use it.
So, is there any way to create the archive really fast, say in 10-15 minutes?
With hundreds of thousands of files in the folder, it will take some time to pack them all up even without compression; bzip2 in particular is slow, so dropping the j flag alone should speed things up. If the target machine runs Linux as well, you may be able to use rsync to transfer the whole folder with its files as is (https://help.ubuntu.com/community/rsync).
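A few concrete options, sketched below. The directory name matches your example; the remote user, host, and destination path are placeholders you would substitute:

```shell
# Source directory from the question; remote names below are placeholders.
SRC=dir-to-be-archived

# 1) Plain tar with no compression: usually limited by disk speed, not CPU.
tar cf archive.tar "$SRC"

# 2) lzop: very fast, modest compression ratio. GNU tar can invoke it
#    via --use-compress-program (requires lzop to be installed).
tar --use-compress-program=lzop -cf archive.tar.lzo "$SRC"

# 3) Or skip the archive step and copy directly with rsync
#    (-a preserves attributes and recurses, -z compresses on the wire).
rsync -az "$SRC"/ user@other-server:/destination/path/
```

On a fast link, option 1 or 3 is often quickest overall, since compression time can exceed the transfer time it saves.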