To build on @SvenW's answer (which only works on the current directory): if you have a HUGE number of files, or want to do it recursively over a directory tree, you can also use
find . -type f -exec gzip {} \;
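If the number of files really is huge, it should be somewhat faster to let find batch the arguments instead of spawning one gzip process per file. The + terminator is standard find syntax, and gzip happily takes multiple file names:
find . -type f -exec gzip {} +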
If you need to put the output into a different directory (in this example, ../target) and don't want to remove the originals, you can do something like:
find . -type f -print | while IFS= read -r fname ; do
    # recreate the source directory layout under ../target
    mkdir -p "../target/$(dirname "$fname")"
    # compress to the mirrored path, leaving the original in place
    gzip -c "$fname" > "../target/$fname.gz"
done
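As a quick way to convince yourself it does the right thing, here is a throwaway test (src, target and app.log are just made-up placeholder names):
mkdir -p src/logs target
echo test > src/logs/app.log
cd src
find . -type f -print | while IFS= read -r fname ; do
    mkdir -p "../target/$(dirname "$fname")"
    gzip -c "$fname" > "../target/$fname.gz"
done
ls ../target/logs    # -> app.log.gz
Note that the read loop will choke on file names containing newlines; for the usual sort of file names it's fine.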
Putting every file into a separate tar file doesn't make any sense in this scenario. You can use gzip to compress them directly:
gzip *.out
will result in file1.out.gz, file2.out.gz etc. You would use tar only if you needed a compressed archive as a single file. If you indeed need a tar archive for every file, you can create it like so:
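Something along these lines should do it (a sketch; the *.out pattern just matches the example files above, adjust it to taste):
for f in *.out; do
    tar czf "$f.tar.gz" "$f"    # one compressed archive per input file
done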