I'm looking for a nice way to generate a backup tar.gz of whatever is going to be overwritten by the extraction of another tar.gz.
tar -ztf patch.tar.gz | grep -v "/$" | tar -T- -zcvf backup.tar.gz
This works perfectly! However, sometimes the new patch (patch.tar.gz) contains not only files that already exist on disk, but also new files that don't exist yet. Those files are impossible to back up, so the second tar fails with exit code 2:
tar: folder/file.txt: Cannot stat: No such file or directory
tar: Exiting with failure status due to previous error
I'm looking for a shell command that reads file names on stdin and sends only the existing ones to stdout. Does such a command exist? I would need something like this:
tar -ztf patch.tar.gz | grep -v "/$" | filterfileexist | tar -T- -zcvf backup.tar.gz
I need one shell command and not a shell script.
Obviously, I could implement this shell command in C myself, since it's very simple, but I'm hoping for something generic that is already available on other UNIX platforms.
Try a Perl one-liner for this.
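The one-liner itself is missing from this answer; a minimal sketch, assuming a filter built on Perl's `-e` file-test operator (file names in the fixture are hypothetical, chosen to match the question's example):

```shell
# Demo fixture: build patch.tar.gz, then remove one file so the
# archive lists a path that no longer exists on disk.
cd "$(mktemp -d)"
mkdir folder
echo old > folder/file.txt
echo new > folder/added.txt
tar -zcf patch.tar.gz folder/file.txt folder/added.txt
rm folder/added.txt

# The filter: -n loops over stdin, -l strips and restores newlines,
# and "-e" tests whether the path in $_ exists.
tar -ztf patch.tar.gz | grep -v "/$" | perl -nle 'print if -e' \
  | tar -T- -zcvf backup.tar.gz

tar -ztf backup.tar.gz   # lists only folder/file.txt
```

Perl is present on virtually every UNIX platform, which matches the portability requirement; swap `-e` for `-f` if you want to keep only regular files.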
Here are a couple of alternatives.
tar -ztf patch.tar.gz | grep -v "/$" | xargs -I{} sh -c 'test -f "$1" && echo "$1"' sh {} | tar -T- -zcvf backup.tar.gz
or
tar -ztf patch.tar.gz | grep -v "/$" | tar --ignore-failed-read -T- -zcvf backup.tar.gz
The second example exits with status zero even after attempting to back up a file that does not exist (tested on CentOS 6.5), which avoids the problem.
Use find with xargs. The default behavior of find is to print a file name to stdout if the file exists; non-existing files produce an error message on stderr instead. If there are directories, add -type f or -maxdepth 0 to prevent find's default recursion. The usual whitespace caveat applies, so a safer version should NUL-delimit the file names.
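The code for this answer is missing; a sketch under the assumptions above (GNU find, xargs, and tar; hypothetical file names). Note that find requires its path arguments before the expression, so the names are passed one at a time with `-I{}` rather than appended by plain `xargs`:

```shell
# Demo fixture: archive two files, then delete one of them.
cd "$(mktemp -d)"
mkdir folder
echo keep > folder/file.txt
echo gone > folder/new.txt
tar -zcf patch.tar.gz folder/file.txt folder/new.txt
rm folder/new.txt

# find prints each path that exists; missing paths only produce a
# message on stderr, which we discard. -maxdepth 0 stops recursion
# and -type f filters out directories.
tar -ztf patch.tar.gz | grep -v "/$" \
  | xargs -I{} find {} -maxdepth 0 -type f 2>/dev/null \
  | tar -T- -zcvf backup.tar.gz

# Whitespace-safer variant: NUL-delimit the names end to end
# (tr for xargs -0, find -print0, tar --null).
tar -ztf patch.tar.gz | grep -v "/$" | tr '\n' '\0' \
  | xargs -0 -I{} find {} -maxdepth 0 -type f -print0 2>/dev/null \
  | tar --null -T- -zcvf backup2.tar.gz
```

Even the NUL-delimited variant still breaks on file names containing newlines, because the initial `tar -ztf` listing is newline-separated; that limitation is inherent to this approach.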