How do I compress every file in a directory into its own tar archive while preserving each file's name?
i.e. file1.out file2.out
-->
file1.out.tar.gz file2.out.tar.gz
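One way to get a separate `.tar.gz` per file is a simple shell loop. This is a minimal sketch; the directory `/tmp/percfile_demo` and the `*.out` file names are just examples standing in for your real files:

```shell
# Set up a throwaway demo directory with two example files.
mkdir -p /tmp/percfile_demo
cd /tmp/percfile_demo
printf 'a\n' > file1.out
printf 'b\n' > file2.out

# One gzipped tar per file, named after the file itself.
for f in *.out; do
    tar czf "$f.tar.gz" "$f"
done
```

After the loop, `file1.out.tar.gz` and `file2.out.tar.gz` sit next to the originals. Note that for single files, `gzip -k file1.out` (producing `file1.out.gz`) would also work without the tar wrapper.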
How can you extract only the target dir and not the complete dir tree?
tar cf /var/www_bak/site.tar /var/www/site
tar xf /var/www_bak/site.tar -C /tmp
This will produce:
/tmp/var/www/site
How can I avoid having the whole directory tree created when the archive is extracted? What I want it to extract to is:
/tmp/site
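GNU tar's `--strip-components=N` drops the first N path components of each member name on extraction. A runnable sketch, using `/tmp/striptest` as a stand-in for the real paths:

```shell
# Recreate the situation: an archive whose members carry var/www/site/... paths.
mkdir -p /tmp/striptest/var/www/site
printf 'hi\n' > /tmp/striptest/var/www/site/index.html
tar cf /tmp/striptest/site.tar -C /tmp/striptest var/www/site

# Strip the two leading components (var, www) while extracting:
mkdir -p /tmp/striptest/out
tar xf /tmp/striptest/site.tar -C /tmp/striptest/out --strip-components=2
# Files land under /tmp/striptest/out/site/ instead of out/var/www/site/.
```

Alternatively, create the archive with `tar cf site.tar -C /var/www site` in the first place, so the member names are already relative (`site/...`) and extract cleanly anywhere.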
I currently have two CentOS servers. What is the quickest way to tar up the images directory and SCP it over?
Is the approach I just suggested the quickest? Tarring is taking forever. I ran the command:
tar cvf imagesbackup.tar images
And I was going to just scp it over.
Let me know if there is a quicker way. I have remote/SSH access to both machines.
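A common trick is to skip the intermediate tarball entirely and stream the archive through SSH, compressing in flight with `-z`. Below is a local demonstration of the pipeline mechanics under `/tmp/pipedemo` (the directory and file are made up for the demo); the commented-out line shows the real transfer form, where `user@remote` and the destination path are placeholders:

```shell
# Demo setup: a source 'images' directory and a destination directory.
mkdir -p /tmp/pipedemo/images /tmp/pipedemo/dest
printf 'x\n' > /tmp/pipedemo/images/pic1.jpg

# Same pipeline as the SSH version, but both ends are local:
tar czf - -C /tmp/pipedemo images | tar xzf - -C /tmp/pipedemo/dest

# The real transfer would be (placeholders: user@remote, /var/www):
#   tar czf - images | ssh user@remote 'tar xzf - -C /var/www'
```

This writes nothing to local disk and overlaps compression, transfer, and extraction. For mostly incompressible data like JPEGs, dropping `-z` (or using `rsync -a images/ user@remote:/var/www/images/`) may well be faster.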
I run this from a cronjob:
tar -czvf /var/backups/svn.tgz /var/svn/*
That generates this on stderr:
tar: Removing leading `/' from member names
I would like to avoid this because it is not a real error (for me!). I want stderr to carry only things I should actually worry about.
How can I suppress that message?
I have the feeling that it is a matter of using the tar -C option but I am not sure and I don't know how.
Thanks for the help,
Dan
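The hunch about `-C` is right: the message appears because the member names start with `/`, and GNU tar strips the leading slash for safety. If you `-C` into the parent directory and archive a relative name, there is nothing to strip and stderr stays clean. A sketch using `/tmp/svndemo` in place of `/var/svn` (the cron line would become `tar -czf /var/backups/svn.tgz -C /var svn`):

```shell
# Demo stand-in for /var/svn:
mkdir -p /tmp/svndemo/repo
printf 'r\n' > /tmp/svndemo/repo/format

# -C switches directory first, so member names are relative (repo/...)
# and tar emits no "Removing leading `/'" warning:
tar -czf /tmp/svndemo/svn.tgz -C /tmp/svndemo repo 2>/tmp/svndemo/err.log
```

Redirecting with `2>/dev/null` would also hide the message, but it would hide genuine errors too, which defeats the point of watching stderr from a cron job. Dropping `-v` is also worth considering, since the verbose file list is noise in a cron log.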
Usually after dumping a MySQL database with the mysqldump command, I immediately tar/gzip the resulting file. I'm looking for a way to do this in one command:
So from this:
mysqldump dbname -u root -p > dbname.sql
tar czvf dbname.sql.tgz dbname.sql
rm dbname.sql
To something like this:
mysqldump dbname -u root -p > some wizardry > dbname.sql.tgz
Or even better (since I'm usually scp'ing the dump file to another server):
mysqldump dbname -u root -p > send dbname.sql.tgz to user@host
I'm running bash on Debian.
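Since mysqldump writes to stdout, you can pipe it straight through `gzip`; tar adds nothing for a single file. The sketch below substitutes a stub `dump` function for the real `mysqldump dbname -u root -p` so the pipeline can run anywhere; the SQL it emits is invented for the demo:

```shell
# Stub standing in for: mysqldump dbname -u root -p
dump() { printf 'CREATE TABLE t (id INT);\n'; }

# Compress in flight, no intermediate .sql file:
dump | gzip > /tmp/dbname.sql.gz

# Round-trip check: decompress to stdout.
gunzip -c /tmp/dbname.sql.gz
```

So the one-liner would be `mysqldump dbname -u root -p | gzip > dbname.sql.gz`. For the remote case, you can extend the pipe over SSH, e.g. `mysqldump dbname -u root -p | gzip | ssh user@host 'cat > dbname.sql.gz'` (here `user@host` and the remote path are placeholders).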