I am rsyncing a few directories. I have a bash terminal open and am executing something like this:
for DIR in * ; do rsync -a $DIR example.com:somewhere/ ; done
However, if I want to stop the whole thing, I press Control-C. That stops the current rsync, but the loop just carries on to the next one. Once I realize what has happened, I press Control-C like a madman until things work again.
Is there some way to 'fix' this? I want it so that if I have a loop like that and press Control-C, it returns me to my bash shell.
You can break out of the loop as soon as an rsync run returns a non-zero exit status, which is what happens when you interrupt it with Control-C. This will also exit the loop if an individual rsync run fails for some reason.
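A sketch of that approach ("$DIR" is quoted here in case of awkward directory names):

    for DIR in * ; do
        # rsync exits non-zero when interrupted by Control-C (or on any error),
        # and || break then leaves the loop instead of starting the next copy
        rsync -a "$DIR" example.com:somewhere/ || break
    done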
To expand on Dennis' answer, your code might look like:
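(a sketch of a small script that combines a trap, as in Dennis' answer, with your original loop; the echoed message is only an example)

    #!/bin/bash
    # when Control-C (SIGINT) or SIGTERM arrives, print a message and end the script
    trap "echo Interrupted; exit" SIGINT SIGTERM

    for DIR in * ; do
        rsync -a "$DIR" example.com:somewhere/
    done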
For a working example (that happens to involve rsync), check out http://gist.github.com/279849.
You can set a trap for Control-C.
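For instance (the echo-and-exit here is only an example command):

    trap "echo Interrupted; exit" SIGINT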
will execute the command when Control-C is pressed. Just put the trap statement somewhere in your script at a point where you want it to become effective.

Ctrl-Z to suspend the script; kill %% to kill the suspended job.
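Put differently (assuming the loop is the current foreground job):

    # 1. press Ctrl-Z to suspend the running loop or script
    # 2. then, back at the prompt, kill the suspended job:
    kill %%        # %% refers to the current (most recently suspended) job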
Credits, explanations and more details in this answer.
When you put a string of commands inside parentheses, they run as a single process (a subshell), which receives the SIGINT and terminates when you hit Ctrl-C:
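(your loop, just wrapped in parentheses)

    ( for DIR in * ; do rsync -a $DIR example.com:somewhere/ ; done )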
But! In the case of the rsync command, it allows multiple sources, so the code you wrote would be better written as:

    rsync -a * example.com:somewhere/

I tend to put another command in my loop that can easily be interrupted. It requires two Ctrl-C's to be pressed.
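For example (a sketch; the one-second sleep is arbitrary, and any easily interrupted command would do):

    for DIR in * ; do
        sleep 1                                   # sleep dies on SIGINT, so Ctrl-C here ends the whole loop
        rsync -a "$DIR" example.com:somewhere/    # rsync handles SIGINT itself, so Ctrl-C here only stops this copy
    done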
It's not such a great solution for this rsync, which you probably want to run quickly. But it does work well for other loops, like this one:
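(a sketch; host and the five-second sleep are just examples of a lookup plus an easily interrupted pause)

    while true ; do
        host example.com    # does a fresh DNS lookup on every pass
        sleep 5             # Ctrl-C during the sleep stops the loop
    done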
This loop looks up the address of example.com afresh every time through, which is useful if you're watching for a DNS change.