I feel a little silly asking what would seem to be a Google-able question, but I'm trying to script out a repetitive task of (1) ssh'ing into a remote server, (2) running script.sh in my home directory, and (3) copy/pasting the output.
It would be great if I could write a script to do this work for me. I've written some bash scripts that scp files from these machines, but never one that ran scripts on these machines.
Is it possible to have a script on machine 1 log in to machine 2 through machine n, execute script.sh on each, and dump the output back on machine 1? If so, how?
That's an easy one:

    ssh user@host command

Here, command is whatever you want run on the remote machine; elaborate it with piping and such as needed.
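For the question's exact workflow, a minimal sketch (hypothetical user and host names). Note that the redirection is processed by your local shell, so the output lands on machine 1:

    # run script.sh from the remote home directory; capture its output locally
    ssh user@machine2 './script.sh' > machine2-output.txt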
Similar to Jeffrey's answer, but with a one-liner loop:
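A minimal sketch of such a loop, assuming script.sh already exists on each host (hostnames and the output file naming are placeholders):

    # launch script.sh on every host in parallel; each host's output is saved locally
    for host in machine2 machine3 machine4; do ssh user@"$host" './script.sh' > "output-$host.txt" & done
    wait  # optional: block until all the backgrounded ssh jobs finish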
This lets you spawn each task backgrounded, so the whole command will just take a moment from your workstation. If you're using password-based authentication, you'll need to either set up public key authentication first or use something like expect to answer the password prompts (the next answer covers both).
So I think the real question here is how you get the script from machine 1 to machine 2 so you can execute it, right? There are a number of ways to do that. You could use a script on machine 1 that first uses scp to transfer the script file, script, to machine 2, and then runs ssh machine2 script.
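A minimal sketch of that two-step approach (user, host, and filenames are placeholders; it assumes key-based auth so neither command prompts for a password):

    # copy the script into machine2's home directory, then run it and capture the output locally
    scp script.sh user@machine2:
    ssh user@machine2 'bash ~/script.sh' > machine2-output.txt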
That works well if you have public key auth all set up consistently, so you don't get prompted for passwords. If you need to type your password each time you log in to machine2, then you need to look at approaches using expect, as detailed in this Stack Overflow posting. You should be able to use autoexpect to write the necessary expect script quickly; a sketch of what it produces is below.

Other approaches include NFS-mounting a shared directory on all the hosts. In that scenario, you copy the script file to that directory on machine 1, and it's then available on machine 2 and all the other machines connected to that same NFS mount.
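For the expect route, the generated script would look something like this hypothetical sketch (embedding a password in a file is a security trade-off, which is why key-based auth is usually preferable):

    #!/usr/bin/expect -f
    # answer ssh's password prompt automatically, then let the remote script run to completion
    spawn ssh user@machine2 ./script.sh
    expect "assword:"
    send "yourpassword\r"
    expect eof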
Finally, this is a problem that configuration management tools like cfengine and puppet solve with some sort of file-delivery mechanism. In that setup, client machines connect to a central server and download the script file before executing it.