I have what seems to be an ever-growing number of EC2 instances, and everything is running fine and dandy. The one problem I'm facing, however, is figuring out a strategy for SSHing between the machines. Copying my private key to every instance is counterproductive: it works fine when I need to SSH in from my personal machine, but not when I need to SSH from machine to machine.
What are some decent strategies to tackle this problem? How are you SSHing between the instances in your EC2 cluster?
You use ssh-agent with agent forwarding (ssh -A), so your private key stays on your personal machine and never has to be copied onto the instances. For easier use, add ForwardAgent yes for your EC2 hosts to your ~/.ssh/config.
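A minimal sketch of what that might look like, assuming a key pair at ~/.ssh/ec2-key.pem and instances reached as ec2-user (adjust the names to your setup):

```
# ~/.ssh/config -- forward the local agent to EC2 hosts
Host *.compute.amazonaws.com
    User ec2-user
    IdentityFile ~/.ssh/ec2-key.pem
    ForwardAgent yes
```

Then load the key once and hop between instances without the key ever leaving your machine:

```
# add the key to the local agent
ssh-add ~/.ssh/ec2-key.pem

# log in to the first instance; the agent is forwarded along
ssh host-a.compute.amazonaws.com

# from host-a, the forwarded agent answers the challenge for host-b
ssh host-b.compute.amazonaws.com
```

Only forward the agent to hosts you trust, since root on a forwarded-to host can use your agent while you are connected.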
Well, given that all my EC2 instances are started with the same SSH key for root, I only need to load one key into the SSH agent. That said, I don't typically SSH in as root as a matter of policy, so my EC2 instances fire up and connect to a Puppet server, which then configures them: installing any applications, setting up user accounts and their respective SSH identity keys, and establishing sudo permissions. Then I just load my personal SSH identity into ssh-agent on my laptop from its LUKS-encrypted USB drive. The root SSH identity key put in place during EC2 instance initialization is then just there as a backup.
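Roughly what that looks like on the laptop side; device names, mount points and the user name here are illustrative:

```
# decrypt and mount the LUKS USB drive (device/mount names are hypothetical)
sudo cryptsetup luksOpen /dev/sdb1 keys
sudo mount /dev/mapper/keys /media/keys

# load the personal identity into the agent for a limited lifetime
ssh-add -t 8h /media/keys/id_rsa

# connect as the regular user that Puppet provisioned, with agent forwarding
ssh -A myuser@instance.example.com
```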
The other advantage of using Puppet is that I can use the Puppet server as a jump box into any of the instances, and have Puppet update each system's ssh_known_hosts and /etc/hosts files automatically for me.
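One way to wire up that jump-box arrangement on the client side; the host names are placeholders, with the Puppet server doubling as the bastion:

```
# ~/.ssh/config -- reach any instance through the Puppet server
Host puppet
    HostName puppet.example.com
    User myuser

Host *.internal.example.com
    User myuser
    ProxyJump puppet
    ForwardAgent yes
```

ProxyJump needs OpenSSH 7.3 or newer; on older clients the equivalent is ProxyCommand ssh -W %h:%p puppet.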