I know that for ec2.py I can either specify environment variables via export before calling ec2.py, or use a boto config file with plain-text passwords (or Python keyring). As I have the AWS key and secret in Ansible Vault anyway, is there a way to auto-export them from the vault, or any other means of passing the values to ec2.py, instead of having to specify them again?
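For context, the environment-variable variant I mean looks roughly like this (the key values here are just placeholders):

```shell
# Export the keys in the calling shell so boto can pick them up:
export AWS_ACCESS_KEY_ID='AKIAEXAMPLE'
export AWS_SECRET_ACCESS_KEY='wJalrEXAMPLESECRET'
# ./ec2.py --list    # boto reads the keys from the environment
```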
Well, you could write a simple task that dumps the keys from the vault into the boto3 configuration, using a template such as credentials.j2, where aws_access_key_id and aws_secret_access_key are stored in a vault. The task would then need to be run against the Ansible control host (the host that executes ansible-playbook), and the keys would then sit unencrypted on that host. IMHO (I could be wrong here) you need to supply plain AWS keys to boto either via environment variables (the export command) or via the boto configuration. Ansible makes its API calls to AWS via boto, but boto is not part of Ansible, so there is no native way to use parameters defined in Ansible inside boto; that functionality would have to be part of boto.
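A minimal sketch of that task, assuming the two keys live in an encrypted vars file called vault.yml (the filename and paths are my assumptions; here the template content is inlined via the copy module instead of a separate credentials.j2):

```yaml
# Run against the Ansible control host itself.
- hosts: localhost
  connection: local
  vars_files:
    - vault.yml            # encrypted with ansible-vault; defines the two keys below
  tasks:
    - name: Write boto credentials from the vault
      copy:
        dest: ~/.aws/credentials
        mode: "0600"       # keep the plain-text keys readable only by this user
        content: |
          [default]
          aws_access_key_id = {{ aws_access_key_id }}
          aws_secret_access_key = {{ aws_secret_access_key }}
```

Again, note the trade-off: after this play runs, the keys are stored unencrypted in ~/.aws/credentials on the control host.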
As you mentioned you're running Ansible on an EC2 instance, you actually shouldn't use credentials at all, but rather a role attached to the EC2 instance: http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/iam-roles-for-amazon-ec2.html
The idea is that the instance itself can obtain temporary credentials and execute the actions permitted by the role. Since you never store any credentials anywhere, this is the most secure way to work with the AWS API from an EC2 instance. As Ansible relies on boto, this works out of the box: just create a role with all the necessary IAM permissions and attach it to the instance you're running Ansible on. After that, your dynamic inventory will work without any additional credentials.
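A sketch of what the role's policy might contain — the dynamic inventory mainly needs read access to EC2, though the exact actions depend on which features of ec2.py you use:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["ec2:Describe*"],
      "Resource": "*"
    }
  ]
}
```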