I am using Ansible + Vagrant to create my infrastructure, or rather a small simulation of what I want. It installs Postgres and creates an .ssh directory to store the different keys for each host.
Here is my project structure:
.
├── ansible.cfg
├── cluster_hosts
├── group_vars
│   ├── host_master
│   ├── host_pgpool
│   ├── host_slave1
│   └── postgresql
├── roles
│   ├── postgresql
│   │   ├── files
│   │   ├── handlers
│   │   └── tasks
│   │       └── main.yml
│   └── ssh_agent
│       └── tasks
│           └── main.yml
└── site.yml
Here is the cluster_hosts declaration:
host_master ansible_ssh_host=192.168.1.10 ansible_ssh_user=vagrant ansible_ssh_pass=vagrant
host_slave1 ansible_ssh_host=192.168.1.11 ansible_ssh_user=vagrant ansible_ssh_pass=vagrant
host_slave2 ansible_ssh_host=192.168.1.12 ansible_ssh_user=vagrant ansible_ssh_pass=vagrant
host_pgpool ansible_ssh_host=192.168.1.13 ansible_ssh_user=vagrant ansible_ssh_pass=vagrant
[ssh]
host_master
host_pgpool
host_slave1
[pg_pool]
host_pgpool
[database]
host_master
host_pgpool
host_slave1
host_slave2
Here are my group_vars files:

group_vars/host_master:

known_hosts:
  - 192.168.1.11
  - 192.168.1.12

group_vars/host_pgpool:

known_hosts:
  - 192.168.1.11
  - 192.168.1.12

group_vars/host_slave1:

known_hosts:
  - 192.168.1.12
And here is my site.yml:
---
# The main playbook to deploy the cluster

# setup database
- hosts: database
  sudo: True
  tags:
    - setup_db
  roles:
    - postgresql

# setup ssh
- hosts: all
  sudo: True
  tags:
    - setup_ssh
  roles:
    - ssh_agent
And here is the ssh_agent role:
---
- name: Install sshpass
  apt: name={{ item }} state=present
  with_items:
    - sshpass
    - rsync

- name: Create ssh directory
  sudo_user: postgres
  command: mkdir -p /var/lib/postgresql/.ssh/ creates=/var/lib/postgresql/.ssh/

- name: Generate known hosts
  sudo_user: postgres
  shell: ssh-keyscan -t rsa {{ item }} >> /var/lib/postgresql/.ssh/known_hosts
  with_items:
    - "{{ known_hosts }}"

- name: Generate id_rsa key
  sudo_user: postgres
  command: ssh-keygen -t rsa -N "" -C "" -f /var/lib/postgresql/.ssh/id_rsa

- name: Add authorized_keys
  command: sshpass -p postgres ssh-copy-id -i /var/lib/postgresql/.ssh/id_rsa.pub postgres@{{ item }}
  sudo_user: postgres
  with_items:
    - "{{ known_hosts }}"

- name: Owner postgresql
  command: chown postgres:postgres /var/lib/postgresql/.ssh/ -R
OK, now when I run:
ansible-playbook -i cluster_hosts site.yml --tags setup_ssh
I get an error in the "Generate known hosts" task:
PLAY [all] ********************************************************************
GATHERING FACTS ***************************************************************
ok: [host_pgpool]
ok: [host_slave2]
ok: [host_slave1]
ok: [host_master]
TASK: [ssh_agent | Install sshpass] *******************************************
ok: [host_slave1] => (item=sshpass,rsync)
ok: [host_master] => (item=sshpass,rsync)
ok: [host_pgpool] => (item=sshpass,rsync)
ok: [host_slave2] => (item=sshpass,rsync)
TASK: [ssh_agent | Create ssh directory] **************************************
skipping: [host_master]
skipping: [host_slave2]
skipping: [host_slave1]
skipping: [host_pgpool]
TASK: [ssh_agent | Generate known hosts] **************************************
fatal: [host_slave1] => One or more undefined variables: 'known_hosts' is undefined
fatal: [host_master] => One or more undefined variables: 'known_hosts' is undefined
fatal: [host_slave2] => One or more undefined variables: 'known_hosts' is undefined
fatal: [host_pgpool] => One or more undefined variables: 'known_hosts' is undefined
FATAL: all hosts have already failed -- aborting
PLAY RECAP ********************************************************************
to retry, use: --limit @/home/robe/site.retry
host_master : ok=2 changed=0 unreachable=1 failed=0
host_pgpool : ok=2 changed=0 unreachable=1 failed=0
host_slave1 : ok=2 changed=0 unreachable=1 failed=0
host_slave2 : ok=2 changed=0 unreachable=1 failed=0
I don't understand why I get this error, since the variable is declared in the group_vars files (host_master, host_pgpool, host_slave1).
Is my YAML syntax wrong? I thought that might be the problem, but it looks right to me.
By default, Ansible does not read all files in group_vars/; it only reads group_vars/all (or group_vars/all.yml; incidentally, I've found it more convenient to add the .yml extension to vars files). You need to tell it to read the files you want using vars_files in your site.yml, like this:
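As a minimal sketch of what such a vars_files stanza could look like for the setup_ssh play (the file paths here are assumptions based on the layout shown in the question, not tested against it):

```yaml
---
# setup ssh: load each host's variable file explicitly
- hosts: all
  sudo: True
  vars_files:
    # hypothetical path pattern; adjust to match your actual group_vars layout
    - "group_vars/{{ inventory_hostname }}"
  tags:
    - setup_ssh
  roles:
    - ssh_agent
```

Note that with this approach every host in the play needs a matching file, and in the layout above host_slave2 has no group_vars file, so you would either add one or restrict the play to the hosts that do.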