This is all running within AWX, which is hosted on-prem, and I'm using it to manage some EC2 instances in AWS. I've set up the bastion jump host and can get all my other plays to work correctly.
However, there is one simple job template I want to provide to a few devs. Essentially, when they make a change to the code, it lets them clear opcache and invalidate the specific files in CloudFront.
I want the CloudFront API call (cloudfront_invalidation module) to run locally on AWX and then, if this is successful, notify the two web server instances to restart their PHP and Apache processes.
---
- name: Restart httpd and php-fpm
  remote_user: ec2-user
  hosts: all
  become: true
  tasks:
    - name: Invalidate paths in CloudFront
      cloudfront_invalidation:
        distribution_id: "{{ distribution_id }}"
        aws_access_key: "{{ aws_access_key }}"
        aws_secret_key: "{{ aws_secret_key }}"
        target_paths: "{{ cloudfront_invalidations.split('\n') }}"
      delegate_to: 127.0.0.1
      notify:
        - Restart service httpd
        - Restart service php-fpm
  handlers:
    - name: Restart service httpd
      service:
        name: httpd
        state: restarted
    - name: Restart service php-fpm
      service:
        name: php-fpm
        state: restarted
However, when running the play it seems to ignore the delegate_to setting and instead runs the invalidation twice, once for each host, so I'm unsure whether it's actually running locally. I've tried adding the run_once flag, but that then only restarted httpd and PHP on one host.
Any ideas?
All tasks are applied to all hosts, even when you delegate them to localhost. If you have two hosts, the task runs twice on localhost. You can use run_once to apply the task only to the first host in "all". See https://docs.ansible.com/ansible/latest/user_guide/playbooks_strategies.html#running-on-a-single-machine-with-run-once for more details.
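For illustration, here is a minimal sketch of the delegated task from your play with run_once added; nothing else is changed:

- name: Invalidate paths in CloudFront
  cloudfront_invalidation:
    distribution_id: "{{ distribution_id }}"
    aws_access_key: "{{ aws_access_key }}"
    aws_secret_key: "{{ aws_secret_key }}"
    target_paths: "{{ cloudfront_invalidations.split('\n') }}"
  delegate_to: 127.0.0.1
  run_once: true   # execute the delegated invalidation once per play run, not once per host
  notify:
    - Restart service httpd
    - Restart service php-fpm

This removes the duplicate invalidation; how the notified handlers then behave across your two hosts (the issue you saw with only one host restarting) may still depend on your Ansible version and play layout.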