What's the best way to deploy dozens of resources such as CloudFormation templates, StackSets, and Lambda functions using CodePipeline?
In AWS I have a multi-account architecture under an AWS Organization. I want a pipeline running in a single account. That pipeline will deploy CloudFormation templates to one or more accounts within the Organization.
The options I've found so far are:
Have a pipeline stage or action for each source file. This works quite well, but it means every time you add a source file you need to modify your pipeline, which seems like overhead that could be automated or eliminated. You can't deploy StackSets with this approach. You also need a stage per template per account you deploy to, so it's impractical.
Use nested stacks. The problems with this are: 1) Within the master stack I don't know what naming convention to use to call the other stacks directly from CodeCommit. I could work around that by having CodeBuild copy all the files to S3, but it seems inelegant. 2) Nested stacks are more difficult to debug, as they're torn down and deleted if they fail, so it's difficult to find the cause of the problem.
Have CodeBuild run a bash script that deploys all the templates using the AWS CLI.
Have CodeBuild run an Ansible playbook to deploy all the templates.
Have Lambda deploy each template, after being invoked by CodePipeline. This is likely not a great option as each invocation of Lambda would be for a single template, and there wouldn't be information about which account to deploy to. A single Lambda function that does all the deployments might be an option.
Ideally I'd like to have CodePipeline deploy every file with specific extensions in a CodeCommit repo, or even better deploy what's listed in a manifest file. However I don't think this is possible.
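To illustrate, the kind of manifest I have in mind would be something like this (a purely hypothetical format, not an AWS feature; file names, account IDs and regions are made up):

```yaml
# Hypothetical manifest the pipeline would read to decide what to deploy and where.
templates:
  - file: templates/iam-roles.yml
    accounts: ["111111111111", "222222222222"]
    regions: ["ap-southeast-2"]
  - file: templates/logging.yml
    accounts: ["111111111111"]
    regions: ["ap-southeast-2"]
```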
I'd prefer to avoid any technologies or services that aren't necessary. I would also prefer not to use Jenkins, Ansible, Terraform, etc., as this solution could be deployed at multiple customer sites and I don't want to force any third-party technology on them. If I have to use a third-party tool, I'd rather have something that can run in a CodeBuild container than something that has to run on an instance, like Jenkins.
--
Experience since I asked this question
Having to write Bourne shell (sh) scripts in CodeBuild is complex, painful and slow.
There needs to be some logic around creation or update of StackSets. If you simply call create-stack-set it will fail when the StackSet already exists and needs updating instead.
There's a reason the AWS Landing Zone pipeline is complex, using things like step functions.
If there was an easy way to write logic such as "if this StackSet exists then update it, otherwise create it" things would be a lot simpler. The AWS CDK is one possible solution to this, as it lets you create AWS infrastructure using Java, .NET, JavaScript, or TypeScript. Third-party tools such as Terraform may also help, but I don't know enough about them to comment.
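For example, the rough shape of the shell logic I ended up needing in CodeBuild is something like this (a sketch only; the StackSet name, template path, accounts and regions are placeholders, and error handling is omitted):

```yaml
# buildspec.yml sketch: update the StackSet if it exists, otherwise create it and its instances.
version: 0.2
phases:
  build:
    commands:
      - |
        if aws cloudformation describe-stack-set --stack-set-name my-stack-set >/dev/null 2>&1; then
          aws cloudformation update-stack-set --stack-set-name my-stack-set \
            --template-body file://templates/my-template.yml --capabilities CAPABILITY_NAMED_IAM
        else
          aws cloudformation create-stack-set --stack-set-name my-stack-set \
            --template-body file://templates/my-template.yml --capabilities CAPABILITY_NAMED_IAM
          aws cloudformation create-stack-instances --stack-set-name my-stack-set \
            --accounts 111111111111 222222222222 --regions ap-southeast-2
        fi
```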
I'm going to leave this question open in case someone comes up with a great answer.
--
Information from AWS Support
AWS have given the following advice (I've paraphrased it, filtered through my understanding, any errors are my own rather than incorrect advice from AWS):
CodePipeline can only deploy one artifact (e.g. a CloudFormation template) per action.
CodePipeline cannot directly deploy a StackSet, which would allow for deployment of templates across accounts. StackSets can be deployed by calling CodeBuild / Lambda.
CodePipeline can deploy to other accounts by specifying a role in the other account. This only deploys to one account at a time, so you would need one action per template per account.
CodeBuild, started as part of a CodePipeline, runs in a container and gives more flexibility; you can do whatever you like there, really.
CodePipeline can start Lambda, which is very flexible. If you start Lambda from a CodePipeline action you get the URL of a single resource, which may be limiting. (My guess: you can probably invoke Lambda in a way that lets it do the whole deployment; see the sketch after this list.)
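My reading of that is an Invoke action roughly like this (a CloudFormation fragment of the pipeline's Stages list; the function name, artifact name and UserParameters payload are placeholders), where UserParameters could carry a manifest for a single Lambda function to deploy everything:

```yaml
# Sketch of a CodePipeline stage that hands the whole deployment to one Lambda function.
- Name: Deploy
  Actions:
    - Name: DeployEverything
      ActionTypeId:
        Category: Invoke
        Owner: AWS
        Provider: Lambda
        Version: "1"
      Configuration:
        FunctionName: cfn-deployer                      # hypothetical function that loops over the manifest
        UserParameters: '{"manifest": "deploy-manifest.yml"}'
      InputArtifacts:
        - Name: SourceOutput
      RunOrder: 1
```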
--
I would look at deploying all the templates through a single Ansible playbook. In the playbook.yml you can have many tasks, one per CFN template; give each template the required parameters, feed outputs from one stack to the next, etc. Ansible is also idempotent, so when you re-run the playbook it (re-)deploys only what has changed. This can all be a single step in CodePipeline.
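As a rough sketch (stack names, template paths, region and parameters are made up), the playbook could look something like this, using the cloudformation module (amazon.aws.cloudformation in newer releases), which creates or updates each stack as needed:

```yaml
# playbook.yml sketch: two stacks, with the first stack's outputs fed into the second.
- hosts: localhost
  connection: local
  gather_facts: false
  tasks:
    - name: Deploy the network stack
      cloudformation:
        stack_name: network
        template: templates/network.yml
        region: ap-southeast-2
      register: network

    - name: Deploy the application stack, reusing the network stack's outputs
      cloudformation:
        stack_name: application
        template: templates/application.yml
        region: ap-southeast-2
        template_parameters:
          VpcId: "{{ network.stack_outputs.VpcId }}"
```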
Now how to actually run it? CodePipeline can execute CodeBuild, CodeDeploy, an ECS task or Elastic Beanstalk. I would probably choose CodeBuild with an Ansible Docker image. Why don't you want to use CodeBuild?
If you really, really want to do the CodePipeline deployment through the CloudFormation method, you can probably create some custom resource that executes the Ansible playbook, but that seems quite convoluted.
My choice would be CodePipeline ➜ CodeBuild ➜ Ansible playbook ➜ deploy lots of CloudFormation stacks.
BTW, to debug nested template failures you can always change the filter in the console to Failed or Deleted and examine the failed stacks' events there. When they are deleted they only disappear from the default view; the details are still there.
However I don't like complex nested templates, I find them harder to manage and update than using Ansible.
Hope that helps :)