I'm using ECS containers with CodeBuild and CodeDeploy (for blue/green deployment) stages within a CodePipeline. I have an appspec.yml file in the root of my application code containing the task definition ARN and the container name. All of this works fine in a single-environment scenario. In my case, where I have a separate AWS account for each of Dev, Test, and Prod, I need CodeDeploy to swap the task definition ARN based on the environment context. Is there a way to pass parameters and modify the appspec.yml file, similar to what buildspec.yml offers with custom environment variables? If not, what would be the best solution for a cross-account deployment using an appspec.yml file?
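For reference, the appspec.yml currently looks roughly like this, with the task definition ARN hard-coded for a single account (the values below are placeholders):

version: 0.0
Resources:
  - TargetService:
      Type: AWS::ECS::Service
      Properties:
        TaskDefinition: "arn:aws:ecs:<region>:<account-id>:task-definition/<task-name>:<revision>"
        LoadBalancerInfo:
          ContainerName: "<container-name>"
          ContainerPort: 80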
UPDATE
Kudos to Ronan Cunningham for the Python script (see his code example below), which generates the appspec.json file as an artifact of the CodeBuild stage and passes it down as input to the CodeDeploy stage. You call the script from the buildspec.yml file and pass custom environment variables as script parameters, which define your appspec.json based on the AWS environment context. The script should be placed in the root of the app alongside the buildspec.yml and the Dockerfile.
create_appspec_json.py script:
#!/usr/bin/python
import json
from sys import argv


def print_obj_to_disk(obj, file_type):
    # Map the logical file type to its output file name.
    if file_type == 'app_spec':
        file_name = 'appspec.json'
    elif file_type == 'task_def':
        file_name = 'taskdef.json'
    print('Writing {}:'.format(file_name))
    print(json.dumps(obj, indent=2))
    with open(file_name, 'w') as outfile:
        json.dump(obj, outfile, indent=2)


def return_appspec(container_name, container_port, task_definition_arn):
    appspec_obj = {
        "version": 0.0,
        "Resources": [
            {
                "TargetService": {
                    "Type": "AWS::ECS::Service",
                    "Properties": {
                        "TaskDefinition": task_definition_arn,
                        "LoadBalancerInfo": {
                            "ContainerName": container_name,
                            # argv values are strings; CodeDeploy expects a numeric port.
                            "ContainerPort": int(container_port)
                        }
                    }
                }
            }
        ]
    }
    return appspec_obj


# Usage: create_appspec_json.py <container_name> <container_port> <task_definition_arn>
appspec_obj = return_appspec(argv[1], argv[2], argv[3])
print_obj_to_disk(appspec_obj, 'app_spec')
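For example, invoking the script with illustrative values (the ARN is a placeholder) produces an appspec.json like this:

python create_appspec_json.py my-container 8080 arn:aws:ecs:us-east-1:111111111111:task-definition/my-task:3

{
  "version": 0.0,
  "Resources": [
    {
      "TargetService": {
        "Type": "AWS::ECS::Service",
        "Properties": {
          "TaskDefinition": "arn:aws:ecs:us-east-1:111111111111:task-definition/my-task:3",
          "LoadBalancerInfo": {
            "ContainerName": "my-container",
            "ContainerPort": 8080
          }
        }
      }
    }
  ]
}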
Call the script from buildspec.yml, passing the environment variables as parameters:
version: 0.2

phases:
  install:
    commands:
      - pip install boto3
  pre_build:
    commands:
      ...
  build:
    commands:
      - python create_appspec_json.py $CONTAINER_NAME $CONTAINER_PORT $TASK_DEFINITION_ARN
      ...
  post_build:
    commands:
      ...

artifacts:
  files:
    - appspec.json
My approach (on what sounds like the same pipeline setup) was to have a stage with a CodeBuild action before the stage with the CodeDeployToECS action. The job of the former was to generate the appspec and task definition programmatically, with its OutputArtifact becoming an InputArtifact to the latter.
Any required parameters are passed down to the CodeBuild project via the pipeline action Configuration.
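For example, in a CloudFormation pipeline definition, the CodeBuild action's Configuration block accepts an EnvironmentVariables JSON string; the names and values here are illustrative:

Configuration:
  ProjectName: my-prepare-codedeploy-project
  EnvironmentVariables: >-
    [{"name": "CONTAINER_NAME", "value": "my-container", "type": "PLAINTEXT"},
     {"name": "CONTAINER_PORT", "value": "8080", "type": "PLAINTEXT"},
     {"name": "TASK_DEFINITION_ARN", "value": "arn:aws:ecs:us-east-1:111111111111:task-definition/my-task:3", "type": "PLAINTEXT"}]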
Updated:
The basic approach is something like the following:
The CodeBuild project that runs before CodeDeploy executes a Python script to generate the appspec and task definition, looking up values as required and writing them to disk as JSON files. This is not the full script, just the app_spec portion as an example.
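The task_def portion isn't shown, but since the buildspec installs boto3, the lookup could be something like the following sketch; the function name and the use of describe_task_definition are my assumptions, not the original code:

import boto3


def return_taskdef(task_family):
    # Assumption: fetch the latest ACTIVE revision of the task definition
    # for the given family via the ECS API.
    client = boto3.client('ecs')
    response = client.describe_task_definition(taskDefinition=task_family)
    # Note: some response fields (e.g. timestamps) may need to be
    # stripped before JSON serialization.
    return response['taskDefinition']


# Written to disk the same way as the appspec:
# print_obj_to_disk(return_taskdef('my-task'), 'task_def')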
This project lives in the 'PrepareCodeDeploy' stage of the pipeline definition (a sketch of the relevant stages is below). The JSON files are stored as pipeline artifacts and then become inputs to the next stage.
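A minimal sketch of what those two stages could look like in a CloudFormation pipeline definition; the resource and artifact names are illustrative, while the CodeDeployToECS configuration keys are the documented ones:

- Name: PrepareCodeDeploy
  Actions:
    - Name: GenerateDeployArtifacts
      ActionTypeId:
        Category: Build
        Owner: AWS
        Provider: CodeBuild
        Version: '1'
      Configuration:
        ProjectName: my-prepare-codedeploy-project
      InputArtifacts:
        - Name: SourceOutput
      OutputArtifacts:
        - Name: DeployArtifacts
- Name: Deploy
  Actions:
    - Name: BlueGreenDeploy
      ActionTypeId:
        Category: Deploy
        Owner: AWS
        Provider: CodeDeployToECS
        Version: '1'
      Configuration:
        ApplicationName: my-codedeploy-application
        DeploymentGroupName: my-deployment-group
        AppSpecTemplateArtifact: DeployArtifacts
        AppSpecTemplatePath: appspec.json
        TaskDefinitionTemplateArtifact: DeployArtifacts
        TaskDefinitionTemplatePath: taskdef.json
      InputArtifacts:
        - Name: DeployArtifacts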
Hopefully that gives you some ideas as to how you might get it working in a way that suits you. I've just decided to write a blog post on my approach, which, while not perfect, does work and follows a linear path to create a pipeline and all the required components. The official documentation goes off in a lot of directions, and I've seen a lot of people struggle with this.
Any further questions, just ask.