Scenario:
I have been working on implementing a Concourse CI pipeline for over a month now, and my single YAML
file has grown quite a bit. I understand that it is best practice to break the pipeline up into several files and reference them from your pipeline.
Question:
Can someone please explain the best practice for structuring a Concourse CI pipeline?
My thought process:
offering-pipeline
|
|_ ci:
| |
| |_ images:
| | |_ Dockerfile
| |
| |_ misc:
| | |_ python-requirements.txt
| |
| |_ ci-pipeline.yml
|
|_ project:
|_ project-pipeline.yml
|
|_ jobs
|
|_ scripts:
|
|_ build:
| |_ build_xyz.
|
|_ deploy:
| |_ deploy_xyz.
|
|_ test:
| |_ test_xyz.
|
|_ publish:
|_ publish_xyz.
Thanks,
-Abe.
A first step would be to extract all tasks into files. I have a tasks folder, a templates folder, and a scripts folder for each pipeline. Above these (in the root) I have a pipeline.yml containing the pipeline's root structure, plus a Makefile and a Makefile.constants for setting up the pipeline in Concourse.
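For example, extracting a task into its own file looks roughly like this (the file names and the `source-code` resource name are illustrative, not from the original pipeline):

```yaml
# ci/tasks/build.yml -- a task pulled out of pipeline.yml into its own file
platform: linux
image_resource:
  type: registry-image
  source: {repository: python, tag: "3.11"}
inputs:
  - name: source-code    # the resource the pipeline passes in
run:
  path: source-code/ci/scripts/build.sh
```

The job in pipeline.yml then points at the task file instead of inlining the config:

```yaml
# pipeline.yml -- the plan references the extracted task via `file:`
jobs:
  - name: build
    plan:
      - get: source-code
        trigger: true
      - task: build
        file: source-code/ci/tasks/build.yml
```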
Since I don't have that many build/test/publish tasks, I use a naming convention for them instead of lots of folders with 1-4 files in each.
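With that convention, a flat tasks folder might look something like this (file names are hypothetical examples of the pattern, not the actual files):

```
tasks/
├── build-image.yml
├── build-docs.yml
├── test-unit.yml
├── test-integration.yml
└── publish-release.yml
```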
The tree inside my pipeline folder in Atom:
Note: the pipeline.yml file is still pretty long (~500 lines)
The Makefile; the ${} variables come from the included constants files:
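The original Makefile isn't reproduced here; a minimal sketch of the pattern, assuming Makefile.constants defines variables such as ${CONCOURSE_TARGET} and ${PIPELINE_NAME} (names assumed):

```makefile
# Makefile -- ${} variables are defined in the included constants file
include Makefile.constants

.PHONY: set-pipeline
set-pipeline:
	fly -t ${CONCOURSE_TARGET} set-pipeline \
		-p ${PIPELINE_NAME} \
		-c pipeline.yml \
		-l credentials.yml
```

Running `make set-pipeline` then uploads the pipeline with `fly set-pipeline`, loading variable values from credentials.yml via `-l`.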