My problem:
I want a Docker image, built by Packer (and Ansible), saved as an artifact in Amazon EC2 Container Registry (ECR).
My limitations: The build needs to be triggered by Bitbucket Pipelines. Therefore the build steps need to be executed either in Bitbucket Pipelines itself or in an AWS EC2 instance/container.
This is because not all dev machines necessarily have the permissions/packages to build from their local environment. I only want these images to be built as a result of an automated CI process.
What I have tried:
Using Packer, I am able to build AMIs remotely, and I am able to build Docker images with Packer locally and push them to Amazon ECR.
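For reference, my template is roughly the following. This is a minimal sketch assuming a docker builder, an ansible-local provisioner (so Ansible must be present in the base image), and ECR push post-processors; the account ID, region, repository name, base image, and playbook path are all placeholders:

{
  "builders": [
    {
      "type": "docker",
      "image": "ubuntu:16.04",
      "commit": true
    }
  ],
  "provisioners": [
    {
      "type": "ansible-local",
      "playbook_file": "./playbook.yml"
    }
  ],
  "post-processors": [
    [
      {
        "type": "docker-tag",
        "repository": "123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app",
        "tag": "latest"
      },
      {
        "type": "docker-push",
        "ecr_login": true,
        "login_server": "https://123456789012.dkr.ecr.us-east-1.amazonaws.com/"
      }
    ]
  ]
}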
However, the Bitbucket Pipeline, which already executes its build steps inside a Docker container, does not have access to the Docker daemon, so commands like 'docker run' are unavailable.
The error I receive in Bitbucket Pipelines:
+ packer build ${BITBUCKET_CLONE_DIR}/build/pipelines_builder/template.json
docker output will be in this color.
==> docker: Creating a temporary directory for sharing data...
==> docker: Pulling Docker image: hashicorp/packer
docker: Using default tag: latest
docker: latest: Pulling from hashicorp/packer
docker: 88286f41530e: Pulling fs layer
...
...
docker: 08d16a84c1fe: Pull complete
docker: Digest: sha256:c093ddf4c346297598aaa13d3d12fe4e9d39267be51ae6e225c08af49ec67fc0
docker: Status: Downloaded newer image for hashicorp/packer:latest
==> docker: Starting docker container...
docker: Run command: docker run -v /root/.packer.d/tmp/packer-docker426823595:/packer-files -d -i -t hashicorp/packer /bin/bash
==> docker: Error running container: Docker exited with a non-zero exit status.
==> docker: Stderr: docker: Error response from daemon: authorization denied by plugin pipelines: Command not supported..
==> docker: See 'docker run --help'.
==> docker:
Build 'docker' errored: Error running container: Docker exited with a non-zero exit status.
Stderr: docker: Error response from daemon: authorization denied by plugin pipelines: Command not supported..
See 'docker run --help'.
==> Some builds didn't complete successfully and had errors:
--> docker: Error running container: Docker exited with a non-zero exit status.
Stderr: docker: Error response from daemon: authorization denied by plugin pipelines: Command not supported..
See 'docker run --help'.
==> Builds finished but no artifacts were created.
The following quote says it all (taken from link):
Other commands, such as docker run, are currently forbidden for security reasons on our shared build infrastructure.
So I know why this is happening; it is a limitation I am faced with, and I am aware that I need to find an alternative.
A possible solution: The only one I can think of at the moment is a Bitbucket Pipeline, using an image with Terraform and Ansible installed, that contains the following steps (a rough sketch of the pipeline follows this list):

ansible-local:
- terraform apply (spins up an instance/container from an AMI with Ansible and Packer installed)

ansible-remote (to the instance mentioned above):
- clone the devops repo containing the Packer build script
- execute the packer build command (the build depends on Ansible and creates an EC2 Container Registry image)

ansible-local:
- terraform destroy
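Here is the rough sketch of what that pipeline could look like; the image name, inventory, and playbook name are hypothetical, and the playbook would implement the remote clone-and-build steps above:

# bitbucket-pipelines.yml (sketch)
pipelines:
  default:
    - step:
        image: my-terraform-ansible-image            # hypothetical image with terraform and ansible installed
        script:
          - terraform init
          - terraform apply -auto-approve            # spin up the builder instance
          - ansible-playbook -i inventory build.yml  # on the instance: clone devops repo, run packer build
          - terraform destroy -auto-approve          # tear the builder instance down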
Is the above solution a viable option? Are there alternatives? Can Packer not run commands and commit from a container running remotely in ECS?
My long-term solution will be to use Bitbucket Pipelines only to trigger Lambda functions in AWS, which will spin up containers in our EC2 Container Service (ECS) and perform the builds there. More control, and we can have devs trigger the Lambda functions from their machines (with more bespoke dynamic variables).
My understanding of what is blocking you is this: the Bitbucket Pipelines build environments (normally called agents) don't have enough permission to do the job (terraform apply, packer build) against your AWS account.

Since Bitbucket Pipelines agents run in Bitbucket's cloud, not in your AWS account (where you could assign an IAM role to them), you should create an IAM user with the necessary policies and permissions (see below) and set its AWS API keys (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and optionally AWS_SESSION_TOKEN) as environment variables in your pipeline. You can refer to this document on how to add your AWS credentials to Bitbucket Pipelines:

https://confluence.atlassian.com/bitbucket/deploy-to-amazon-aws-875304040.html
With that, you can run packer or terraform commands without issues.
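For example, with AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY set as secured repository variables, a step along these lines should work, assuming your template uses the amazon builders rather than the docker builder (the template path is taken from your log):

# bitbucket-pipelines.yml (sketch)
pipelines:
  default:
    - step:
        image: hashicorp/packer
        script:
          - packer validate ${BITBUCKET_CLONE_DIR}/build/pipelines_builder/template.json
          - packer build ${BITBUCKET_CLONE_DIR}/build/pipelines_builder/template.json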
For the minimum policies you need to assign to run packer build, refer to this document:

https://www.packer.io/docs/builders/amazon.html#using-an-iam-task-or-instance-role
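Abridged from that page, the policy looks roughly like this (check the linked document for the authoritative, full action list):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "ec2:CreateImage",
        "ec2:CreateKeypair",
        "ec2:CreateSecurityGroup",
        "ec2:CreateSnapshot",
        "ec2:CreateTags",
        "ec2:CreateVolume",
        "ec2:DeleteKeyPair",
        "ec2:DeleteSecurityGroup",
        "ec2:DeleteSnapshot",
        "ec2:DeleteVolume",
        "ec2:DeregisterImage",
        "ec2:DescribeImages",
        "ec2:DescribeInstances",
        "ec2:DescribeRegions",
        "ec2:DescribeSecurityGroups",
        "ec2:DescribeSnapshots",
        "ec2:DescribeSubnets",
        "ec2:DescribeTags",
        "ec2:DescribeVolumes",
        "ec2:ModifyImageAttribute",
        "ec2:ModifyInstanceAttribute",
        "ec2:RegisterImage",
        "ec2:RunInstances",
        "ec2:StopInstances",
        "ec2:TerminateInstances"
      ],
      "Resource": "*"
    }
  ]
}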
For terraform plan/apply, you need to assign much broader permissions, because Terraform can manage nearly all AWS resources.

Second, for your current requirement you only need to run packer and terraform commands; you don't need to run any docker commands in the Bitbucket Pipeline. So a normal pipeline with the AWS API environment variables above should work directly.
You should also be fine running terraform commands in the hashicorp/terraform image.
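For example (the directory name is a placeholder):

# sketch of a terraform step in bitbucket-pipelines.yml
- step:
    image: hashicorp/terraform
    script:
      - cd terraform/        # hypothetical directory holding your .tf files
      - terraform init
      - terraform plan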