I was able to set up an integration between GitHub and AWS CodePipeline, so now my code is uploaded to S3 by a Lambda function after a push event. That works very well.
A new ZIP with the source code on S3 triggers the pipeline, which builds the code. That's fine. Now I'd also like to build a Docker image for the project.
The first problem is that you can't mix a project (Node.js) build and a Docker build. That's fine, it makes sense. The next issue is that you can't have another buildspec.yml for the Docker build. You have to specify the build commands manually; OK, that works as a workaround.
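For context, the manually specified commands for the Docker build step are essentially the usual ECR login/build/push sequence, something like this (the region, account ID, and repository name below are placeholders, and the exact ECR login command depends on the AWS CLI version):

    # Log in to ECR (AWS CLI v2 syntax; older CLIs use $(aws ecr get-login ...))
    aws ecr get-login-password --region us-east-1 \
      | docker login --username AWS --password-stdin 123456789012.dkr.ecr.us-east-1.amazonaws.com

    # Build, tag, and push the image (names are placeholders)
    docker build -t my-node-app .
    docker tag my-node-app:latest 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-node-app:latest
    docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-node-app:latest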
The biggest problem, though (or a gap in my understanding), is how to make the Docker build part of the pipeline: the first build step builds the project, then the next build step builds the Docker image. Two standalone AWS CodeBuild projects.
The thing is that a pipeline build step has to produce an output artifact, but a Docker build doesn't produce any files, and it looks like the final docker push after docker build doesn't qualify as an artifact for the pipeline service.
Is there a way to do this?
Thanks
Sorry about the inconvenience. Making it less restrictive is on our roadmap. Meanwhile, in order to use the CodeBuild action, you can use a dummy file as the output artifact.
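For example (just a sketch; the directory and file names here are made up), the Docker build commands can end by creating a throwaway file, and that file is then declared as the build's output artifact (in the project's artifact settings or the buildspec's artifacts section) so the pipeline stage has something to pass along:

    # ...docker build / docker tag / docker push as before...

    # Create a dummy file so the CodeBuild action has an output artifact
    mkdir -p build_output
    echo "docker image pushed" > build_output/dummy.txt

CodePipeline then accepts build_output/dummy.txt as the stage's output artifact, even though the real result of the step is the image pushed to the registry.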