Correct approach for deploying a stack to Docker for AWS


I am trying to deploy my docker-compose based stack to Docker for AWS (created via AWS CloudFormation).

My compose YAML file is managed in a Git repository, and my Docker images are stored in a private registry (GitLab).

What is the correct way of working with the manager node to deploy a service?

I tried (and failed) several approaches:

  1. Working with the local Docker client via the Docker API is not possible, because the Docker for AWS manager node does not open port 2375.
  2. Rsyncing the compose YAML and environment file directly to the manager node is not possible, because rsync is not installed on the Amazon Docker AMI.
  3. Curling the file from GitLab seems like a very inconvenient way of doing it.

Thanks


There are 2 answers

Meir Tseitlin

I found a way to do it more or less properly (based on a comment in the Swarm documentation):

Create an SSH tunnel to the manager: $ ssh -NL localhost:2374:/var/run/docker.sock docker@<manager ip> &

Then run everything locally with $ docker -H localhost:2374 info

or define export DOCKER_HOST=localhost:2374

and use docker as if you were running on the Swarm manager: $ docker info
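For completeness, a minimal end-to-end sketch of that workflow; the registry hostname, compose file name and stack name (mystack) below are placeholders for your own values, not anything given in the question:

$ ssh -NL localhost:2374:/var/run/docker.sock docker@<manager ip> &    # forward local port 2374 to the manager's Docker socket
$ export DOCKER_HOST=localhost:2374                                    # point the local Docker CLI at the tunnel
$ docker login registry.gitlab.com                                     # authenticate against the private GitLab registry
$ docker stack deploy --compose-file docker-compose.yml --with-registry-auth mystack    # deploy; --with-registry-auth passes the registry credentials to the swarm nodes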

Abhishek Galoda

In my opinion, there are two options you can try:

  1. Use Jenkins with the Publish Over SSH plugin. You can use it to copy your compose file to the manager node and then run commands like "docker stack deploy" (see the command sketch below this answer). More description can be found here.
  2. You can use Docker Cloud to bring your swarm to your local terminal, similar to what you have already done. Follow this link.

The first approach is much better because it automates the deployment: you can schedule deployments, run them at the click of a button, or even trigger them on commits.
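A rough sketch of what such a Jenkins job could run once the plugin has copied the compose file to the manager node; the remote path, stack name, credentials and registry hostname are assumptions to adjust for your setup:

$ ssh docker@<manager ip> "docker login -u <deploy-user> -p <token> registry.gitlab.com"              # let the manager pull from the private GitLab registry
$ ssh docker@<manager ip> "docker stack deploy -c ~/docker-compose.yml --with-registry-auth mystack"  # deploy the copied compose file as a stack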

Hope it helps!