I am using the following code in my bitbucket-pipelines.yml file to remotely deploy code to a staging server.
image: php:7.1.1

pipelines:
  default:
    - step:
        script:
          # install ssh
          - apt-get update && apt-get install -y openssh-client
          # get the latest code
          - ssh [email protected] -F ~/.ssh/config "cd /path/to/code && git pull"
          # update composer
          - ssh [email protected] -F ~/.ssh/config "cd /path/to/code && composer update --no-scripts"
          # optimise files
          - ssh [email protected] -F ~/.ssh/config "cd /path/to/code && php artisan optimize"
This all works, except that each time the pipeline runs, the SSH client is downloaded and installed again (adding ~30 seconds to the build time). Is there a way I can cache this step? And how can I go about caching the apt-get step?
For example, would something like this work (or what changes are needed to make the following work):
pipelines:
  default:
    - step:
        caches:
          - aptget
        script:
          - apt-get update && apt-get install -y openssh-client

definitions:
  caches:
    aptget: which ssh
This is a typical scenario where you should use your own Docker image instead of one of the ones provided by Atlassian. (Or search for a Docker image which provides exactly this.)
In your simple case, this Dockerfile should be enough:
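A minimal sketch, based on the base image and package already used in the question (the exact tag and cleanup step are choices, not requirements):

```dockerfile
# Same PHP image the pipeline currently uses
FROM php:7.1.1

# Bake the SSH client into the image so the pipeline
# no longer installs it on every run
RUN apt-get update \
    && apt-get install -y openssh-client \
    && rm -rf /var/lib/apt/lists/*
```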
Then create a Docker Hub account, publish the image, and reference it in bitbucket-pipelines.yml.
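For example, assuming the image is published under a hypothetical name like yourname/php-ssh:7.1.1, the pipeline can then drop the apt-get step entirely:

```yaml
image: yourname/php-ssh:7.1.1  # hypothetical Docker Hub image name

pipelines:
  default:
    - step:
        script:
          # openssh-client is already baked into the image, so no install step is needed
          - ssh [email protected] -F ~/.ssh/config "cd /path/to/code && git pull"
          - ssh [email protected] -F ~/.ssh/config "cd /path/to/code && composer update --no-scripts"
          - ssh [email protected] -F ~/.ssh/config "cd /path/to/code && php artisan optimize"
```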