Updated on January 10, 2018
Deploy static websites using GitLab CI
One of my favorite tools of 2017 is GitLab. I guess my enthusiasm about it even helped convince my company to switch over to GitLab. In this post I describe how to automate the deployment of mlp.skofgar.ch using a GitLab CI pipeline job.
GitLab offers a one-stop solution for almost anything™ that we might need as software developers. There’s a ton of bells and whistles, from issue tracking to continuous integration (CI), continuous deployment (CD), and monitoring. GitLab could easily replace Jenkins, Jira, GitHub, wiki systems, bug trackers, and deployment monitoring services. It even integrates with Kubernetes.
Enough with the praise – I recently cleaned up the neural network project that I worked on with Camila and wanted to simplify the deployment process. However, since I don’t run a large container management system and don’t control all the details of the server I was going to put it on (my website is hosted by Cyon), I needed a different approach.
GitLab CI / Stages / Jobs
GitLab CI is extremely versatile: basically anything that we can do in a terminal or bash script should be supported. So if I can write a bash script on my computer that deploys my website, then there will be a way to do it in GitLab CI.
GitLab CI allows us to define multiple stages, each of which can execute multiple jobs in parallel. Consider the following pipeline example from the GitLab CE project. It has 5 stages – Build, Prepare, Test, Post-test, and Post-cleanup – that each have multiple jobs:
My project is fairly simple: I only needed one stage with one job. I created the job deploy_production that uses the deploy stage. Please note that if we don’t define any stages, GitLab provides the defaults build, test, and deploy – so stage: deploy works out of the box.
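If we did want to declare stages explicitly, a minimal sketch would look like this (these stage names happen to match the GitLab defaults):

```yaml
stages:
  - build
  - test
  - deploy
```

Jobs then reference one of these names via their stage keyword; jobs in the same stage run in parallel, and stages run in the order listed.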
```yaml
deploy_production:
  stage: deploy
  image: phusion/baseimage
  before_script:
    - 'which ssh-agent || ( apt-get update -y && apt-get install openssh-client -y )'
    - apt-get update -y && apt-get install rsync
    - mkdir -p ~/.ssh
    - eval $(ssh-agent -s)
    - '[[ -f /.dockerenv ]] && echo -e "Host *\n\tStrictHostKeyChecking no\n\n" > ~/.ssh/config'
    - ssh-add <(echo "$PRODUCTION_PRIVATE_KEY")
    - echo "$PRODUCTION_PRIVATE_KEY" > ~/.ssh/id_rsa
  script:
    - rsync -avuz -e 'ssh -i ~/.ssh/id_rsa' /builds/skofgar/jomis-neuralnetwork/* $PRODUCTION_SERVER
  artifacts:
    paths:
      - /builds/skofgar/jomis-neuralnetwork/
  environment:
    name: production
    url: https://mlp.skofgar.ch
  when: manual
  only:
    - master
```
As the image I selected phusion/baseimage, a lightweight version of Ubuntu specialized for containerization. The website won’t actually be deployed inside phusion/baseimage; the container is only used to run our scripts.
In before_script I install all the necessary dependencies, namely openssh-client and rsync. It also configures my authentication method – I chose to use an SSH key.
In order for GitLab to keep track of where we deploy to, we should define the environment. This lets GitLab know under which URL the deployment will be reachable and what type of environment it is (staging, training, production, …); it will also allow us to roll back and re-deploy our system.
Authentication / Secrets
How can we securely deploy from GitLab CI? Since .gitlab-ci.yml is in source control, we don’t want to put our secrets there. Fortunately, GitLab lets us define secret variables. This could be as simple as a username and password, or, as in my example, an SSH private key dedicated to GitLab.
I preferred using an SSH key, because that way I’m not exposing my password and can rely on a stronger authentication mechanism. (Review the discussion on Stack Exchange.)
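One way to create a key pair dedicated to GitLab is sketched below; the filename is arbitrary, and using no passphrase is a deliberate trade-off since a CI job cannot enter one interactively:

```shell
# Generate a dedicated 4096-bit RSA key pair for GitLab CI.
# -f sets the output filename, -N "" means no passphrase.
ssh-keygen -t rsa -b 4096 -f gitlab_deploy_key -N ""

# gitlab_deploy_key      -> paste into a secret CI variable (e.g. PRODUCTION_PRIVATE_KEY)
# gitlab_deploy_key.pub  -> append to ~/.ssh/authorized_keys on the web server
```

Because this key only exists for deployments, it can be revoked on the server at any time without affecting your personal SSH access.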
In my code snippet above you’ll notice $PRODUCTION_PRIVATE_KEY and $PRODUCTION_SERVER. These two secrets are defined in GitLab’s CI/CD settings page.
Sync GitLab and webserver folders
The actual synchronization is very simple. I basically tell GitLab CI to take all my static folders and synchronize them to my server:
```shell
rsync -avuz -e 'ssh -i ~/.ssh/id_rsa' /builds/skofgar/jomis-neuralnetwork/* $PRODUCTION_SERVER
```
I’m telling rsync to use the .ssh/id_rsa key that I defined earlier and synchronize my jomis-neuralnetwork folder with my $PRODUCTION_SERVER. The secret contains the username, server name, and destination folder and is formatted as follows:
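The real value of the secret isn’t shown here; a hypothetical value following the standard rsync-over-SSH destination format (user@host:path) would look like this:

```shell
# Hypothetical example – substitute your own user, host, and path:
PRODUCTION_SERVER="deploy-user@server.example.com:~/public_html/mlp"
```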
To add a few more bells and whistles, I further defined that each version of the site should be stored as an artifact. Please note that we can also let artifacts expire.
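Expiration is configured with GitLab’s expire_in keyword inside the artifacts section; the duration below is just an example:

```yaml
artifacts:
  paths:
    - /builds/skofgar/jomis-neuralnetwork/
  expire_in: 4 weeks
```

After the period elapses, GitLab deletes the stored artifact automatically, which keeps old deployments from accumulating storage.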
Let me know whether this description is helpful, if you have more questions, or what your experience was.