
Use Docker + Jenkins to run GitHub tests

Why not Travis?

Travis Mascot

Nowadays, having a continuous integration platform is a requirement. If you use GitHub, you probably know Travis. Travis is one of the best CI platforms: it's free for open source and easy to set up, and for small open source modules it is definitely my favorite platform.

If you are working on a big project, you probably need to control your test environment to ensure that it's consistent with production. For that, Travis is probably not the best tool, since it has no local storage and it's not free for private projects.

Jenkins flaws

Jenkins Mascot

Jenkins is probably the most flexible and configurable tool to run CI tests. You can do everything with it.

The biggest problem with Jenkins is the lack of isolation: all tests run in the same environment. For example, if you need a database in a test, you will need to install it on your Jenkins server. The problem is that it will be shared by all of your jobs, collisions can occur, and you will need to restore the database after each test.

Docker to the rescue!

Docker logo

Jenkins doesn't natively provide a new dedicated environment for every test. So why not run each test in a virtual machine?

Virtual machines are heavy and hard to maintain. It's a reality.

However, we are in 2014, and there is Docker. Docker is exactly the right tool for these needs: it's lighter than a virtual machine thanks to LXC and it's stable. With Docker, it becomes possible to run each test in a new dedicated environment. This avoids collisions and ensures consistency between environments.

Jenkins + GitHub

The first thing to do is to set up links between Jenkins and GitHub. To do that, you must install the GitHub plugin for Jenkins.

SSH key

For SSH, the simplest solution is to generate a key for the "jenkins" user and to add it to your GitHub account. If you work in an organization, you can also create a bot account that has admin rights on your GitHub projects and add the SSH key to it.
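
A minimal sketch, assuming Jenkins runs as the "jenkins" user with its home directory in /var/lib/jenkins (adjust the path to your installation):

sudo -u jenkins -H ssh-keygen -t rsa -C "jenkins"
sudo cat /var/lib/jenkins/.ssh/id_rsa.pub

Paste the printed public key into your GitHub account (or the bot account) under SSH keys.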

Global configuration

First you must configure your GitHub API token to let Jenkins manage hooks on your projects.

Jenkins GitHub Web Hook

Job set-up

First you must fill in the "GitHub project" field with the URL of your GitHub project.

Jenkins GitHub project

Then you will need to set up the git repository. Just fill in the repository URL and leave the other fields as they are.

Jenkins git repository

The best way to trigger the build is to use a GitHub web hook. For that, your Jenkins instance must be publicly reachable and you must check the box "Build when a change is pushed to GitHub".

Jenkins GitHub trigger build

The last thing to do is to tell Jenkins to set the build status on GitHub commits. In your job set-up, add a post-build action called "Set build status on GitHub commit".

Jenkins set build status on GitHub commit

This lets you see the status of a pull request directly on GitHub.

GitHub pull-request status

Set up Docker

First you must install Docker on your Jenkins server.

To avoid running Docker with sudo, you must add the "jenkins" user to the "docker" group.

sudo gpasswd -a jenkins docker
sudo service docker restart

Once that's done, you should be able to run docker ps without sudo. If it doesn't work, try logging out of your session and back in. It's also advisable to restart Jenkins.
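
To check that the "jenkins" user itself can talk to the Docker daemon, a quick sketch:

sudo -u jenkins docker ps

If this lists the running containers without an error, the group change has been picked up.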

Run tests

Let's write the command that will run the tests in a new Docker container. Imagine you are setting up the tests of a GitHub project that requires a Postgres database.

Postgres container

First we need to run a container with Postgres in it. We will use the official Postgres image called "postgres". You don't need to pull it manually: if the image is not present on your server, Docker will automatically download it from the Docker registry.

docker run -d postgres  

We now have a container running Postgres. The container runs in detached mode (-d) and is completely isolated (no ports exposed).
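
To check that it started correctly, a quick sketch is to list the running containers and look at the logs of the most recently created one:

docker ps
docker logs $(docker ps -lq)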

Tests container

Then we will need to start the tests of your project.

APP_DIR="/usr/src/app"  
docker run -v $WORKSPACE:$APP_DIR -w $APP_DIR node:0.10.28 bash -c 'npm install && npm test'  

This command is a little complicated, so let's break it down:

  • -v $WORKSPACE:$APP_DIR: we mount the job workspace (using the Jenkins environment variable $WORKSPACE) to the directory "/usr/src/app" in the container.
  • -w $APP_DIR: we set the working directory to the application directory mounted above.
  • node:0.10.28: the official Node.js image, version 0.10.28.
  • bash -c 'npm install && npm test': runs the command "npm install && npm test". We need "bash -c" to be able to use "&&".

Linking containers together

Right now, the tests can't communicate with the database. To make that happen, you must link your database container to your test container.

DB_NAME="$BUILD_TAG-db"
docker run -d --name $DB_NAME postgres
APP_DIR="/usr/src/app"
docker run -v $WORKSPACE:$APP_DIR -w $APP_DIR --link $DB_NAME:database node:0.10.28 bash -c 'npm install && npm test'

We give a name to our database container, then we link it to the test container. Inside the test container the database is reachable under the alias "database", so the database host in your tests must be "database". To learn more about Docker container linking, you can read the official documentation.
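
If you want to see what the link actually exposes inside the test container, here is a quick sketch (reusing the $DB_NAME variable defined in the script above): the link injects DATABASE_* environment variables and a "database" entry in /etc/hosts.

docker run --rm --link $DB_NAME:database node:0.10.28 bash -c 'env | grep ^DATABASE_ && cat /etc/hosts'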

That's it: the two containers are linked and the tests can run. You might still run into a few problems, though.

Postgres is not started

It's possible that Docker is too fast and the test container starts before the database is ready, so you will probably have to wait for it. I solved this by adding a sleep 1 after the run command. It's not very elegant, but it works for me.

DB_NAME="$BUILD_TAG-db"
docker run -d --name $DB_NAME postgres
sleep 1
APP_DIR="/usr/src/app"
docker run -v $WORKSPACE:$APP_DIR -w $APP_DIR --link $DB_NAME:database node:0.10.28 bash -c 'npm install && npm test'
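
If one second turns out not to be enough on a loaded server, a slightly more robust sketch is to poll the container logs until Postgres reports that it is ready (the grep pattern is an assumption based on the standard Postgres startup message):

until docker logs $DB_NAME 2>&1 | grep -q "ready to accept connections"; do
  sleep 1
done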

I can't install private node_modules

To be able to install private node_modules, or anything else that needs an SSH key, you will need to forward your SSH key into the container. It's pretty easy to do: you just have to mount the directory (-v ~/.ssh:/root/.ssh) and run a chmod command (chmod 700 /root/.ssh/id_rsa) to ensure the permissions are set correctly.

DB_NAME="$BUILD_TAG-db"
docker run -d --name $DB_NAME postgres
sleep 1
APP_DIR="/usr/src/app"
docker run -v ~/.ssh:/root/.ssh -v $WORKSPACE:$APP_DIR -w $APP_DIR --link $DB_NAME:database node:0.10.28 bash -c 'chmod 700 /root/.ssh/id_rsa && npm install && npm test'

I can't wipe out the workspace on Jenkins

When you run a Docker container, the default user is "root", so if you mount a directory, every file written by your Docker container will be owned by "root". This is why you can't wipe out the workspace as the "jenkins" user.

To solve that, there are several solutions. The simplest is to run chmod -R 777 node_modules after the npm install. If you do that, the final script will be:

DB_NAME="$BUILD_TAG-db"
docker run -d --name $DB_NAME postgres
sleep 1
APP_DIR="/usr/src/app"
docker run -v ~/.ssh:/root/.ssh -v $WORKSPACE:$APP_DIR -w $APP_DIR --link $DB_NAME:database node:0.10.28 bash -c 'chmod 700 /root/.ssh/id_rsa && npm install && chmod -R 777 node_modules && npm test'

The second, more complicated one is to build a custom Node image that has a "jenkins" user. You will then be able to run the command as that user (--user jenkins). You can also bake the SSH key directly into the image (a Dockerfile sketch follows the script). The final script will be:

DB_NAME="$BUILD_TAG-db"
docker run -d --name $DB_NAME postgres
sleep 1
APP_DIR="/usr/src/app"
docker run --user jenkins -v $WORKSPACE:$APP_DIR -w $APP_DIR --link $DB_NAME:database custom-node:0.10.28 bash -c 'npm install && npm test'
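
A minimal Dockerfile sketch for such a custom-node:0.10.28 image; the UID 1000 is an assumption, match it to the host's "jenkins" user so files written to the mounted workspace stay removable by Jenkins:

FROM node:0.10.28
# Create a "jenkins" user whose UID matches the Jenkins user on the host
# (1000 is an assumption, check the real value with `id -u jenkins`).
RUN useradd --uid 1000 --create-home jenkins

Build it once on the Jenkins server with docker build -t custom-node:0.10.28 . in the directory containing the Dockerfile.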

Remove containers

By default, your containers will keep running after the job is complete. Jenkins usually stops all processes at the end of a job thanks to its Process Tree Killer, but that doesn't work here because the containers run in detached mode.

To ensure that the containers are removed even if the tests fail, you must use the Post Build Task plugin.

The only thing left to do is to remove the containers at the end of the build.

Remove Docker container at the end of the build
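
As a sketch, the post-build task shell can simply stop and remove the database container by its $BUILD_TAG-based name (|| true keeps the task from failing when the container is already gone):

docker stop "$BUILD_TAG-db" || true
docker rm -v "$BUILD_TAG-db" || true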
