Cloud-agnostic continuous quality assurance

Everybody wants clean code. At some point in our careers, we have all learned that bad code is a monster eating our project deadlines. The question this article tries to answer is whether it is possible to set up a development process that, by design, prevents our source code from rotting.

​1.​ Introduction

Let’s just get one thing straight. There is always a point in a project where someone has to decide to do things quick and dirty and then, later on, pay the price. That is okay from a tactical point of view. But it is possible to set up tools that let us keep some code quality standards even in cases of quick and dirty code writing.

One more thing I would like to point out: every enthusiastic team starting a new project wants to make it clean and successful. But as the project kicks off, the quality assurance environment often never gets set up, whether because of a lack of time or a lack of knowledge in the domain. From the first line of code committed, quality metrics and checks should be part of the development process.

Let us define a toolset that will help us get a quick start. We will run all our environments as Docker containers; the community has already done the hard work of preparing pre-baked components we will use. Also, since using Docker is becoming mainstream, I will use this opportunity to share a thing or two about how to use it.

We will be using the following tools throughout the article:

  • Gerrit
  • Jenkins
  • SonarQube

​2.​ Source code repository

There isn’t really any use case where one would not want their code stored in a source code repository. Even when developing alone, one benefits from having the source code versioned. The reasoning, from my perspective, is simple: I consider my development machine volatile, and to avoid the classic “the dog ate my homework” syndrome, it is preferable to push code to some remote location, either as a backup or as primary storage. The most commonly used technologies for code versioning are SVN, Git and Mercurial. We are going to focus on Git. Git is widely adopted, feature rich and scales very well for big teams.

The official Git implementation offers the possibility to start a central Git server. This is nice as it helps for a quick start; the downside is that the default web UI is limited. Luckily, there are plenty of web-centric solutions that make managing Git repositories a breeze. Tools I have had the opportunity to play with are GitBlit and BitBucket. Since we are mainly interested in offering a review process, GitBlit will not support us there. On the other hand, BitBucket is an all-in-one solution, but neither free nor open source. To contribute some marketing back to the Open Source world, we are going to focus on Gerrit here. Gerrit integrates source code management with code reviews. But not to get too far ahead, let us focus on source code repositories first.

Let us first analyze how version control can contribute to code quality. Version control saves the entire history of a project: every change made to every file, who contributed the change and when, together with a comment. Steer your mind away from a blaming culture and from finger-pointing at whoever contributed the nastiness you stumble upon; a positive culture will contribute far more. For example, when you identify bad code and know how to fix it, fix it and teach the author how to do better in the future.

​2.1.​ Gerrit setup

Let’s get practical, assuming you have Docker up and running on your machine. The Docker installation instructions on the official site are more than enough to get you started. Just a couple of hints:

  • On Windows versions older than Windows 10 (7, 8, 8.1), Docker is not called Docker for Windows but Docker Toolbox, and it runs inside a VM
  • Docker does not run natively on macOS either; it also runs inside a VM
  • Set up Git Bash or a similar Bash implementation on Windows

I’m lucky enough to be running a Linux distribution on my workstation, so Docker runs natively and Bash is already present by default. To test whether the installation is working, a simple smoke test such as Docker’s hello-world image will do. Issue the following command:
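
docker run hello-world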

The expected output begins with something like:
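
Hello from Docker!
This message shows that your installation appears to be working correctly.
…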

As mentioned before, we will start Gerrit as a Docker container. We will be using Docker Compose to help us with the setup; Docker Compose lets us define our Docker setup declaratively. The first version of our setup can be fetched from the article’s companion repository.

The project contains a docker-compose.yaml, an httpd.conf file containing the Apache configuration and a passwords file defining 3 users:

  • admin/admin
  • user1/user1
  • user2/user2
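
The exact contents depend on that project, but a minimal sketch of such a docker-compose.yaml, assuming the official gerritcodereview/gerrit and httpd images, could look like this:

version: '3'

services:
  gerrit:
    image: gerritcodereview/gerrit   # official Gerrit image; listens on port 8080 inside the compose network
    container_name: gerrit           # referenced later by 'docker logs -f gerrit'

  apache:
    image: httpd:2.4                 # fronts Gerrit with HTTP basic authentication
    ports:
      - "8080:80"                    # the web UI we will open at http://127.0.0.1:8080/
    volumes:
      - ./httpd.conf:/usr/local/apache2/conf/httpd.conf
      - ./passwords:/usr/local/apache2/conf/passwords
    depends_on:
      - gerrit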

We can start the code management pipeline by running:
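
docker-compose up -d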

Docker Compose will use the container definitions from the docker-compose file and start the entire cluster, printing a status line for each container as it is created (after some download lines on the first run).

Each service runs in its own container. To see the logs of the Gerrit server (assuming the container is named gerrit, as in the sketch above), run:
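
docker logs -f gerrit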

If we examine the logs, we should see a line similar to the following, indicating that Gerrit is ready:
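
Gerrit Code Review <version> ready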

Gerrit can now be accessed: open http://127.0.0.1:8080/ in your browser. The browser will prompt you for a username and password; here we will use admin/admin as the credentials.

The first time you open Gerrit, you will land on the intro page. Pay attention to the information on the screen and click “Skip intro” to continue. You are now logged in as administrator. By default, Gerrit runs in development mode, which covers the requirements of this article. There are plenty of tutorials on how to set up a production-ready Docker/Gerrit installation; if there is interest, I will prepare a follow-up article.

​2.2.​ Creating a project

Let’s create our first project. Click on “BROWSE” in the top menu and choose “Repositories”. The “Create new” button will appear on the top right side. After clicking the “Create new” button, the following popup will appear:

I will name the project “hello-world” and click the “Create” button. Gerrit will automatically redirect you to a page with information on how to clone the project and configure additional options. Let’s keep the defaults and clone the project using the command shown on that page; with our setup it should look like this:
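
git clone http://127.0.0.1:8080/hello-world
cd hello-world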

Inside the directory, we will create a new file called UserRepository.java in the src/main/java directory. The exact contents are not important for this example; any small class will do, for instance:
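
mkdir -p src/main/java
cat > src/main/java/UserRepository.java <<'EOF'
public class UserRepository {
    // Placeholder body; the article's original class contents are not shown
    public String findUser(String name) {
        return name;
    }
}
EOF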

Now we can commit the file and push it to the code repository:
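
git add src/main/java/UserRepository.java
git commit -m "Add UserRepository"   # the commit message is illustrative
git push origin master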

You will be prompted for a username; supply admin. Then go to the Gerrit UI and click the icon in the top right corner, which will bring up the following dropdown menu:

Click on the “Settings” option, scroll down to the HTTP credentials section and click the “Generate new password” button. Note that using HTTP credentials is not the recommended method of authenticating against a Git repository; for a production setup, please use SSH keys.

Click “GENERATE NEW PASSWORD” and use the generated password to continue the push operation. Git will confirm in your console that the push was successful.

Let’s make another change in the file (in the original example, line 12 was updated) and repeat the process.

The following commands will commit and push the code:
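
git commit -am "Update UserRepository"   # again, the message is illustrative
git push origin master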

​2.3.​ Source control example

To point out the importance of source control systems, we will use an example. Let’s imagine we have a bug, we have traced it down to our newly updated class, and we are not sure who changed what and why. We can easily see changes to files with the git annotate and git blame commands. Better yet, if we use an IDE (Eclipse in this example), the information becomes even more readable. After importing the Git project in Eclipse, we open UserRepository.java, right-click in the editor to get the context menu and, in the “Team” menu, choose “Show Revision Information”.

Every line gives us more insights into the history and changes done to that line.
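
The command-line equivalent looks like this:

git blame src/main/java/UserRepository.java   # annotates every line with commit, author and date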

This technique is extremely helpful because every time an experienced developer opens a class and spots problematic code, he or she can directly teach the author how to approach the problem in the future. Additionally, we can see which code change could have introduced a bug.

One hint! To close the revision information, right-click on the line number column and, under the “Revisions” menu, choose the “Hide Revision Information” option:

This is just one tiny example of how the source code repository can be used to improve code quality. In the following chapter, we are going to address code review from the perspective of maintaining code quality.

Recommendation: Always use a source code repository, for a project of any type and size.

​3.​ Code review

Using source code repositories adds huge benefits to any project. It also provides a supporting infrastructure for maintaining code quality, as shown before. The second process through which code quality can be controlled is code review. Code reviews can help prevent bugs by having a second pair of eyes validate the code, maintain coding standards (like formatting), etc.

Almost any mature source code server implements code review via mechanisms like pull requests. Since we are focusing on open source solutions, we are going to dive deeper into the pull request mechanism offered by Gerrit.

With our Apache setup, we have created two additional users. We can log out as admin and log in as user1 to continue. The recommended way to test the new user is to use a private browser window.

A popup screen appears after login. We can just click the “Close” button on that screen and continue.

We repeat the same steps for user2. This is required because users in Gerrit are created only after the first login. Please log out as user user2 and log in as admin.

All permissions in Gerrit are configurable via groups, so we will create a group for the hello-world project owner. We go to “Browse” -> “Groups” and click the “CREATE NEW” button in the top right corner. We name the group “hello-world-owner”. We can then add user1 as a member of the group by entering the name into the text field and clicking the “ADD” button.

Now we go back to “Browse” -> “Repositories” -> “hello-world” and click “Access” on the left side.

We then click the “Edit” button and the “Add reference” link. We choose the “Owner” option in the “Add permission …” dropdown menu and, as the group name, we type “hello-world-owner”. We also add the permissions “Submit” and “Label Code-Review” to the same group.

Clicking the “Save changes” button gives the hello-world-owner group owner permission on the hello-world project and allows users in this group to do code reviews.

​3.1.​ Pull request

After preparing the configuration, we can make our first pull request. Please log in as the user2 user. First we set up the project for Gerrit using the following commands: we clone as before, then install Gerrit’s commit-msg hook, which appends the Change-Id footer Gerrit uses to track reviews.
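
git clone http://127.0.0.1:8080/hello-world
cd hello-world
# fetch Gerrit's commit-msg hook; add credentials (e.g. -u user2:<HTTP password>) if prompted
curl -Lo .git/hooks/commit-msg http://127.0.0.1:8080/tools/hooks/commit-msg
chmod +x .git/hooks/commit-msg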

Now we can open our project and make another small change (in the original example, line 21 was updated).

We have to push the change, this time to Gerrit’s special review ref instead of directly to master. We can do that using the following commands:
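
git commit -am "First pull request"
git push origin HEAD:refs/for/master   # refs/for/master opens a review instead of updating the branch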

If we open the web UI as the user2 user, we can see the review request there:

We can open the “First pull request”. On the left side, we can add user1 as a reviewer by clicking the “ADD REVIEWER” link.

Now, to see and approve the changes, we log in as the user1 user. We can see the pull request in the list. If we open it, we can see all the changes done to the file:

When we are satisfied with the changes, we can accept them by giving the change a “Code-Review +2” vote and then clicking “SUBMIT”. By submitting, we have merged the pull request into the master branch.

​3.2.​ Conclusion

With this basic example, we have shown the core setup of pull requests in Gerrit. In the following chapters, we are going to explore some integrations that can be done with other tools. Pull requests are very powerful but also have downsides. Approvals are usually made by more experienced developers. Their time is precious and too many requests can disrupt their work. To mitigate this, pull requests should be small, the smaller the better. Also, it is important that commit messages are descriptive and offer more context to a reviewer. Additionally, a very important thing is a cultural change. Reviews must not be something people are afraid of, they must be opportunities to learn and improve.

​4.​ Jenkins

Remember the first chapter and the explanation of how important it is to use version control even on the smallest projects? Well, I would dare to say that continuous integration comes right after that: the most precious addition one could, and should, have on a project. A bit of a story for the beginning.

I had the opportunity to work on a big project during the first years of my professional career. There we had a Java 6 application with a relatively huge code base split over 8-10 Eclipse projects. What was very interesting was how generously Eclipse connected those projects together: Eclipse would detect changes, recompile and redeploy automatically. Quite a nice setup at first sight, but there was a monster hiding behind it. Every deployment to any environment required a developer to export a WAR file from within the IDE and manually copy it to the target environment. This could be quite a headache, because sometimes people accidentally deployed their local changes to production. We learned quite fast that there are tools that can help us escape this hell we were in. The problem was, we had libraries lying around the projects without versions and without proper names. It took quite an effort to find the proper setup to produce matching WAR files.

Tools like Maven give people headaches. But it turns out that most of the problems come from the infrastructure surrounding developers (network proxies, firewalls, antiviruses). It requires dedication and lots of fighting to get applications like Nexus or Apache Archiva installed with normal internet access.

Again a recommendation: whatever difficulties you have setting up the infrastructure, always use a dependency management tool on your project. It will pay off very fast in any project.

When we use dependency management, we quickly find one big hidden gem: our builds can now run outside of the IDE. Actually, they can run anywhere. That means we can create tools that build our code automatically! Luckily, these tools are already there. Say hello to Jenkins.

Jenkins is the most famous open source continuous integration platform. It offers integration with many source control repositories and build tools. Jenkins will help us build our projects using our dependency management tool. But not only that, Jenkins will help us deploy the code to different environments and do a lot more.

Now we come to something called a deployment pipeline. We need some way to define the chain of events that converts our committed code into value for customers. Since version 2.0, Jenkins has built-in support for pipelines, which allow the pipeline configuration to be stored in the source code repository.

To be more practical, let’s add a Jenkinsfile to our hello-world project. The Jenkinsfile is expected to be at the root of the repository and is used by Jenkins to create and execute the pipeline. We will put the following Jenkinsfile in the root of our hello-world project, then commit and push the file as described in previous chapters.
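
A minimal declarative pipeline is enough here. The tool names maven and jdk below are assumptions; they must match the names configured later in Jenkins’ “Global tool configuration”:

pipeline {
    agent any
    tools {
        maven 'maven'   // must match the Maven installation name in Global tool configuration
        jdk 'jdk'       // must match the JDK installation name
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn -B clean package'   // build the project with Maven in batch mode
            }
        }
    }
}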

We also need a build tool for our project. An easy option is to use Maven. We just add the following content to a pom.xml file in the root of the project and push it together with the Jenkinsfile.
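
A bare-bones pom.xml is enough for Maven to compile our single class (the groupId below is a placeholder):

<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>       <!-- placeholder groupId -->
    <artifactId>hello-world</artifactId>
    <version>1.0-SNAPSHOT</version>
</project>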

Additionally, we update our docker-compose.yaml file with a Jenkins container definition, as follows:
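
A sketch of the addition, assuming the official jenkins/jenkins image (the container name matches the docker logs command used below):

  jenkins:
    image: jenkins/jenkins:lts   # official Jenkins LTS image; the article's exact image may differ
    container_name: jenkins      # referenced by 'docker logs -f jenkins' below
    ports:
      - "8081:8080"              # Jenkins UI at http://127.0.0.1:8081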

After running the ‘docker-compose up -d’ command, we can access Jenkins on http://127.0.0.1:8081. Jenkins will prompt us with the initial setup. By running

docker logs -f jenkins

you can acquire the initial administrator password required to complete the setup.

Enter the password from the console and click “Continue”. Then click the “Install suggested plugins” button:

We have to wait for the process to finish; Jenkins is then ready. Now we need to add our project to Jenkins. We will use the admin user to complete the process, so you will be prompted to enter the user data:

Click “Submit” and “Start using Jenkins”. Now we need to configure the tools required for our build: Maven and a JDK. Go to the Jenkins homepage and choose “Manage Jenkins” on the left side of the screen, then pick “Global tool configuration”. Configure a JDK and a Maven installation; the names you give them must match the ones referenced in the tools section of the Jenkinsfile:

Save the changes and go back to the homepage by clicking the Jenkins icon in the top left corner. You can immediately create a new job by clicking the “create new jobs” link:

Enter “hello-world” as the job name, choose “Multibranch Pipeline” as the type and click “OK”. Click “Add source”:

Add the URL of our Gerrit Git repository: http://gerrit:8080/hello-world. Under credentials, click “Add” and enter admin as the username with the HTTP password generated in Gerrit’s “Settings” for the admin user. Finish the setup by clicking the “Save” button. If you open the Jenkins homepage and go to the “master” branch of the “hello-world” project, you will be able to see your build results:

This was the basic Jenkins setup required for continuous integration using Maven. Now we can build our code outside of our IDE. And this is just scraping the surface: we can easily build our code automatically on every commit, and we can even configure deployment of the produced artifacts. There is also the possibility to automatically approve Gerrit pull requests only if the Jenkins build is successful. If there is interest in this topic, please let me know and I will do a follow-up article.

Since we are talking about code quality, it is hard to ignore tools that do code analysis, like SonarQube or TeamScale. We have learned so far how easy it is to build our code using Jenkins. In the following chapter, we will see how easy it is, now that we have Jenkins and Maven, to include SonarQube in our pipeline.

​5.​ SonarQube

Up to now, we have seen tools that support our effort towards clean code. Now we are going in a different direction: we will look at tools that provide metrics about code quality. The tool analyzed here is SonarQube, an open source project. It has a predefined set of rules that are run against the source code to provide measurements of possible issues, and even estimations in the form of technical debt.

SonarQube also integrates with IDEs through the SonarLint project. SonarLint provides real-time information about possible issues even before the code is committed to the source code repository.

How does this help us write better code? On one hand, SonarQube provides information to product managers about the current state of the project, so that actionable tasks can be created to mitigate possible issues. On the other hand, developers get insights into their code and can set targets and thresholds for their project.

We are going to look at a simple integration setup with Jenkins using Maven. The first step is to add SonarQube as a Docker container to our docker-compose.yaml file, along the following lines (assuming the official sonarqube image):
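
  sonarqube:
    image: sonarqube             # official SonarQube image
    container_name: sonarqube    # the service name is referenced from the Jenkinsfile below
    ports:
      - "9000:9000"              # SonarQube UI at http://127.0.0.1:9000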

After running ‘docker-compose up -d’, SonarQube will be available on http://127.0.0.1:9000. The initial credentials are admin/admin.

SonarQube can be integrated in various ways. A simple way to integrate it is as a Maven plugin. We need to add the configuration to our Jenkinsfile in order to run SonarQube as part of the build pipeline. Add the following lines to the Jenkinsfile after the “Build” stage:
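
A sketch of such a stage, using the SonarQube Maven plugin; the host URL assumes the compose service is named sonarqube:

        stage('SonarQube analysis') {
            steps {
                // run the sonar-maven-plugin against the SonarQube server
                sh 'mvn sonar:sonar -Dsonar.host.url=http://sonarqube:9000'
            }
        }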

Commit and push the changes to the master branch as shown before. We can trigger a build in Jenkins by clicking the “Build Now” button in our job.

After a successful build, we will see our new step in Jenkins and the results in SonarQube. In SonarQube, open the Projects page:

The newly scanned project is available, and we can now see more details about our code quality. In our current implementation there are 3 code smells:

As shown, we get a lot of information about our project. Other than the ones shown, SonarQube can also gather data about code coverage and can be integrated into the code review process. One recommendation I can provide is to integrate SonarQube on day one of the project and make it mandatory to comply with its rules. This is the easiest and sometimes the only way to keep SonarQube less red and the code more manageable.

​6.​ Bringing legacy systems onto the right track

So far, our recommendation was to start using all the tools in the first steps of the project! Surely, it would be ideal to start the right way immediately when the project begins, but it’s not always like that! We often drag along old legacy systems without proper source control, quality checks or continuous integration processes in place. For those projects, it’s never too late to bring them onto the right track; better late than never! Ideally, we would cover them with some tests first, for easier refactoring and modernizing. When tests are in place, there are also tools that can help with modernization. Unfortunately, there is no good open source solution. We can make use of IntelliJ’s capabilities, or use a cheaper option with the same or an even bigger set of capabilities, like the jSparrow Eclipse plugin, which is also more convenient when using Eclipse. With more modern code, we can calmly continue setting up the modern continuous integration process and the delivery and deployment pipeline, and proceed with development the right way.

​7.​ Recapitulation

This article described an easy environment setup to help and guide you in producing higher quality, easily readable and maintainable code. So let’s recapitulate the tools that were used and their benefits!

There is no use case or excuse for not storing your code in a source code repository! Version control, like Git, helps you by saving the entire history of the project, with all changes being traceable.

Code review can help prevent bugs by having a second pair of eyes validate the code, maintain coding standards (like formatting), etc. A person reviewing the code has a completely different perspective and sometimes even more knowledge; make use of it to learn and improve on a daily basis. But remember to keep it small: the smaller the pull request, the better. And keep in mind to use it in a positive way, in a blameless environment!

Continuous integration comes right after source control as the most precious addition one could and should have on a project. Save your time and improve quality by automating everything you can! With Jenkins you can schedule automated builds with automated test executions and quality checks! You can also set up automatic rejection of pull requests on build failures.

Last, but definitely not least, set yourself on the way to quality code by using SonarQube and automatic analysis of your code. With SonarLint integrated into your IDE, you can check every line you write and produce nicer, more readable, quality code!

Thank you for reading!