Run BlazeMeter Performance Tests in an XL Release CD Pipeline
As a DFL DevOps Director at Nedbank, one of my main focuses is integrating and automating the tool chain, from Continuous Integration to Continuous Deployment. The goal is to eliminate manual tasks and promote a hands-off cycle, to achieve economies of scale, accelerate time to market and improve quality. This blog post will explain how we added BlazeMeter performance testing to our automated XL Release CD cycle to achieve continuous testing, and how you can do it yourself.
Automating Our Continuous Deployment Pipeline
At Nedbank, the CD tools that we use are by XebiaLabs, a DevOps tool chain vendor. We use XL Release for orchestrating the release pipeline, from development to production. We also use XL Deploy, an automation tool for deploying artifacts to systems like host containers, virtual machines and the cloud. CI is handled through a different tool chain, which includes Jenkins, BitBucket and Jira. The CI pipeline is tightly integrated with our CD pipeline.
We started ramping up our automation process by identifying the manual steps in our pipeline, with the goal of automating them. At that point, we looked at BlazeMeter to shift our performance testing left, so we could test earlier and more often.
However, even though XL Release supports plugins for orchestrating performance testing as part of the pipeline, there wasn't a community-developed plugin connecting BlazeMeter and XL Release. Thanks to the mature and well-defined BlazeMeter API layer, we took a stab at developing an integration plugin ourselves. By doing so, we were able to add performance testing automation to our CD pipeline. We contributed the XL Release BlazeMeter plugin to the community and you can find it here. You can skip ahead to the next section to see how to use it.
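To give a feel for what the plugin does under the hood, here is a minimal sketch of launching a BlazeMeter test through its REST API. This is an illustration, not the plugin's actual code: the `/tests/{id}/start` endpoint and Basic-auth key pair reflect BlazeMeter's v4 API as we understand it, and the function names are our own.

```python
import base64
import json
import urllib.request

BASE_URL = "https://a.blazemeter.com/api/v4"

def auth_header(key_id, key_secret):
    # BlazeMeter authenticates with HTTP Basic auth using an API key id/secret pair
    token = base64.b64encode(f"{key_id}:{key_secret}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

def start_test_url(test_id):
    # Endpoint that launches an already-configured test (assumed v4 path)
    return f"{BASE_URL}/tests/{test_id}/start"

def start_test(test_id, key_id, key_secret):
    # POST with no body starts the test; the response carries the run ("master") id
    req = urllib.request.Request(
        start_test_url(test_id),
        method="POST",
        headers=auth_header(key_id, key_secret),
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["result"]["id"]

if __name__ == "__main__":
    # Replace with your own test id and API key pair from the BlazeMeter UI
    master_id = start_test("1234567", "MY_KEY_ID", "MY_KEY_SECRET")
    print("started run:", master_id)
```

The plugin wraps this kind of call in an XL Release task, so the pipeline can start a test without anyone touching the BlazeMeter UI.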
Here is what our CD pipeline looks like in XL Release today:
As you can see, our CD Pipeline includes four steps:
1. Pre-Release: A housekeeping phase we run before deployment. The purpose of this step is to ensure all of our tasks are in the lifecycle. For example, we use this step to ensure pull requests are associated with Jira tasks.
2. Development: The first environment for deploying applications. We use Ansible to set up the environment from scratch every time, we integrate with Slack to get notifications on our channels, and we use XL Deploy for code and artifact deployment.
3. ETE: The end-to-end testing environment. This phase includes two BlazeMeter tasks. First, we take generated test data (from the pre-release phase), including authorization tokens and other test data, and upload it to BlazeMeter. The second task is the test itself, which in this case is a functional API test. In addition, this phase includes acceptance tests, Jira status updates and more.
4. QA: A replica of production, where we run a full-scale BlazeMeter performance test with thresholds that will fail the deployment pipeline if we don't meet certain performance criteria. Because we can do this more often (daily), we know exactly which commits are impacting performance, giving us faster feedback and the ability to remediate issues ASAP.
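The QA gate described above boils down to comparing aggregate test results against agreed limits and failing the pipeline on any breach. BlazeMeter supports failure criteria natively, but the gating logic can be sketched as follows; the metric names (`avg_response_ms`, `error_rate_pct`) and limits here are illustrative, not our actual criteria.

```python
def check_thresholds(summary, thresholds):
    """Return a list of breached criteria, given an aggregate report summary.

    summary    -- dict of metric name -> measured value
    thresholds -- dict of metric name -> maximum allowed value
    """
    failures = []
    for metric, limit in thresholds.items():
        value = summary.get(metric)
        if value is not None and value > limit:
            failures.append(f"{metric}={value} exceeds limit {limit}")
    return failures

# Example: one metric breaches its limit, so the pipeline would be failed
summary = {"avg_response_ms": 840, "error_rate_pct": 0.2}
thresholds = {"avg_response_ms": 500, "error_rate_pct": 1.0}
failures = check_thresholds(summary, thresholds)
if failures:
    print("FAIL:", "; ".join(failures))  # a real task would mark the release failed
```

Running this gate daily, as the QA phase does, is what ties a performance regression back to the handful of commits that landed since the last clean run.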
Our pipeline is fully automated and hands-off. Now let’s see how you can use it yourself.
How to Use the XL Release BlazeMeter Plugin
1. Download the XL Release BlazeMeter plugin (linked above).
2. Copy the plugin JAR file into the SERVER_HOME/plugins directory of XL Release.
3. Configure your BlazeMeter URL and API Key in the Shared Configuration.
4. Go to XL Release.
5. Add a Test Phase and choose the ‘BlazeMeter’ plugin from the dropdown menu.
6. Fill in your test’s details: API key, Test ID, Workspace ID, polling interval, etc. Set your polling interval according to the length of the test.
IMPORTANT: Don’t forget to configure your test in BlazeMeter.
7. When running your test, open the task to get a link to the test results in BlazeMeter:
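Between starting the test (step 6) and the results link (step 7), the plugin waits for the run to finish, checking at the polling interval you configured. A simplified sketch of that loop is below; the `"ENDED"` status value is an assumption about how BlazeMeter reports a completed run, and the function names are our own.

```python
import time

TERMINAL_STATUSES = {"ENDED"}

def is_finished(status):
    # BlazeMeter reports run state as a string; ENDED marks completion (assumed)
    return status in TERMINAL_STATUSES

def poll_until_done(get_status, interval_s=30, timeout_s=3600):
    """Call get_status() every interval_s seconds until the run ends.

    get_status -- callable returning the current run status string,
                  e.g. a wrapper around GET /masters/{id}/status (assumed endpoint)
    """
    waited = 0
    while waited <= timeout_s:
        status = get_status()
        if is_finished(status):
            return status
        time.sleep(interval_s)
        waited += interval_s
    raise TimeoutError("test run did not finish within the timeout")
```

This is why the polling interval in step 6 should match the length of the test: a short functional API test can poll every few seconds, while a long performance run only needs a check every minute or two.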
API Functional Test:
We use these reports for our own analysis, and also for showing the product team where we are.
As I mentioned, these tests run automatically a few times a day, every time we add new code. This keeps us agile: continuous testing is built into our Continuous Deployment and Continuous Integration process.