Open Source Load Testing Tools
September 29, 2018

Choosing Your Open Source Load Testing Tools


Is your application, server, or service delivering the speed your users need? How do you know? Are you 100% certain that your latest feature hasn't triggered a performance degradation or memory leak? There's only one way to verify: by regularly checking the performance of your app.

But which tool should you use for this? In this blog post, we'll review the pros and cons of the leading open-source load testing tools.


What Are the Best Open Source Load Testing Tools?

The best open source load testing tools are:

  1. The Grinder
  2. Gatling
  3. Tsung
  4. JMeter
  5. Locust

We'll cover the main features of these five tools, show a simple load-test scenario, and display sample reports. At the end, you'll find a comparison matrix to help you decide which tool is best for your project.

As a short note: if you are looking for a way to automate these open source tools, BlazeMeter created Taurus, our own open source test automation tool that extends and abstracts most of the tools above (as well as Selenium) and helps overcome various challenges. Taurus provides a simple way to create, run, and analyze performance tests.
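For instance, a Taurus YAML configuration describing a simple HTTP GET scenario (similar to the one used for the comparison below) might look like the following sketch. The executor, host address, and load figures are illustrative assumptions; adjust them for your own environment:

execution:
- executor: jmeter        # could also be grinder, gatling, locust, or tsung
  concurrency: 20         # 20 concurrent virtual users
  iterations: 100000      # iteration limit; an assumption matching the scenario below
  scenario: simple-get

scenarios:
  simple-get:
    requests:
    - url: http://192.168.1.170:8080/   # assumed application under test
      method: GET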

The Test Scenarios and Infrastructure

For our comparison, we will use a simple scenario: an HTTP GET request from 20 threads with 100,000 iterations. Each tool will send requests as fast as it can.

The server side (application under test):

  • CPU: 4x Xeon L5520 @ 2.27 GHz

  • RAM: 8GB

  • OS: Microsoft Windows Server 2008 R2 x64

  • Application Server: IIS 7.5.7600.16385

The client side (load generator):

  • CPU: 4x Xeon L5520 @ 2.27 GHz

  • RAM: 4GB

  • OS: Ubuntu Server 12.04 64-bit

1. The Grinder

The Grinder is a free Java-based load-testing framework available under a BSD-style open-source license. It was developed by Paco Gomez and is maintained by Philip Aston. Over the years, the community has also contributed many improvements, fixes, and translations. The Grinder consists of:

  • The Grinder Console - This GUI application controls various Grinder agents and monitors results in real time. The console can be used as a basic interactive development environment (IDE) for editing or developing test suites.
  • Grinder Agents - Each of these headless load generators can run a number of worker processes to create the load

Key Features of The Grinder:

  1. TCP proxy to record network activity into the Grinder test script
  2. Distributed testing that scales with an increasing number of agent instances
  3. The power of Python or Clojure, combined with any Java API, for test script creation or modification (see the script sketch after this list)
  4. Flexible parameterization, which includes creating test data on the fly and the ability to use external data sources like files and databases
  5. Post-processing and assertion with full access to test results for correlation and content verification
  6. Support of multiple protocols

The Grinder Console Running a Sample Test

Sample test from The Grinder open-source load testing tool

Grinder Test Results

Grinder test results

2. Gatling

The Gatling Project is another free and open source load testing tool, primarily developed and maintained by Stephane Landelle. Gatling has a basic GUI that is limited to a test recorder. However, tests can be developed in an easily readable/writable domain-specific language (DSL).

Key Features of Gatling:

  1. HTTP Recorder

  2. An expressive, self-explanatory DSL for test development (see the sketch after this list)

  3. Scala-based

  4. Production of higher load using an asynchronous non-blocking approach

  5. Full support of HTTP(S) protocols; it can also be used for JDBC and JMS load testing

  6. Multiple input sources for data-driven tests

  7. Powerful and flexible validation and assertions system

  8. Comprehensive informative load reports

The Gatling Recorder Window:

Gatling Recorder Window

Example Gatling Report for a Load Scenario

Example Gatling Open Source Load Testing tool report

3. Tsung

Tsung (previously known as IDX-Tsunami) is the only non-Java-based open-source performance-testing tool in this review. Tsung relies on Erlang, so you'll need to have it installed (for Debian/Ubuntu, it's as simple as "apt-get install erlang").

Tsung was launched in 2001 by Nicolas Niclausse, who originally implemented a distributed load-testing solution for Jabber (XMPP). Several months later, support for more protocols was added and, in 2003, Tsung was able to perform HTTP protocol load testing. Today, it's a fully functional performance-testing solution with support for modern protocols such as WebSocket, as well as for authentication systems and databases.

Key Features of Tsung:

  • Inherently distributed design
  • The underlying Erlang architecture, built on lightweight processes, can simulate thousands of virtual users on mid-range developer machines
  • Support of multiple protocols
  • A test recorder that supports HTTP and Postgres
  • Operating system metrics for both the load generator and the application under test can be collected via several protocols
  • Dynamic scenarios and mixed behaviors. Flexible load scenarios let you define and combine any number of load patterns in a single test
  • Post processing and correlation
  • External data sources for data driven testing
  • Embedded easily-readable load reports that can be collected and visualized during load

Tsung doesn't provide a GUI for test development or execution, so you'll have to live with shell scripts, which are:

  • tsung-recorder, a bash script that launches a recorder utility capable of capturing HTTP and Postgres requests and creating a Tsung config file from them (a minimal config sketch follows this list)
  • tsung, the main bash control script to start/stop/debug a test and view its status
  • tsung_stats.pl, a Perl script to generate HTML statistical and graphical reports. It requires gnuplot and the Perl Template library. For Debian/Ubuntu, the commands are:
    • apt-get install gnuplot
    • apt-get install libtemplate-perl
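For reference, a Tsung scenario is defined in an XML configuration file (by default ~/.tsung/tsung.xml). A minimal sketch for a plain HTTP GET scenario might look like the following; the DTD path, server address, and arrival rate are assumptions that depend on your installation and test design:

<?xml version="1.0"?>
<!DOCTYPE tsung SYSTEM "/usr/share/tsung/tsung-1.0.dtd">
<tsung loglevel="notice">
  <clients>
    <client host="localhost" use_controller_vm="true"/>
  </clients>
  <servers>
    <!-- assumed application under test -->
    <server host="192.168.1.170" port="8080" type="tcp"/>
  </servers>
  <load>
    <arrivalphase phase="1" duration="1" unit="minute">
      <users arrivalrate="20" unit="second"/>
    </arrivalphase>
  </load>
  <sessions>
    <session name="simple-get" probability="100" type="ts_http">
      <request>
        <http url="/" method="GET" version="1.1"/>
      </request>
    </session>
  </sessions>
</tsung>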


The main tsung script invocation produces the following output:

Tsung script invocation

Running the test:

Running a load test in Tsung

Querying the current test status:

Tsung load test status

Generating the statistics report with graphs can be done via the tsung_stats.pl script:

Generating Tsung statistics report

Open report.html with your favorite browser to get the load report. A sample statistical and graphical report for a demo scenario is provided below:

A Tsung Statistical Report

A Tsung Graphical Report

4. Apache JMeter

Apache JMeter™ is the only desktop application in this review. It has a user-friendly GUI, making test development and debugging much easier. The earliest version of JMeter available for download is dated March 9, 2001. Since then, JMeter has been widely adopted and is now a popular open-source alternative to proprietary solutions like Silk Performer and LoadRunner. 

JMeter has a modular structure, in which the core is extended by plugins. This means that all implemented protocols and features are plugins that have been developed by the Apache Software Foundation or online contributors.

Key Features of JMeter:

  1. Cross-platform. JMeter can run on any operating system with Java

  2. Scalable. When you need a higher load than a single machine can create, JMeter can execute in a distributed mode, in which one master JMeter machine controls a number of remote hosts (see the example command after this list).

  3. Multi-protocol support. The following protocols are all supported out-of-the-box: HTTP, SMTP, POP3, LDAP, JDBC, FTP, JMS, SOAP, TCP

  4. Multiple implementations of pre- and post-processors around samplers, which provide advanced setup, teardown, parameterization, and correlation capabilities

  5. Various assertions to define criteria

  6. Multiple built-in and external listeners to visualize and analyze performance test results

  7. Integration with major build and continuous integration systems, making JMeter performance tests part of the full software development life cycle
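As an illustration of the distributed mode mentioned above, a non-GUI test run that drives two remote load generators could be launched with a command along these lines (the test plan and host names are placeholders):

jmeter -n -t load_test.jmx -R host1,host2 -l results.jtl

Here -n runs JMeter without the GUI, -t points to the test plan, -R lists the remote hosts (each running the jmeter-server process), and -l writes the aggregated results to a JTL file.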

The JMeter Application With an Aggregated Report on the Load Scenario

5. Locust

Locust is a Python-based open source framework that enables writing performance scripts in pure Python. What makes this framework unique is that it was developed by developers, for developers. Locust mainly targets web applications and web-based services; however, if you are comfortable with Python scripting, you can test almost anything you want.

It is also worth mentioning that Locust simulates users in a completely different way: it is event-based and uses gevent coroutines as its backbone. This approach allows simulating thousands of users even on a regular laptop, and executing even very complex scenarios with many steps.

Locust Key Features:

  1. Cross-platform, because Python can be run on any OS
  2. High scalability on regular machines due to its event-based implementation
  3. Powerful assertion abilities, limited only by your own Python knowledge
  4. Nice web-based load monitoring
  5. Code-based scripts implementation that is handy to use with version control (Git, Helix Core)
  6. Scalability, because you can run Locust distributed with many agents
  7. The ability to test almost anything with the implementation of custom samplers based on pure Python code

Basic Locust test script example:

from locust import HttpLocust, TaskSet, task

class SimpleLocustTest(TaskSet):
    @task
    def get_something(self):
        self.client.get("/")

class LocustTests(HttpLocust):
    task_set = SimpleLocustTest

 

You can run the script by using this command:

 

locust -f locustfile.py --host=http://192.168.1.170:8080

Running a Locust test script

After the script execution, you will find detailed reporting at http://localhost:8089/:

 

Locust test report

Comparison of Open-Source Load Testing Tool Results

Let’s compare the load test results of these tools with the following metrics:

  1. Average Response Time (ms)

  2. Average Throughput (requests/second)

  3. Total Test Execution Time (minutes)

First, let’s look at the average response and total test execution times:

average response and total test execution times for open source load testing tools

comparing open source load testing tools average throughput

As shown in the graphs, Locust has the fastest response times and the highest average throughput, followed by JMeter, Tsung, and Gatling. The Grinder has the slowest response times and the lowest average throughput.

Open Source Load Testing Tool Feature Comparison 

And finally, here’s a comparison table of the key features offered by each testing tool:

 

| Feature | The Grinder | Gatling | Tsung | JMeter | Locust |
|---|---|---|---|---|---|
| OS | Any | Any | Linux/Unix | Any | Any |
| GUI | Console only | Recorder only | No | Full | No |
| Test Recorder | TCP (including HTTP) | HTTP | HTTP, Postgres | HTTP | No |
| Test Language | Python, Clojure | Scala | XML | XML | Python |
| Extension Language | Python, Clojure | Scala | Erlang | Java, Beanshell, JavaScript, Jexl | Python |
| Load Reports | Console | HTML | HTML | CSV, XML, embedded tables, graphs, plugins | HTML |
| Protocols | HTTP, SOAP, JDBC, POP3, SMTP, LDAP, JMS | HTTP, JDBC, JMS | HTTP, WebDAV, Postgres, MySQL, XMPP, WebSocket, AMQP, MQTT, LDAP | HTTP, FTP, JDBC, SOAP, LDAP, TCP, JMS, SMTP, POP3, IMAP | HTTP |
| Host monitoring | No | No | Yes | Yes, with the PerfMon plugin | No |
| Limitations | Python knowledge required for test development and editing; reports are very plain and brief; limited protocol support | Knowledge of the Scala-based DSL required; does not scale | Tested and supported only on Linux systems | Bundled reporting isn't easy to interpret | Python knowledge required for test development and editing |

 

BlazeMeter vs. JMeter? How These Tools Work Together 

While Apache JMeter is a strong and compelling way to perform load testing, we recommend supplementing it with BlazeMeter Load Testing Cloud, which lets you simulate up to one million users in a single developer-friendly, self-service platform. With BlazeMeter, you can test the performance of any mobile app, website, or API in under 10 minutes. Rather than comparing BlazeMeter vs. JMeter, here is why the combination of JMeter and BlazeMeter is an attractive choice for developers:

  • Simple Scalability – It's easy to create large-scale JMeter tests. You can run far larger loads far more easily with BlazeMeter than you could with an in-house lab.
  • Rapid-Start Deployment – BlazeMeter's recorder helps you get started with JMeter right away, and BlazeMeter also provides complete tutorials and tips.
  • Web-Based Interactive Reports – You can easily share results across distributed teams and overcome the limitations of JMeter's standalone UI.
  • Built-In Intelligence – The BlazeMeter Cloud provides on-demand geographic distribution of load generation, including built-in CDN-aware testing.

BlazeMeter load testing vs. JMeter open-source load testing

Start testing now! Try out BlazeMeter, which enhances JMeter's features. Or, to run Locust, Gatling, The Grinder, and Tsung automatically and more easily, try out Taurus.

This blog was originally published on September 29, 2018, and has since been updated for accuracy and relevance.
