Dmitri Tikhanski is a Contributing Writer to the BlazeMeter blog.

Jun 10 2016

How to Use JMeter Assertions in Three Easy Steps

 

JMeter Assertions are undeniably valuable, providing the criteria you set to determine whether a sampler will be considered a “pass.” You can run them against a sample and its sub-samples to ensure returned values match expected results, and you can also apply them to JMeter Variables.

 

But are you using assertions efficiently? Do you know the pitfalls to avoid? This article takes you through three key steps for using assertions, including factors to consider when setting them, examples of commonly used assertions, and ways to view results.

 

1. Considerations Before Setting Assertions

 

The Cost of JMeter Assertions

 

All assertions come with a cost, in terms of CPU or memory consumption. However, some assertions carry a greater cost than others. According to the JMeter Performance and Tuning Tips guide, the Response Assertion and the Duration Assertion are typically lower-impact choices, whereas Compare Assertion and other XML-based ones like the XPath Assertion consume more CPU and memory.

 

The Scope of JMeter Assertions

 

You must also consider scope when setting assertions. An assertion can be applied to the main sample only, to sub-samples only, or to both. Some assertions, like the Response Assertion or the Size Assertion, can also be run against a JMeter Variable. Code-based assertions (such as Beanshell, BSF and JSR223) don’t have the GUI element that identifies scope, which means you must implement all assertion logic – including scope – manually.


Depending on where it is placed, an assertion can apply to all samplers on the same level (greater scope) or only to its parent sampler (lesser scope).


As mentioned in the post Using JMeter's Transaction Controller, a failed assertion causes the whole Transaction Controller to fail, so use care when applying assertions inside one.

 

Combining Assertions

 

You can add more than one assertion to a sampler, controller, thread group, or test plan. A failed assertion will cause all affected samples to fail, so caution is essential.

 

2. Commonly Used Assertions & Their Uses

 

Response Assertions

 

The most commonly used assertion is the Response Assertion, which checks whether a response text/body/code/message/header contains, matches, or equals a specified pattern.

 

The pattern can be either:

 

  1. a “string” for “Equals” or “Substring” clauses
  2. a “Perl5-style” Regular Expression for “Contains” or “Matches” clauses

 

Response Entities that can be checked include:

 

  1. Text Response - This checks the response body that can be displayed in a browser.
  2. Document (text) - This checks anything supported by Apache Tika (it assumes the presence of apache-tika.jar in the /lib folder of the JMeter installation), including PDF, Office, audio, and video formats. Be careful, because this can be memory-intensive under high loads.
  3. URL Sampled - This is used against the requested URL to ensure it matches expectations. For example, you may want to check that a redirect URL doesn’t contain an error somewhere in the path.
  4. Response Code - This checks that the response code is the expected one. For 4xx and 5xx response codes, make sure you have checked the “Ignore Status” box (see below for a full explanation).
  5. Response Message - This verifies that the response message appears as expected.
  6. Response Headers - This is used against the response headers to see if a specific HTTP header is present or absent.
  7. Ignore Status - Out of the box, JMeter considers all 4xx and 5xx responses to be failures. If your test case is negative and, for example, a 404 error is expected, check this box to suppress JMeter’s built-in status code check and substitute your own status code assertion.

Since JMeter 3.2, assertions can also be run against Request Headers.
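To illustrate the difference between these clauses, here is a plain-Java sketch (the class and method names are mine, not JMeter API): a “Contains” pattern only has to match somewhere inside the response, while a “Matches” pattern has to match the entire response.

```java
import java.util.regex.Pattern;

public class ResponseAssertionDemo {
    // "Contains" clause: the regular expression must match somewhere in the response
    static boolean contains(String response, String regex) {
        return Pattern.compile(regex).matcher(response).find();
    }

    // "Matches" clause: the regular expression must match the entire response
    static boolean matches(String response, String regex) {
        return Pattern.compile(regex).matcher(response).matches();
    }

    public static void main(String[] args) {
        String body = "Welcome to BlazeMeter";
        System.out.println(contains(body, "Blaze\\w+")); // true: found inside the body
        System.out.println(matches(body, "Blaze\\w+")); // false: does not match the whole body
    }
}
```

The same distinction explains why an overly loose “Matches” pattern often needs leading and trailing `.*` to pass.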

 

Duration Assertion

 

The Duration Assertion is very simple. Together with the Response Assertion, it covers 90 percent of use cases where assertions are required. The usage is straightforward: you specify a maximum duration in milliseconds, and if any request takes longer than that value, the sample is marked as failed.



 

Size Assertion

 

The Size Assertion checks the response length to see whether it’s equal to, not equal to, greater than, or less than the expected size in bytes. It can be applied to:

 

  1. Full response (body and headers)
  2. Response headers
  3. Response body
  4. Response code
  5. Response message

 

The easiest way to check the response size is through the View Results Tree Listener (discussed later in Viewing the Results of Your Assertions). Here’s an example of sample output:

 

 

 

For this sample, a Size Assertion with the “=” comparison type should expect:

  1. For the full response (body and headers) - 1591 bytes
  2. For the response body - 1270 bytes
  3. For the response headers - 321 bytes
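As a quick sanity check on those numbers, the full-response size is simply the headers plus the body. A trivial sketch (the class name is illustrative):

```java
public class SizeAssertionDemo {
    // "Full response" size is the response headers plus the response body
    static int fullResponseSize(int headerBytes, int bodyBytes) {
        return headerBytes + bodyBytes;
    }

    public static void main(String[] args) {
        System.out.println(fullResponseSize(321, 1270)); // 1591 bytes
    }
}
```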


 


XML Assertion

 

The XML Assertion checks that the returned response body is XML-compliant. Only the syntax is checked. Any external resources are neither downloaded nor validated. When there is invalid XML code, the reason for failure will be reported in an ‘Assertion Failure’ message.


For example, the XML test below has an unclosed root tag (<note>) on the last line:

 

<?xml version="1.0"?>
<note>
   <to>Tove</to>
   <from>Jani</from>
   <heading>Reminder</heading>
   <body>Don't forget me this weekend!</body>
/note>

 


So the XML Assertion fails all affected samplers and reports the reason in an assertion failure message.
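Under the hood this is a plain XML well-formedness check. Here is a minimal standalone sketch of the same idea using the JDK parser (the class and method names are mine, not JMeter’s):

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;

public class XmlAssertionDemo {
    // Returns true when the document parses, i.e. the XML syntax is valid
    static boolean isWellFormed(String xml) {
        try {
            DocumentBuilderFactory.newInstance().newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
            return true;
        } catch (Exception e) {
            return false; // a SAXParseException carries the line/column of the problem
        }
    }

    public static void main(String[] args) {
        System.out.println(isWellFormed("<note><to>Tove</to></note>")); // true
        System.out.println(isWellFormed("<note><to>Tove</to>/note>"));  // false: root tag never closed
    }
}
```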

 

Beanshell Assertion

 

The Beanshell Assertion allows you to perform additional checks on a sampler using Beanshell scripting. For more about JMeter API shorthands available in Beanshell scripts, consult How to use BeanShell: JMeter's favorite built-in component. In addition, the Beanshell Assertion offers the following variables:

 

  1. Failure - boolean (true|false). Set it to true to mark the sampler as failed.
  2. FailureMessage - String. A custom message displayed as the assertion failure message.
  3. ResponseData - byte[]. A byte array holding the response data.
  4. ResponseCode - String. The response code.
  5. ResponseMessage - String. The response message.
  6. ResponseHeaders - String. The response headers.
  7. SampleResult - org.apache.jmeter.samplers.SampleResult. A JMeter SampleResult class instance that holds the results of the preceding sampler(s). When there are multiple nested samplers (e.g. under a Transaction Controller or with “Retrieve All Embedded Resources” enabled), it contains the results of all nested requests.

 

For example, the following Beanshell Assertion code snippet will return an error if the word “blazemeter” does not appear in the URL path:

 

String path = SampleResult.getURL().getPath();
if (!path.contains("blazemeter")) {
   Failure = true;
   FailureMessage = "URL Path: didn't contain \"blazemeter\"" + System.getProperty("line.separator") + "URL Path detected: " + path;
}

 

 

MD5Hex Assertion

 

The MD5Hex Assertion checks the MD5 checksum of the actual response against an expected MD5 hash. Content of any length, whether a single character or a full HD video file, is represented as a 32-digit hexadecimal number. This is particularly useful for data-integrity checks on large responses: if you need to test file-download performance and verify the content of the downloaded files, you can avoid keeping megabytes of data in memory by asserting only on the MD5 hash of each response.

 

There are a number of online services and applications for calculating MD5 checksums:

 

  1. WinMD5Free - for Windows
  2. md5sum - for Linux and Unix
  3. md5 - for Mac OS X

 

You can also calculate an MD5 hex checksum from JMeter’s Beanshell scripting extension. Here’s an example of Beanshell code:

 

import org.apache.commons.codec.digest.DigestUtils; // necessary class import

String toMD5 = "blazemeter"; // source data (can be byte array, String or InputStream)
String md5Hex = DigestUtils.md5Hex(toMD5); // calculate MD5 checksum
log.info(md5Hex); // print MD5 checksum to jmeter.log

 

This code, called from a Beanshell-enabled test element, will produce the following line:

 

INFO  - jmeter.util.BeanShellTestElement: d21529e5b05b406d4c3c5235978f2a18

 

Here, d21529e5b05b406d4c3c5235978f2a18 is the MD5 hex for the “blazemeter” string. Note that the MD5Hex Assertion cannot be applied to an empty response; if an empty response occurs, the sample will fail.
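If commons-codec is not on the classpath, the same checksum can be computed with the JDK’s built-in MessageDigest. A standalone sketch (the class and method names are mine):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class Md5HexDemo {
    // Computes the 32-digit hexadecimal MD5 checksum of a string
    static String md5Hex(String input) {
        try {
            byte[] digest = MessageDigest.getInstance("MD5")
                    .digest(input.getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder();
            for (byte b : digest) {
                hex.append(String.format("%02x", b)); // two hex digits per byte
            }
            return hex.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException("MD5 not available", e);
        }
    }

    public static void main(String[] args) {
        // prints the 32-digit hexadecimal MD5 checksum of "blazemeter"
        System.out.println(md5Hex("blazemeter"));
    }
}
```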

 

 

HTML Assertion

 

The HTML Assertion checks that the response is a well-formed HTML/XHTML/XML document. It’s handy if your web application or site requires zero HTML validation errors. Most modern browsers render even invalid HTML pages, but search engine robots or third-party integrations may not be so tolerant.

 

The official documentation on the HTML Assertion is pretty comprehensive. When it comes to assertion results visualization, reports will only display a limited number of warnings and errors. To get a full report with the exact position and description of each problem, provide a file name in the “Write JTidy report to file” input, which produces output like the example below:

 

line 6 column 5 - Warning: meta lacks "content" attribute

InputStream: Doctype given is ""

InputStream: Document content looks like HTML 4.01

1 warning, no errors were found!

 

 

XPath Assertion

 

The XPath Assertion evaluates an XPath expression against a web server response to ensure a specified entity is present or an element attribute value matches expectations. For more information on how to use XPath for correlation, consult Using the XPath Extractor in JMeter. The most appropriate use case for the XPath Assertion is testing SOAP web service XML responses. The same nuances that apply to the XPath Extractor apply to the XPath Assertion:

 

  1. If the response is not XML/XHTML compliant, it’s required to check “Use Tidy.”
  2. If the response uses external DTDs, the relevant box should be checked.
  3. If namespaces are being used, they must be provided via “xpath.namespace.config” property.

 

To give an idea as to how it should work, here are a few assertions on the example.com domain:


//title/text()='Example Domain' - checks that the <title> tag text is “Example Domain” - will PASS

count(//a)=2 - checks that there are 2 links (<a>) on the page - will FAIL

//meta/@charset='UTF-8' - checks that the <meta> tag “charset” attribute equals UTF-8 - will FAIL
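You can try the same kind of expressions outside JMeter with the JDK’s built-in XPath engine. In this sketch the markup is a simplified, well-formed stand-in for the real example.com page, and the class and method names are mine:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;

public class XPathAssertionDemo {
    // Evaluates an XPath expression against an XML string;
    // boolean expressions come back as the strings "true" or "false"
    static String eval(String expression, String xml) {
        try {
            Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
            return XPathFactory.newInstance().newXPath().evaluate(expression, doc);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        String page = "<html><head><title>Example Domain</title></head>"
                + "<body><a href=\"https://www.iana.org/domains/example\">More information</a></body></html>";
        System.out.println(eval("//title/text()='Example Domain'", page)); // true
        System.out.println(eval("count(//a)=2", page)); // false: this snippet has only one link
    }
}
```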

 

XML Schema Assertion


The XML Schema Assertion checks whether the XML response matches the specific XSD schema provided. When running tests with BlazeMeter, just provide a reference schema file along with the test script.
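Outside JMeter, the same kind of check can be sketched with the JDK’s javax.xml.validation API. The tiny schema and documents below are illustrative, as are the class and method names:

```java
import java.io.StringReader;
import javax.xml.XMLConstants;
import javax.xml.transform.stream.StreamSource;
import javax.xml.validation.SchemaFactory;

public class XmlSchemaAssertionDemo {
    // Returns true when the XML document validates against the XSD schema
    static boolean validates(String xml, String xsd) {
        try {
            SchemaFactory.newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI)
                    .newSchema(new StreamSource(new StringReader(xsd)))
                    .newValidator()
                    .validate(new StreamSource(new StringReader(xml)));
            return true;
        } catch (Exception e) {
            return false; // validation problems are reported as SAXExceptions
        }
    }

    public static void main(String[] args) {
        String xsd = "<xs:schema xmlns:xs=\"http://www.w3.org/2001/XMLSchema\">"
                + "<xs:element name=\"note\" type=\"xs:string\"/></xs:schema>";
        System.out.println(validates("<note>Reminder</note>", xsd)); // true
        System.out.println(validates("<memo>Reminder</memo>", xsd)); // false: element not in schema
    }
}
```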

 

BSF and JSR223 Assertions

 

The BSF Assertion and JSR223 Assertion use cases are the same as the Beanshell Assertion’s. The only difference is performance: JSR223 combined with the Groovy language gives almost the same performance as native Java code, whereas Beanshell and other languages have performance constraints. If your scripting assertion code is “heavy” enough, consider using the JSR223 Assertion with Groovy. For more details, see the Beanshell vs JSR223 vs Java JMeter Scripting: The Performance-Off You've Been Waiting For! comparison benchmark.

 

Compare Assertion

 

The Compare Assertion checks that the response content and/or response time of all samplers within its scope are equal. The requests and the assertion should be on the same level in the test plan.

 

Assertion parameters:

1. Compare Content | TRUE | FALSE

  • If TRUE, the content of all affected samplers is checked to confirm it is identical. Any difference causes an assertion failure.
  • If FALSE, the content check is omitted.

2. Compare Time | -1 | 0 | number

  • If -1, the response time check is omitted.
  • If 0, the response times of all affected samplers are checked to confirm they are identical. Any difference, even 1 ms, causes an assertion failure.
  • If >0, the response times of all affected samplers may differ by no more than the value provided. If the threshold is exceeded, the assertion fails.


If both Content and Time checks are enabled, both must pass for the assertion to succeed.
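The two checks can be sketched in plain Java as follows. This is a simplified model of the assertion’s logic, not JMeter source code, and the class and method names are mine:

```java
import java.util.Arrays;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;

public class CompareAssertionDemo {
    // Compare Content = TRUE: all responses must be identical
    static boolean contentsIdentical(List<String> responses) {
        return new HashSet<>(responses).size() <= 1;
    }

    // Compare Time = number: response times may differ by at most thresholdMs
    static boolean timesWithin(List<Long> timesMs, long thresholdMs) {
        return Collections.max(timesMs) - Collections.min(timesMs) <= thresholdMs;
    }

    public static void main(String[] args) {
        System.out.println(contentsIdentical(Arrays.asList("OK", "OK")));      // true
        System.out.println(timesWithin(Arrays.asList(100L, 105L, 103L), 10L)); // true
        System.out.println(timesWithin(Arrays.asList(100L, 150L), 10L));       // false: 50 ms apart
    }
}
```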

 

As already mentioned, avoid using this assertion for high loads, since it consumes a lot of CPU and RAM, which can ruin your test and take up a lot of your valuable time.

 

SMIME Assertion


The SMIME Assertion checks whether a response returned from the Mail Reader Sampler is signed. You can also assert the opposite (that the message is not signed) and, optionally, verify the signer certificate. This assertion requires third-party libraries, so be sure you have the following libraries in your JMeter classpath:

  1. bcmail-*.jar
  2. bcprov-*.jar
  3. bcpkix-*.jar

 

These three jars can be downloaded from the Bouncy Castle download area. Be sure that the JDK version the Bouncy Castle libraries are built for matches the JDK requirements of your JMeter version. If you’re using BlazeMeter, just drop the files into the File Upload area along with your .jmx script.

 

 

 

As with any other assertion, you can find all the information on failing assertions in the Errors tab of the Load Report.



 

 

3. Viewing the Results of Your Assertions


Assertion Results Visualization


So we have a sample and an assertion to test the response - but how do we see what’s wrong with the response?

In the GUI mode, there are two ways that failed assertions can be inspected:
 

  1. Assertion Results Listener. This shows, under each sampler's label, the messages of any failed assertions.


 

 

  2. View Results Tree Listener. This shows the results of all the assertions in the test plan, per sample.

 

 

 

Caution: Both listeners consume significant amounts of memory, so don’t use them during load tests. They should only be used for debugging or for opening a .jtl file generated by a non-GUI run. For non-GUI mode, the following properties are available:

 

  1. jmeter.save.saveservice.assertions=true | false
  2. jmeter.save.saveservice.assertion_results_failure_message=true | false
  3. jmeter.save.saveservice.assertion_results= none | first | all

 

Command Line Mode
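A typical non-GUI invocation that records assertion details into the results file looks like this (the file names are placeholders; the -J flags override the saveservice properties listed above for this run only):

```shell
# Run the test plan in non-GUI mode (-n) and write results to a .jtl file (-l),
# saving assertion results and failure messages for later inspection.
jmeter -n -t test-plan.jmx -l results.jtl \
  -Jjmeter.save.saveservice.assertions=true \
  -Jjmeter.save.saveservice.assertion_results_failure_message=true \
  -Jjmeter.save.saveservice.assertion_results=all
```

The resulting results.jtl file can then be opened in a View Results Tree Listener for debugging.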

 

 

BlazeMeter Assertions

 

The Failed Assertions report can be found under the Errors tab of the BlazeMeter load report interface.

 

 

Response Assertion Report Fields

 

Three major fields are recorded:

  1. Assertion Error (true|false) - This indicates whether an error occurred while executing the assertion itself, such as an incorrect Beanshell script in a Beanshell Assertion, or a missing “size in bytes” value in a Size Assertion. An assertion error causes the affected sample(s) to fail.

  2. Assertion Failure (true|false) - This indicates whether the assertion check passed. If the actual result matches the expected result, it will be false; otherwise it will be true. If Assertion Failure = true, the affected sampler(s) are considered failed.

  3. Assertion Failure Message (string) - This is a built-in or custom message that clarifies the details of the assertion failure.


Learn More About JMeter

 

If you are new to JMeter, and you’d like to learn more, please sign up for our free online JMeter training course.

 

More experienced JMeter users will want to view the on-demand webcast, How to Create Advanced Load Testing Scenarios with JMeter.

 

Be sure to read through our full list of JMeter resources.

 

How the BlazeMeter Load Testing Cloud Complements and Strengthens JMeter

 

While JMeter represents a strong and compelling way to perform load testing, of course, we recommend supplementing that tool with BlazeMeter, which lets you simulate up to millions of users in a single developer-friendly, self-service platform.  With BlazeMeter, you can test the performance of any mobile app, website, or API in under 10 minutes.  Here’s why we think the BlazeMeter/JMeter combination is attractive to developers:

  • Simple Scalability – It’s easy to create large-scale JMeter tests. You can run far larger loads far more easily with BlazeMeter than you could with an in-house lab.
  • Rapid-Start Deployment – BlazeMeter’s recorder helps you get started with JMeter right away, and BlazeMeter also provides complete tutorials and tips.
  • Web-Based Interactive Reports – You can easily share results across distributed teams and overcome the limitations of JMeter’s standalone UI.
  • Built-In Intelligence – BlazeMeter provides on-demand geographic distribution of load generation, including built-in CDN-aware testing.

 


 

To try out BlazeMeter, request a demo, or enter your site’s URL and your test will start in minutes.

     

Interested in writing for our blog? Send us a pitch!