Oct. 9th, 2018

Monitoring UX Metrics in HLS Load Testing with JMeter

The BlazeMeter blog contains a number of articles dedicated to load testing HLS services. Since HLS is based on the HTTP protocol, using JMeter’s HTTP sampler to implement a load testing script for an HLS service is a straightforward approach.

 

Here is a brief description of this implementation, which you can also read about here: the first HTTP sampler gets a playlist that contains the URLs of the media chunks that constitute the original media file. Each chunk is then downloaded by other HTTP samplers that iterate over the list.
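Such a script can be sketched as a test-plan outline. The element names below are illustrative, not taken from the downloadable script, but the structure follows the description above:

```text
Thread Group
 ├─ HTTP Request: GET the variant playlist (.m3u8)
 │   └─ Regular Expression Extractor: capture chunk URLs and their durations
 └─ ForEach Controller: iterate over the extracted chunk URLs
     └─ HTTP Request: GET a media chunk (.ts)
         └─ JSR223 PostProcessor: update the UX metric variables
```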

 

Another, simpler approach is to use the HLS sampler, an Apache JMeter™ component that simplifies the script logic of processing playlists. The HLS sampler does all the work of receiving playlists and downloading media chunks. This is helpful in more complex cases where there is a master playlist that contains information about variant playlists, each of which describes a different set of media chunks representing the original media file at various resolutions, bitrates and qualities. The HLS sampler handles these cases, imitating the client behavior for different video types, network bitrates and client resolutions. This blog post provides a guide on how to use the HLS sampler.

 

Both approaches have their advantages. Because the primary goal of load testing is to get reports about the performance of the system under test, JMeter listeners can be used to display various performance-related metrics. This blog post highlights the advantages and disadvantages of each approach, including the usage of aggregate JMeter listeners.

 

But in addition to performance-related metrics, there are user experience-related metrics that have to be taken into account when load testing HLS services. These metrics are:

  • Buffer fill time - the time users wait for the video to start playing. During this time they see a progress roller while the first seconds of the video are downloaded. It’s an important metric to check, as users may not want to wait minutes for the video to start.
  • Lag time - the time a user waits for data to be buffered during playback. Lags negatively impact the user experience, so it’s necessary to ensure that the lag time is acceptable according to the SLA.
  • Download time - the time required to download all the HLS stream artifacts: the playlists and the media chunks.
  • Play time - the real playback time, including lags.
  • Lag ratio - the lag time divided by the total play time, including lags. A lower lag ratio is better.
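As an invented illustration of how these metrics relate (all numbers are made up), consider a 120-second video whose playback was interrupted by 6 seconds of mid-playback buffering:

```text
buffer fill time = 2 s       (wait before playback starts)
download time    = 85 s      (all playlists and media chunks)
lag time         = 6 s       (mid-playback buffering)
play time        = 120 s + 6 s = 126 s
lag ratio        = 6 / 126 ≈ 0.048
```

Here the lag ratio is computed against the play time including lags, which is how the scripts shown later in this post calculate it.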

 

This blog post explains how to calculate these metrics in both approaches of JMeter script implementation.

 

Testing HLS in JMeter

 

The HLS protocol was developed by Apple, and the Apple developers’ site contains descriptions of both the HLS protocol and the HLS playlist format. The protocol is also specified in RFC 8216. Basically, an HLS playlist contains the URLs of media chunks, which constitute the original media file. Often, when the media file is available in several resolutions, there is a master playlist and variant playlists.

 

The master playlist contains the URLs of variant playlists, and the variant playlists contain the URLs of chunks, which constitute media streams with different resolutions and bitrates. Each chunk entry in a variant playlist has a special #EXTINF tag, which specifies the playback duration of the chunk in seconds. The screenshot below shows what the variant playlist looks like in JMeter.
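For reference, a minimal variant playlist (the file names and durations are illustrative) has this shape:

```text
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:9.009,
media_chunk_0.ts
#EXTINF:9.009,
media_chunk_1.ts
#EXT-X-ENDLIST
```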

 

[Screenshot: a variant playlist viewed in JMeter]

 

You can see that each line with a chunk URL is preceded by an #EXTINF tag with the duration value. Other useful parameters that can be parsed from the master playlist are the bandwidth and resolution. A screenshot of the master playlist is shown below.

 

[Screenshot: a master playlist viewed in JMeter]

 

Approach 1: Using JMeter’s HTTP Sampler

 

In the screenshot above you can see that the #EXT-X-STREAM-INF tag defines the bandwidth and resolution parameters for the variant playlist listed below it. Using these values, it’s possible to evaluate the UX metrics listed above. The JMeter script that calculates these metrics can be downloaded from here.
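A minimal master playlist (the bandwidth values and file names are illustrative) looks like this:

```text
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
variant_360p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
variant_720p.m3u8
```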

 

The provided script is not a universal solution, because different types of streams and playlist configurations may require different processing. But this script can give you an idea of how these metrics can be evaluated. The script processes an open source HLS stream. It doesn’t use the HLS sampler. Instead, it uses HTTP requests to get playlists and to receive media chunks. JSR223 postprocessors attached to the HTTP requests monitor the time required for downloading each media chunk and evaluate the user experience metrics.

 

Now, let’s elaborate on how each metric listed above is evaluated. All metrics are calculated separately for each thread and are stored in corresponding variables. All metrics are declared in the User Defined Variables component, which is shown in the screenshot below.

 

[Screenshot: User Defined Variables component with the UX metric variables]

 

The calculation of the buffer fill time, lag time, download time and playback time is done in the JSR223 postprocessor; its Groovy script is shown in the screenshot below, as well as in the code further down in this blog post.

 

[Screenshot: the JSR223 postprocessor Groovy script]

 

Buffer Fill Time

 

The buffer fill time metric is the time required to download one or a few chunks before the HLS player starts playing back the media. Usually, the client sees a progress roller during this time. In the JMeter script, which imitates an HLS player, the buffer size is defined in kilobytes, as the user defined variable ${bufferSize}.

 

The time required to fill the buffer is equal to the time it takes to download the playlists and enough media chunks that the number of downloaded bytes equals or exceeds the buffer size.

 

// The buffer keeps filling until the bytes in it exceed the defined buffer size (args[1])
	if (bytesInBuffer < Long.parseLong(args[1])) {
		bytesInBuffer = bytesInBuffer + dB;
		bufferFillTime = bufferFillTime + dT;
// The initial filling of the buffer is not counted as lag
		if (totalLT > 0)
			totalLT = totalLT + dT;
	}

 

The piece of code from the JSR223 postprocessor above shows the calculation of the buffer fill time variable.

 

Lag Time

 

[Image: lag time illustration]

 

Lag time is calculated by using the duration value that is specified for each media chunk in the variant playlist. The duration value is the time required for the playback of the downloaded chunk. This value is parsed from the variant playlist and passed as a parameter to the JSR223 postprocessor. The lag is checked after each chunk is downloaded. If the total time spent downloading media chunks exceeds the time required for the playback of the downloaded chunks, a lag occurs. In other words, lag time is calculated in the script as the difference between the download time and the play time of the downloaded chunks. The total lag time is accumulated in the corresponding variable.

 

The piece of Groovy code from the JSR223 postprocessor that calculates the lag time is shown below.

 

	mediaDT = mediaDT + dT;
// If the total download time exceeds the playback time of the downloaded chunks, a lag occurs
	if (1000 * mediaPT < mediaDT) {
		totalLT = totalLT + mediaDT - 1000 * mediaPT;
		bytesInBuffer = 0;
		mediaPT = 0;
		mediaDT = 0;
	}

 

A JSR223 assertion is added to the script to catch lag occurrences. If a lag occurs, the corresponding HTTP request is marked as failed. A screenshot of the JSR223 assertion is shown below.
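The assertion’s code is not listed in this post, but a minimal Groovy sketch of such a JSR223 assertion, assuming the totalLT variable defined earlier (and not necessarily identical to the downloadable script’s version), could be:

	// Hypothetical sketch: fail the sample if any lag has accumulated so far
	double totalLT = Double.parseDouble(vars.get("totalLT"));
	if (totalLT > 0) {
		AssertionResult.setFailure(true);
		AssertionResult.setFailureMessage("Lag detected: total lag time is " + totalLT + " ms");
	}

The AssertionResult object is a standard binding available inside JMeter’s JSR223 assertion.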

 

[Screenshot: the JSR223 assertion]

 

Download Time

 

In the JSR223 postprocessor, the download time is accumulated in the corresponding variable every time a new chunk is downloaded. The time required for downloading is taken from the sample result.

 

The piece of the JSR223 postprocessor Groovy code that calculates download time and play time is shown below.

 

// Download time - time to download a chunk. Play time - playback duration of the downloaded chunk, parsed from the playlist
	totalDT = totalDT + dT;
	totalPT = totalPT + Double.parseDouble(args[0]);

 

Play Time

 

The total play time is calculated in the JSR223 sampler, which is shown in the screenshot below. The play time is calculated as the sum of the playback time of the downloaded chunks and the lag time.

 

[Screenshot: the JSR223 sampler that calculates the total play time]

 

Lag Ratio

 

The lag ratio is calculated for each thread by dividing the lag time by the play time. As you remember, these values are stored in the corresponding variables. The lag ratio is calculated in the JSR223 sampler, a screenshot of which is shown above.
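For reference, a Groovy sketch of that JSR223 sampler step, mirroring the variables used in the postprocessor (the actual script may differ in details), is:

	double totalPT = Double.parseDouble(vars.get("totalPT")); // playback seconds of downloaded chunks
	double totalLT = Double.parseDouble(vars.get("totalLT")); // accumulated lag, in ms
	// Total play time in ms: chunk playback (s -> ms) plus lags
	double totalPTLT = 1000 * totalPT + totalLT;
	double lagTimeRatio = totalPTLT > 0 ? totalLT / totalPTLT : 0;
	vars.put("totalPTLT", Double.toString(totalPTLT));
	vars.put("lagTimeRatio", Double.toString(lagTimeRatio));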

 

Approach 2: Using the HLS Sampler

 

The same algorithm for calculating user experience metrics can be used when implementing the HLS sampler approach in the JMeter script, which you can find here. As mentioned, scripting is simpler with the HLS sampler because one HLS sampler does the whole job of retrieving HLS playlists, parsing them and downloading all the chunks from the playlist. The HLS sampler can also be adapted to various network throughputs and end-client behaviors.

 

As in the first approach, the user experience metrics are calculated in a JSR223 postprocessor and defined in the User Defined Variables component. A screenshot of the JMeter script with the expanded JSR223 postprocessor is shown below.

 

[Screenshot: the JMeter script with the HLS sampler and the expanded JSR223 postprocessor]

 

The algorithm for calculating the UX metrics is the same as in the first approach. The JSR223 postprocessor Groovy code is below.

 

	import java.util.regex.Matcher;
	import java.util.regex.Pattern;
	import org.apache.jmeter.samplers.SampleResult;
// Restoring variables
	double bufferFillTime = Double.parseDouble(vars.get("bufferFillTime"));
	double totalDT = Double.parseDouble(vars.get("totalDT"));
	double totalPT = Double.parseDouble(vars.get("totalPT"));
	double totalLT = Double.parseDouble(vars.get("totalLT"));
	double totalPTLT = Double.parseDouble(vars.get("totalPTLT"));
	long bytesInBuffer = Long.parseLong(vars.get("bytesInBuffer"));
	double lagTimeRatio = Double.parseDouble(vars.get("lagTimeRatio"));
// We use a map to store chunk file names with their playback duration values
	Map chunkMap = new HashMap();
	Pattern responsePattern = Pattern.compile(args[1]);
// Download time of the master playlist
	double dT = (prev.getEndTime() - prev.getStartTime());
	totalDT = dT;
	totalPT = dT;
	bufferFillTime = dT;

	long dB = 0;
	double mediaPT = 0;
	double mediaDT = 0;
	SampleResult[] sR = prev.getSubResults();

	for (sR0 in sR) {
// Download time of the chunk playlists
		dT = (sR0.getEndTime() - sR0.getStartTime());
		totalDT = totalDT + dT;
		bufferFillTime = bufferFillTime + dT;

		String strResponse = sR0.getResponseDataAsString();
// Parsing the variant playlist to extract chunk files with duration values
		Matcher matchedString = responsePattern.matcher(strResponse);
		while (matchedString.find())
			chunkMap.put(matchedString.group(2), matchedString.group(1));

		SampleResult[] sR1 = sR0.getSubResults();

		for (sR2 in sR1) {
			dB = sR2.getBytesAsLong();
			dT = (sR2.getEndTime() - sR2.getStartTime());
			String sampleLabel = sR2.getSampleLabel();
// Getting the chunk playback duration from the map, by chunk name
			double chunkDuration = Double.parseDouble(chunkMap.get(sampleLabel));
			mediaPT = mediaPT + chunkDuration;
// The buffer keeps filling until the bytes in it exceed the defined buffer size
			if (bytesInBuffer < Long.parseLong(args[0])) {
				bytesInBuffer = bytesInBuffer + dB;
				bufferFillTime = bufferFillTime + dT;
				if (totalLT > 0)
					totalLT = totalLT + dT;
			}
			else {
				mediaDT = mediaDT + dT;
// If the total download time exceeds the playback time of the downloaded chunks, a lag occurs
				if (1000 * mediaPT < mediaDT) {
					totalLT = totalLT + mediaDT - 1000 * mediaPT;
					bytesInBuffer = 0;
					mediaPT = 0;
					mediaDT = 0;
				}
			}
// Download time - time to download a chunk. Play time - playback duration of the downloaded chunk, parsed from the playlist
			totalDT = totalDT + dT;
			totalPT = totalPT + chunkDuration;
		}
	}
// Total play time, including lags
	totalPTLT = 1000 * totalPT + totalLT;
// Lag time ratio
	lagTimeRatio = totalLT / totalPTLT;

	vars.put("bufferFillTime", Double.toString(bufferFillTime));
	vars.put("totalPT", Double.toString(totalPT));
	vars.put("totalDT", Double.toString(totalDT));
	vars.put("totalLT", Double.toString(totalLT));
	vars.put("bytesInBuffer", Long.toString(bytesInBuffer));
	vars.put("mediaPT", Double.toString(mediaPT));
	vars.put("mediaDT", Double.toString(mediaDT));
	vars.put("lagTimeRatio", Double.toString(lagTimeRatio));
	vars.put("totalPTLT", Double.toString(totalPTLT));

 

Let’s recap: this article demonstrated the evaluation of user experience metrics in two approaches to creating JMeter scripts for HLS testing. The demonstrated method is one way to calculate these metrics and use them further in the script, but there are others. For example, different components like the Java sampler can be used to build a JMeter script that imitates a client. Commercial plugins like Ubik can also be used to build a JMeter script. In all these cases, we recommend that you calculate metrics to evaluate the user experience during load testing of HLS services.

 

Want more information on load testing with JMeter? Sign up for our free JMeter online training course. You can also try out BlazeMeter by uploading your JMX file or URL, and your test will start in minutes.

 

Learn more about performance testing HLS from our free "Using JMeter for Performance Testing HLS Video Delivery" webinar.

