Implementing end-to-end mobile performance testing with BlazeMeter and Apptim
App quality matters more than ever for an app's success, brand reputation, and ultimately revenue. End-to-end performance testing is therefore critical for a comprehensive assessment of the real-world conditions an application may face.
Performance is a product of multiple factors: the server, the mobile device, the network, and the application code itself. Developers therefore need to know how the application behaves across different devices, operating systems, and screen sizes, which requires measuring and analyzing performance at each of these levels.
Performance testing is hard, and mobile performance testing is even harder.
It is also not easy to get complete visibility and observability into what happens between a user action on a mobile device and the response from the server, and everything in between. Mobile users may run an app on different networks and under varying conditions, which also affects perceived performance and the UX.
In some ways this is harder than web performance: a user browsing on a laptop over the train's WiFi may tolerate some slowness, but expectations for a mobile app are higher, and users expect it to be fast and reliable under all kinds of network conditions.
For that reason, it’s important to know how to select the right combination of tools and methodologies that developers and testers can use to conduct mobile performance tests.
Business flow to test
We’re going to test the same business flow, “OrderMobile”, on a desktop website, a mobile website, and a native mobile app. The “OrderMobile” business flow consists of 4 simple steps:
- Navigate to homepage: https://demoblaze.com/
- Select Category “Phones”
- From the resulting list of products, select “Iphone 6 32gb”
- Add product to cart
We’ll use the demo site for both the web and backend tests, as well as a demo iOS app for the native mobile tests.
First, we will create some API tests for OrderMobile that we will use to stress the backend. These can easily be created as an API Functional Test inside BlazeMeter, either by uploading your existing API functional tests or by creating one interactively in the UI.
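For illustration, the backend load for this flow could also be expressed as a Taurus scenario. The api.demoblaze.com endpoint paths, payloads, and scenario names below are assumptions for the sketch, not part of the original test:

```yaml
# Hypothetical Taurus load scenario for the OrderMobile backend.
# Endpoint paths and request bodies are assumptions for illustration only.
execution:
- executor: jmeter
  concurrency: 20      # 20 virtual users
  ramp-up: 1m
  hold-for: 5m
  scenario: order-mobile-api

scenarios:
  order-mobile-api:
    requests:
    - url: https://demoblaze.com/
      label: Homepage
    - url: https://api.demoblaze.com/bycat      # assumed category endpoint
      method: POST
      body: '{"cat": "phone"}'
      label: Select category Phones
    - url: https://api.demoblaze.com/addtocart  # assumed add-to-cart endpoint
      method: POST
      body: '{"id": "example-cart-id", "prod_id": 1}'
      label: Add product to cart
```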
We run this test once to make sure everything works correctly and get the following results:
To test the user experience on both desktop and mobile, we’ll create some GUI functional tests.
Let’s start by creating functional desktop web tests using Taurus YAML Selenium scripts. We can write this script by hand or with the BlazeMeter Chrome Extension. We will use this script to test the front end functionality of our application in a web browser on desktop.
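A minimal Taurus Selenium script for this flow might look like the following sketch; the link texts used as selectors are assumptions based on the demoblaze UI:

```yaml
# Sketch of a Taurus YAML Selenium script for the OrderMobile flow.
# Selectors (link texts) are assumptions for illustration.
execution:
- executor: selenium
  scenario: order-mobile-web

scenarios:
  order-mobile-web:
    browser: Chrome
    requests:
    - url: https://demoblaze.com/
      actions:
      - clickByLinkText(Phones)           # select the category
      - clickByLinkText(Iphone 6 32gb)    # pick the product
      - clickByLinkText(Add to cart)      # add it to the cart
```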
When we run this test in BlazeMeter, we can see what happened under the hood, looking at the waterfall view with the network requests:
Then, we’ll create our GUI performance tests for our web app in Chrome on Android and for our native iOS app. We’ll do this using a Taurus YAML Apptim script and the Taurus plugin bzt-apptim. This plugin, developed by the Apptim team, allows you to run Appium tests (native, hybrid, or web) on AWS Device Farm and collect performance data from both the mobile application and the device it runs on.
To test the front end functionality of our application in a Chrome browser on Android, we will create a test named mobile_web_test.yml with the following content:
```yaml
modules:
  blazemeter:
    report-name: OrderMobile - Mobile Browser
    project: Default project
    test: OrderMobile - Mobile Browser

execution:
- executor: apptim
  test-type: web-android   # Allowed: web-android or web-ios
  test-file: demoblaze.zip # Appium test file with dependencies
  test-runner: testng      # Allowed runners: testng or junit
  test-devices:            # List of devices to run the test on
  - device:                # a particular device selected by name and operating system version
      name: Google Pixel 2 XL
      os: "8.0.0"
  - device:                # a random device for a particular operating system version
      os: "7.0"
```
To run this file, we will need to have Taurus installed and the bzt-apptim plugin for Taurus. To get early access to this plugin (which hasn’t been released yet inside the Taurus project), you can contact the Apptim team at email@example.com.
Then, we will run the command bzt mobile_web_test.yml, and the tests will be executed on the specified real devices using AWS Device Farm. The results of the tests will then be published to the selected BlazeMeter project.
To run native mobile tests, we can create a similar Taurus YAML Apptim script and specify the app in the YAML file, as shown below. In this case, we’re testing our ApptimDemoIOS mobile app and have created Appium tests that are contained in the demoapptim zip file.
```yaml
execution:
- executor: apptim
  app-file: ApptimDemoIOS.ipa # .apk file for Android or .ipa for iOS app
  test-runner: testng
  test-file: demoapptim.zip
```
Once we run our GUI mobile performance tests using the Taurus bzt-apptim plugin, we will see our results inside BlazeMeter, as shown below.
The BlazeMeter test results include a link to the Apptim report, which contains more details on test steps, a video of the test, device logs, and mobile performance stats such as CPU, memory, and power usage on each device, as well as rendering times and any exceptions or crashes that may have occurred.
And finally, if we want to test what the user experience is like on each platform (desktop web, mobile web, and native mobile) while we stress the backend, we can run all the tests at the same time. We will see how user concurrency affects the UX on the different clients, all inside BlazeMeter.
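Since a single Taurus YAML file can define multiple executions, the combined run could be sketched as follows. The scenario definitions are omitted and the wiring shown here is an assumption about how the pieces would fit together, not a configuration from the original article:

```yaml
# Sketch: backend load plus GUI tests launched together from one Taurus file.
# Scenario names are illustrative; their definitions are omitted.
execution:
- executor: jmeter        # backend API load
  concurrency: 50
  hold-for: 10m
  scenario: order-mobile-api
- executor: selenium      # desktop web GUI test
  scenario: order-mobile-web
- executor: apptim        # mobile web GUI test on a real device
  test-type: web-android
  test-runner: testng
  test-file: demoblaze.zip
```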
To sum up
Performance is a key factor in mobile UX, and in order to improve it, we need more visibility into what is happening in both the backend and the frontend of our apps (web and native). We can achieve end-to-end performance visibility by integrating Taurus and Apptim with BlazeMeter.
To learn more about this strategy for end-to-end mobile performance testing, check out this webinar, which also includes a demo.