In a typical website load scenario, numerous users browse the site with their browsers at the same time. A single web page presented in a browser can generate tens, sometimes hundreds, of unique HTTP requests; only after receiving all of the responses does the browser render the page for the user to view.
Every browser (IE, Firefox, Chrome, etc.) has its own way of generating HTTP requests for a given web page. In a load scenario, the overall number of HTTP requests is directly related to the number of users surfing concurrently, and all of those requests hit the server during the test, each generating a response.
To better understand overall system performance, consider the following three metrics:
| Metric | Description |
| --- | --- |
| Perceived system performance | System performance as perceived by the load testing servers, which measure numerous metrics related to all generated requests and the responses received. |
| Perceived user experience | Page load time as perceived by a real browser. This metric represents the user experience: the time it takes the browser to load a certain page during the test. |
| System performance | The system's traditional KPIs, such as CPU, memory, and bandwidth, as measured during the test. |
Each metric is a combination of numerous measures such as:
| Measure | Description |
| --- | --- |
| Response Time | The time it takes a request to fully load, from the moment the request is initiated until it is complete. This generally indicates the performance level of the entire system under test (web server + DB). Reported as the average response time at a given minute of the test. |
| Latency | Time until first response: how long it takes for the first byte of the response to be received. This generally indicates the performance level of the web server. Reported as the average latency at a given minute of the test. |
| Users | The number of active users at a given minute of the test. |
| Hits | The number of hits per minute at a given minute of the test. |
| Errors | Errors generated by the server during the test, as well as errors due to connection timeouts, refusals, or broken connections. |
| Bandwidth | The amount of bandwidth used by a request or set of requests. |
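The distinction between latency (time to first byte) and response time (time to full load) can be illustrated with a minimal sketch. It starts a throwaway local HTTP server whose handler delays both before sending the headers and before sending the body (hypothetical delays chosen only for illustration), then times both measures from the client side:

```python
import http.client
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical stand-in for the system under test: delays before the
# headers go out and again before the body goes out.
class SlowHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        time.sleep(0.05)                       # server-side processing delay
        self.send_response(200)
        self.send_header("Content-Length", "5")
        self.end_headers()                     # first bytes reach the client here
        time.sleep(0.05)                       # delay while streaming the body
        self.wfile.write(b"hello")

    def log_message(self, *args):              # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), SlowHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

start = time.perf_counter()
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/")
resp = conn.getresponse()                      # returns once headers arrive
latency = time.perf_counter() - start          # latency: time to first response
body = resp.read()                             # blocks until the body is complete
response_time = time.perf_counter() - start    # response time: full load
conn.close()
server.shutdown()

print(f"latency={latency:.3f}s response_time={response_time:.3f}s")
```

On this toy server, latency lands around 0.05s and response time around 0.10s, so the two measures diverge exactly where the body transfer is slow, which is the signal each one is meant to expose.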
Perceived System Performance
Perceived system performance is the most important metric in performance testing, as it combines all of the measures mentioned above. This metric provides answers to the following questions:
- Q. Does the system behave differently under different load scenarios? For example, is the performance level the same when 10 users are visiting the site as when 1,000 users are visiting?
- Q. Is there a degradation in the performance level of each unique request (e.g. POST requests, DB transactions) under different load scenarios?
- Q. When does performance degradation begin?
- Q. Where are the bottlenecks in the system under test?
Perceived system performance applies three different measurements to each unique request that is part of the simulated traffic and three different measurements for the aggregated results of the simulated traffic. These measures allow a load-testing professional to evaluate the performance of each request under a certain load.
Perceived system performance provides aggregated reports as well, taking into account all requests and responses. The aggregated reports can provide insight when identifying bottlenecks. Some examples of conclusions that can be reached via perceived system performance:
- Aggregated results can state that a website's average response time is 700 milliseconds, while the response time of certain POST requests grows with the number of users. One can only identify this by looking at the specific report for each request.
- Identify that a DB request is taking too long to execute under a load scenario.
- Discover which CSS files break under load.
- Detect connection timeouts and broken connections.
- Detect error responses generated under load.
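The first conclusion above, a healthy site-wide average hiding a degrading POST request, comes from aggregating raw timing samples per unique request. A minimal sketch of that aggregation, using entirely made-up sample data (the request labels, user counts, and timings are illustrative assumptions, not real results):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical raw samples: (request_label, concurrent_users, response_time_s)
samples = [
    ("GET /index.html", 10, 0.60), ("GET /index.html", 1000, 0.65),
    ("GET /style.css",  10, 0.20), ("GET /style.css",  1000, 0.25),
    ("POST /checkout",  10, 0.80), ("POST /checkout",  1000, 3.90),
]

# Group timings per unique request at each load level.
per_request = defaultdict(list)
for label, users, rt in samples:
    per_request[(label, users)].append(rt)

averages = {key: mean(times) for key, times in per_request.items()}

# The site-wide average looks acceptable...
site_average = mean(rt for _, _, rt in samples)

# ...but the per-request breakdown shows the POST degrading under load.
checkout_growth = averages[("POST /checkout", 1000)] / averages[("POST /checkout", 10)]

print(f"site average: {site_average:.2f}s, checkout slowdown: {checkout_growth:.1f}x")
```

In this toy data the site-wide average stays near one second while the checkout POST slows down almost fivefold, which is exactly the kind of degradation only the per-request reports expose.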
Perceived User Experience
Perceived user experience provides an answer to one of the most important questions:
- Q: What would be the user experience under a certain load scenario?
Because each brand of browser generates HTTP requests in a different way, the above-mentioned metric (perceived system performance) cannot tell us what the perceived user experience would be.
Consider the example where a page generates 10 HTTP requests. If each HTTP request takes 1 second to load, what would be the load time of the full page?
It's hard to say. Some browsers will execute all requests in parallel, while others will execute them one after the other, so the full page load can take anywhere from 1 to 10 seconds. The only way to measure the user experience is to launch a real browser and measure the load time of the web page.
Using the same technique under a load scenario can assist in an evaluation of the user experience during the load.
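The 1-to-10-second spread can be demonstrated with a small simulation. Instead of real HTTP requests, each "request" is just a fixed sleep (0.1 seconds here, an arbitrary stand-in for the 1-second request in the text), run once serially and once in parallel the way a browser might:

```python
import time
from concurrent.futures import ThreadPoolExecutor

REQUEST_TIME = 0.1   # stand-in for the 1-second request in the example above
NUM_REQUESTS = 10

def fetch(_):
    # Hypothetical request: only the delay is simulated, no real HTTP.
    time.sleep(REQUEST_TIME)

# One request after the other, like a strictly serial browser.
start = time.perf_counter()
for i in range(NUM_REQUESTS):
    fetch(i)
serial = time.perf_counter() - start

# All requests in flight at once, like a fully parallel browser.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=NUM_REQUESTS) as pool:
    list(pool.map(fetch, range(NUM_REQUESTS)))
parallel = time.perf_counter() - start

print(f"serial={serial:.2f}s parallel={parallel:.2f}s")
```

The serial run takes roughly NUM_REQUESTS × REQUEST_TIME, while the parallel run takes roughly REQUEST_TIME, which is why per-request timings alone cannot predict page load time: the browser's scheduling strategy sits in between.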
With the perceived user experience metric, a website owner can know what the user experience would be under different load scenarios. For example:
- A web page's load time can be 2 seconds at 100 users and 4 seconds at 500 users.
- The website might not load at all at 1,000 users.
System Performance
System performance completes the picture by describing the system under test using traditional KPIs such as CPU, memory, and bandwidth.
Correlating perceived system performance results with system performance results can assist with identifying the bottlenecks and problems responsible for a poor performance level.
For example, if under a certain load the CPU level of the server under test goes over 70%, we know that the server is not capable of dealing with such a load. The response times gathered from perceived system performance would typically tell the same story.
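That correlation can be sketched with a few lines of code. The per-minute samples below are invented for illustration (the 70% threshold comes from the example above; the CPU and response-time figures are assumptions):

```python
CPU_THRESHOLD = 70.0  # assumed overload threshold from the example above

# Hypothetical per-minute samples: (minute, cpu_percent, avg_response_time_s)
samples = [
    (1, 35.0, 0.7),
    (2, 48.0, 0.8),
    (3, 66.0, 1.1),
    (4, 74.0, 2.6),
    (5, 88.0, 5.2),
]

# Minutes in which the server crossed the CPU threshold.
overloaded = [minute for minute, cpu, _ in samples if cpu > CPU_THRESHOLD]
first_overload = overloaded[0] if overloaded else None

# Response time in the same minutes, to confirm both signals agree.
rt_when_overloaded = [rt for minute, _, rt in samples if minute in overloaded]

print(f"CPU first crossed {CPU_THRESHOLD}% at minute {first_overload}; "
      f"response times then: {rt_when_overloaded}")
```

In this toy data, the minutes where the CPU exceeds the threshold are exactly the minutes where response time jumps, illustrating how the two metrics corroborate each other when locating a bottleneck.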
Perceived system performance, perceived user experience, and system performance: all three metrics are required to get the full performance picture. The most important is perceived system performance, as it simulates the numerous users that visit the website during a load scenario and describes in detail all of the measures accumulated during that time.