Dec 08 2014

Interview with Michael Bolton: The Software Tester & the Unexpected, Part 1

Michael Bolton is a thought leader in the world of software testing, with over two decades of experience in the computer industry testing, developing, managing, and writing about software. As an international consultant and testing trainer, Michael gives workshops and conference presentations on testing methodology, specializing in Rapid Software Testing and exploratory testing.

 

We invited cloud blogger Ofir Nachmani to interview Michael Bolton. See what unfolded below:

 

Ofir Nachmani: “My interview with Bolton was really unique. We started with a brief discussion about the cloud that quickly moved into a very interesting conversation about software testing. Michael provided enlightening insights into the role of an individual software tester, the outlook a tester should have when using load testing tools, and the relationship between a product owner and a tester. He also covered basic fundamentals, such as how to test “the unexpected”. Even if you’re not a developer or tester, the information below will most likely shed light on an industry that directly affects us all.”

 

ON: As someone who trains and consults with testers all over the world, what issues and challenges do people in the industry face today, particularly when it comes to performance testing?

 

MB: I observe that organizations tend to put an emphasis on testing as a verification of things that they hope to be true. That’s okay as far as it goes, but it doesn’t go very far, alas. For example, many people start to develop their performance testing strategy by setting performance criteria that they believe the system must meet. However, you don’t need performance criteria to do performance testing. In fact, I’d worry about testers doing that, because of the risk of shifting focus to confirming a certain set of ideas, rather than to investigating a product and the risks surrounding it. It’s crucial not only to know, but also to develop, our ideas about the important questions that need to be asked about performance. Many of those questions are not obvious or apparent at the beginning of a project.

 

In my view, demonstrating conformance with prescribed criteria is the least interesting and least important part of performance testing. The more important goals are to describe a system as it actually behaves, and to anticipate, investigate and discover performance-related problems. Focusing on an expectation or a desire―let’s say “ten thousand transactions a minute”―sets a pretty narrow scope for investigation. It leaves out the kinds of problems that we could encounter with individual transactions within that ten thousand; it leaves out looking for factors that might contribute to slowdowns; and it steers us away from considering factors that might influence a decision on when to optimize the code or to throw more hardware at the problem. It encourages us to count, instead of studying and describing the system. A conformance focus tends towards confirming answers to existing questions, instead of raising new questions.
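
To make that distinction concrete, here is a minimal sketch in Python (not from the interview; the data and the ten-thousand-transactions target are hypothetical). Given per-transaction response times collected by a load testing tool, it reports latency percentiles and the slowest individual transactions, rather than only confirming that the aggregate target was met:

```python
import random
import statistics

# Hypothetical per-transaction response times (ms) from a load test run.
# In practice these would come from your load testing tool's results file.
random.seed(1)
response_times_ms = [random.lognormvariate(4.0, 0.6) for _ in range(10_000)]

def percentile(samples, p):
    """Return the p-th percentile of the samples (simple nearest-rank method)."""
    ordered = sorted(samples)
    index = max(0, int(round(p / 100 * len(ordered))) - 1)
    return ordered[index]

throughput_target_per_min = 10_000  # the hypothetical aggregate target

print(f"aggregate target met? {len(response_times_ms) >= throughput_target_per_min}")
print(f"median latency : {statistics.median(response_times_ms):8.1f} ms")
print(f"95th percentile: {percentile(response_times_ms, 95):8.1f} ms")
print(f"99th percentile: {percentile(response_times_ms, 99):8.1f} ms")

# The aggregate count can look fine while individual transactions are painfully
# slow; the worst offenders are a starting point for investigation, not a pass/fail check.
slowest = sorted(response_times_ms, reverse=True)[:5]
print("slowest transactions (ms):", [round(t, 1) for t in slowest])
```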

 

ON: What factors should a tester keep in mind when load testing a website or application?

 


MB: To me, the mindset of a tester should be oriented towards identifying problems that threaten the value of a product. If you’re using performance testing tools, use them to learn, to help look for problems that weren’t anticipated in the first place. Complex systems have the capacity to surprise us. Great tools help to visualize what’s happening, to highlight patterns and inconsistencies, to identify stress points and choke points. Compared to that, prior expectations and predictions aren’t that big a deal.

 

What a system actually does is far more important than what your expectation is. You might have a working hypothesis about the system as you design experiments, but the hypothesis probably isn't that interesting compared to what you actually discover in the course of performing and analyzing those experiments.

 

To me, excellent performance testing isn’t about showing that the system can achieve some specified transaction rate―that’s a demonstration that the product can work. Fabulous performance testing is about discovery―finding where the slow and problematic bits are, where the bottlenecks are, and what can interfere with successful transactions when we put a system under load or stress, or when we run it for a long time with lots of variation. I’d like my tools and my models to help me to develop and illustrate a comprehensive understanding of a product, and identify what threatens value and success. Part of that involves recognizing that there are different dimensions of success.
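
As a rough illustration of that kind of discovery, the sketch below (again hypothetical; the simulated transaction stands in for calls to a real system under test, which a tool like JMeter would normally drive) steps up concurrency and records how latency changes at each step. The interesting output is the shape of the curve and where it bends, not a single pass/fail number:

```python
import concurrent.futures
import statistics
import time

def simulated_request(concurrency):
    """Stand-in for one transaction; returns its latency in milliseconds.
    In a real test this would call the system under test. Here, latency
    degrades past a hypothetical capacity of about 32 concurrent users."""
    latency_s = 0.005 + 0.002 * max(0, concurrency - 32) ** 1.5
    time.sleep(latency_s)
    return latency_s * 1000

def measure_step(concurrency, requests_per_user=5):
    """Run one load step and return the mean observed latency in milliseconds."""
    total_requests = concurrency * requests_per_user
    with concurrent.futures.ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = list(pool.map(simulated_request, [concurrency] * total_requests))
    return statistics.mean(latencies)

# Step the load up and watch where latency stops scaling gracefully.
for users in (1, 8, 16, 32, 48, 64):
    print(f"{users:3d} concurrent users -> avg {measure_step(users):7.1f} ms per request")
```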

 

Stay tuned for part 2 of this interview to learn about the tester's place in the software industry, helpful tools, the value of numbers, and the unexpected.

 

 

Want help with your performance testing? Check out BlazeMeter's performance testing features.


Check out Michael’s software testing blog

     

