Guest Post: The Easiest Way to Validate Cloud Services at Scale
This is a guest blog post by Micah Knox, Software Architect and CTO of SonicAssess, a new employee assessment delivery platform designed to assess the talents and leadership capabilities of job candidates. To ensure the platform could cope with extremely high traffic loads, Micah validated its performance by running load tests before release.
For years, my father and I ran a custom software development company before we decided to transition into cloud-based services. Unlike a bespoke enterprise solution with known usage patterns, a key requirement of any cloud-based solution is extreme scalability.
Enabling 750 Requests Per Second
We founded SonicAssess to help large restaurant chains, retailers and hospitality companies efficiently manage staffing issues in an industry known for high turnover rates. To provide an objective, third-party assessment of a candidate’s talents, potential for leadership, character, depth of thinking and more, we designed a highly configurable platform that lets clients easily customize the evaluation process, create multiple assessment programs and brand the service to suit their unique needs, goals and philosophies.
Once we were satisfied with the platform’s features, we turned to the technical aspects of delivering lightning-fast service via the cloud. Determining usage and performance parameters for an emerging cloud service can be tricky. We knew that every new organization we onboarded could potentially use the service to manage assessments for thousands of locations, processing hundreds of applicants every day. After much research, we decided to specify Day 1 capacity for 300 concurrent users, generating a total of 750 requests per second.
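As a back-of-envelope check, the capacity target above implies a steady per-user request rate. The 2.5 requests per second per user below is inferred from the figures in this post (750 ÷ 300), not a measured value; a sketch:

```python
# Back-of-envelope capacity arithmetic for the Day 1 target.
# The per-user rate is inferred from the post's figures, not measured.
concurrent_users = 300
requests_per_user_per_second = 2.5  # assumed: 750 req/s / 300 users

target_throughput = concurrent_users * requests_per_user_per_second
print(f"target: {target_throughput:.0f} requests per second")
```

Estimates like this give a concrete throughput number to design the load tests around, even before any real traffic data exists.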
We knew these estimates were high. However, we were designing SonicAssess to consistently deliver a solid experience under very large workloads. Anything less would hurt the brand, disappoint clients and discourage quality candidates from applying.
How Testing Improved the Performance
Assuring performance throughout the development cycle was critical to our plan. In the past, JMeter had always seemed to be the most scalable option for load testing, but the BlazeMeter implementation simplified the process immensely. Its effective and comprehensive onboarding program helped us quickly get up to speed and rapidly build test plans that fully exercised the infrastructure under realistic SonicAssess workloads.
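To illustrate the kind of closed-loop workload a test plan like this drives, here is a minimal load-generator sketch in Python. The request function is a stand-in that sleeps for a simulated latency; in a real test plan (JMeter, BlazeMeter or otherwise) it would be an actual HTTP call to the service under test:

```python
import random
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def fake_request() -> float:
    """Stand-in for an HTTP call; returns latency in milliseconds.
    A real test plan would issue an actual request here."""
    latency_s = random.uniform(50, 200) / 1000  # simulated 50-200 ms
    time.sleep(latency_s)
    return latency_s * 1000

def run_load(workers: int, requests_per_worker: int) -> list[float]:
    """Closed-loop load: a fixed pool of workers fires requests
    back to back, mimicking N concurrent users."""
    total = workers * requests_per_worker
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = [pool.submit(fake_request) for _ in range(total)]
        return [f.result() for f in futures]

latencies = run_load(workers=10, requests_per_worker=5)
p95 = sorted(latencies)[int(len(latencies) * 0.95)]
print(f"avg {statistics.mean(latencies):.0f} ms, p95 {p95:.0f} ms")
```

Dedicated tools handle ramp-up schedules, distributed generators and reporting, but the core loop — concurrent workers plus latency aggregation — is the same.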
By testing throughout the development cycle, we found bottlenecks in places where we didn’t expect problems. As we stabilized the solution, we were pleased to see that the average response time for key business processes dropped to 800 milliseconds under extreme usage conditions, going as low as 300 milliseconds even for complex search transactions. Full-scale load testing also helped us ensure maximum availability; in fact, our 90-day uptime metric is 99.999%.
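To put the uptime figure above in perspective, 99.999% availability over a 90-day window leaves a very small downtime budget. A quick conversion of that figure into seconds:

```python
# Convert a 99.999% availability target over 90 days into a
# concrete downtime budget in seconds.
window_seconds = 90 * 24 * 60 * 60  # 90-day measurement window
availability = 0.99999

downtime_budget = window_seconds * (1 - availability)
print(f"allowed downtime: {downtime_budget:.2f} seconds over 90 days")
```

That works out to under 80 seconds of total downtime across the whole quarter, which is why sustained load testing before release mattered so much to us.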
Validating Performance Prior to Release
Entering a new market as a startup can be extremely challenging. It is one thing to say you are delivering radical speed, configurability, usability and security; it’s quite another to prove it. No matter how strong your architecture design may seem on paper, the only way to ensure the real-world performance of any cloud service is to see how the applications stand up to extreme usage. The fact that we can empirically demonstrate to our clients our commitment to quality and our dedication to providing a high-tech solution to a very real problem should prove to be an important differentiator over time.
BlazeMeter was instrumental in helping us create a market leading solution capable of efficiently supporting large-scale assessment programs.
Need help getting started with your performance testing? Check out our online JMeter Training Course.