
Why automate and integrate software performance testing

Sooner or later, every software product must survive the toughest test of all: hordes of users eager to give feedback. How can we build confidence that our application will withstand it? Performance testing is crucial to ensure that an application meets its operating requirements, to know what volume of users it will be able to serve, and to estimate how much being successful will cost us.

A good performance testing process lets us determine the load the system can support, identify the limit beyond which it stops working correctly, and measure the time it takes to recover after a server crash. This is valuable information that we analyze to identify potential problems or bottlenecks in the current system and to propose mitigation alternatives aligned with business objectives.

But this is not magic. At Panel we understand that Software Quality processes pay off most when we play as a team. This implies committed and motivated people (Culture), an SQA methodology integrated into software development (Processes), and tools aligned with our methodology and our culture (Tools).

We have already explained the benefits of aligning the software creation life cycle, so let's detail why we should integrate and automate performance tests, taking Apache JMeter as a practical case: an open-source application written in Java for running load and performance tests. JMeter is a very versatile tool thanks to its integration with numerous plugins, which allow us to go from simple tests with several concurrent users to complex simulations that introduce waiting times between requests or staggered increases in concurrent users.

[Figure: JMeter Threads for Load Test]

Among the non-functional software tests, which focus on "how the system works", we can work with three main types of performance tests (the configuration sketch after this list shows how these settings map to a JMeter test plan):

    • Load tests: evaluate the application's ability to support a given volume of requests. We use a high volume of concurrent users, with no waiting times between requests.
    • Stability tests: evaluate the server's capacity to respond to a low level of requests over long periods of time. We rely on introducing waiting times of minutes or even hours between requests.
    • Stress tests: evaluate the application's behavior under a very high load of requests, up to the point where the service crashes. To do this, we can increase the volume stepwise or linearly.
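To make this concrete, here is a sketch of the Thread Group fragment of a .jmx test plan as JMeter saves it. The element and property names are JMeter's standard ones, but the values (500 users, a 60-second ramp-up, a 10-minute run) are purely illustrative:

    <ThreadGroup guiclass="ThreadGroupGui" testclass="ThreadGroup" testname="Load Test">
      <!-- 500 concurrent users, started gradually over a 60-second ramp-up -->
      <stringProp name="ThreadGroup.num_threads">500</stringProp>
      <stringProp name="ThreadGroup.ramp_time">60</stringProp>
      <!-- run for a fixed duration (600 s) instead of a fixed loop count -->
      <boolProp name="ThreadGroup.scheduler">true</boolProp>
      <stringProp name="ThreadGroup.duration">600</stringProp>
    </ThreadGroup>

For a stability test, the same plan would add a timer (for example, JMeter's Constant Timer) to introduce the waiting time between requests; for a stress test, the staggered increase can be modeled with stepped ramp-ups or with thread-group plugins.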

Although these performance tests are designed using the JMeter graphical interface, we recommend executing them from the command line, which makes better use of the resources of the machine the tests run on. Ideally, a server (host) should be dedicated exclusively to running performance tests.
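A typical non-GUI run looks like this; the flags are standard JMeter options, while the file names are placeholders:

    # -n: non-GUI mode, -t: test plan, -l: results file
    jmeter -n -t load-test.jmx -l results.jtl

    # the same run, also generating the HTML dashboard when it finishes
    jmeter -n -t load-test.jmx -l results.jtl -e -o report/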

Returning to our vision of the software creation life cycle, there is a plugin that facilitates the continuous integration of performance tests by running them automatically from Jenkins (a usual suspect). It is the Performance plugin: we only have to point it at the file we generated with JMeter, and Jenkins does the rest.
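As a sketch, the equivalent stage of a declarative Jenkins pipeline could look like the following; perfReport is the step contributed by the Performance plugin, while the node label and file paths are assumptions for illustration:

    pipeline {
        agent { label 'perf-host' }   // hypothetical node dedicated to performance tests
        stages {
            stage('Performance tests') {
                steps {
                    // run the plan designed in the JMeter GUI, in non-GUI mode
                    sh 'jmeter -n -t load-test.jmx -l results.jtl'
                }
            }
        }
        post {
            always {
                // publish trends and graphs from the JMeter results file
                perfReport sourceDataFiles: 'results.jtl'
            }
        }
    }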

On a practical level, these are some of the execution policies that we apply:

    • We launch the stability tests for 8 hours, choosing a schedule that prevents the routine load on the service under test from affecting the results; in our case, starting at 18:00 (see the scheduling sketch after this list).
    • We schedule the load or stress tests at any time, because they only run for about 5-10 minutes. Typically, we run them several times a day to identify whether there are significant differences in the results obtained.
    • The performance tests are run individually when we want to evaluate the performance of each service in isolation. And if we want to push things further, Jenkins allows us to configure several nodes to perform parallel executions.
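For the schedules above, this is a sketch of the triggers block each Jenkins pipeline might declare; the cron expressions are illustrative:

    // stability-test pipeline: start every day at 18:00
    triggers { cron('0 18 * * *') }

    // load/stress pipeline: several short runs a day, e.g. every 3 hours
    triggers { cron('H */3 * * *') }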

Now it is time to analyze the results obtained, which we usually consolidate into performance reports. The report is generated thanks to BlazeMeter, which presents a visual interface where we can find and extract various graphs and data tables that let us work through the details of the performance tests carried out. Some interesting parameters to analyze are:

    • Number of requests that did not receive a response (% of errors)
    • Response time (the time elapsed from when the request is made until the response is received)
    • Number of requests per second (throughput)
[Figure: JMeter report with BlazeMeter]

And what have we achieved?

The execution and subsequent analysis of the performance tests served to verify whether the platform services meet the requirements for response time, number of requests per second, or percentage of errors. Once these data have been quantified, a report is issued with recommendations, warnings of risks, and proposals for improvement.

In our case, since we use a Kubernetes-based architecture, services can be scaled out. In other words, if greater capacity is required for a specific service, the number of replicas of the pod that hosts that service can be increased.
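For example, assuming the service is deployed as a standard Kubernetes Deployment (the name below is hypothetical), adding a replica is a single command:

    # scale the (hypothetical) catalog-service deployment to two replicas
    kubectl scale deployment catalog-service --replicas=2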

Thanks to the data obtained from the performance tests, scalability was confirmed: for each added replica of a pod, the service decreased its response time and increased the number of requests it could handle per second proportionally:

    • With 2 replicas, the response time is reduced by half and the number of requests per second (throughput) is doubled.
    • With 3 replicas, the response time is reduced to a third and the throughput is tripled.

On the other hand, with at least a second replica present in critical platform services, we verified that high availability is guaranteed, with a drastic reduction in the number of errors.

We have verified that we have a highly available and scalable application.
What more can we ask for? That it stays that way?
Relax, our continuous integration process is at hand.
A very comforting and highly recommended experience.

If you are interested in taking your first steps in performance testing, you can start with this guide to developing a web load test. We also include another guide to unraveling the mysteries of load reports with BlazeMeter, so you are already on your way...

Charge!

 

Jesus Barrero

Jesus is a QA Tester at Panel Sistemas. You can visit his profile at Analysis, or contact him by e-mail at this address.
