
4 Ideas for Getting Started with DevOps Performance Testing

Published by David Buch on May 17, 2016

If you work at a Fortune 2000 company, chances are good you've already discovered the benefits of DevOps. At least that's what Gartner and IDC argue.

But while most DevOps automation includes development, deployment, and some regression testing, far fewer companies have automated performance testing. My guess is that performance testing is not always straightforward and may be a bit more challenging to automate. For instance, building a solid performance test environment and test data set is a task in itself. It becomes even more difficult if the test environment must support an agile process with frequent rounds of testing.


To help you with the automation of performance testing, here are 4 tips that will get you started.


1 – Identify the performance tests that should be automated

Let's begin by stating that technology itself can often become a barrier to getting started with DevOps, as well as performance testing: endless technologies, protocols, tools, and integrations. Set all these issues aside and start by focusing on the end goal. Namely, which of your performance tests are the best candidates for automation? Specifically, which of your tests are critical for validating or invalidating a build/release from a performance standpoint?

For example, AVG, which provides antivirus software and internet security services, uses WebLOAD to run load tests that verify the stability and response times of its revenue-critical pages. While its performance tests cover other aspects too, it's the response-time tests for business-critical areas that were automated and are run regularly alongside software releases.

At Ellucian, an educational ERP software company, the performance testing group has 150-200 load test use cases, which are run on a daily basis (with ~3,500 concurrent users) to validate multiple product lines. However, it's the tests validating the scalability of its production infrastructure as a service provider that have been integrated into DevOps processes and automatically orchestrate tests against every build in the pipeline.


2 – Save a Baseline

In the context of DevOps, as well as Continuous Integration (CI) and Continuous Delivery (CD), you'll often want to compare a new software build to some historical baseline measurement. The baseline may be a 'hard-coded' set of results from the past or simply the performance stats of the current live production version.

If you detect a degradation in performance relative to the baseline, you can generate alarms and investigate the problem. Alternatively, you may discover that some tuning changes had a positive impact on performance. Regardless, a baseline will help you understand empirically the results of each test run.
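To make this concrete, a baseline comparison can be scripted as a small gate in the pipeline. The following is a minimal sketch with hypothetical metric names and an illustrative 10% tolerance; it is not part of WebLOAD's API:

```python
# Hypothetical sketch: compare a new build's load-test stats against a saved
# baseline and flag regressions. Metric names and the 10% tolerance are
# illustrative assumptions, not WebLOAD features.

TOLERANCE = 0.10  # allow up to 10% degradation before flagging


def compare_to_baseline(baseline: dict, current: dict) -> list:
    """Return (metric, baseline_value, current_value) tuples that regressed.

    Assumes all metrics are "higher is worse" (e.g. response times,
    error rates); invert the comparison for throughput-style metrics.
    """
    regressions = []
    for metric, base_value in baseline.items():
        cur_value = current.get(metric)
        if cur_value is None:
            continue  # metric not measured in this run
        if cur_value > base_value * (1 + TOLERANCE):
            regressions.append((metric, base_value, cur_value))
    return regressions


baseline = {"avg_response_time_ms": 420, "error_rate_pct": 0.5}
current = {"avg_response_time_ms": 510, "error_rate_pct": 0.4}

for metric, base, cur in compare_to_baseline(baseline, current):
    print(f"REGRESSION: {metric}: baseline={base}, current={cur}")
```

A CI job can run such a script after each load test and fail the build whenever the regression list is non-empty.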


3 – Define clear success vs. failure criteria

Before you can automate anything, you need clear pass/fail definitions for each of your tests.

Here are some testing goals you might consider:

  • Response time – a maximum response time per transaction, or an average response time across multiple transactions, above which a test is defined as failed.
  • Error rate – an acceptable error-rate threshold for transactions.
  • Hits, or requests per second – a drop in hits per second indicates that the server can handle fewer requests, which you will want to investigate.
  • Average throughput – the average number of bytes that virtual users receive from the server at any given second can help you spot server issues that result in sending more data than needed.
  • Server CPU or memory – ensuring that CPU or memory usage does not rise above a certain threshold helps avoid potential crashes and slow responses.
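The goals above can be combined into an automated pass/fail gate. Here is a minimal sketch; the threshold values and stat names are illustrative assumptions, not values prescribed by WebLOAD:

```python
# Hypothetical sketch of a pass/fail gate built from the criteria above.
# All threshold values and stat names are illustrative assumptions.

THRESHOLDS = {
    "max_response_time_s": 3.0,   # per-transaction response-time ceiling
    "max_error_rate_pct": 1.0,    # acceptable transaction error rate
    "min_hits_per_sec": 200.0,    # below this, the server handles too few requests
    "max_cpu_pct": 85.0,          # CPU above this risks crashes and slowdowns
}


def evaluate(run_stats: dict) -> tuple:
    """Return (passed, reasons) for one load-test run."""
    failures = []
    if run_stats["max_response_time_s"] > THRESHOLDS["max_response_time_s"]:
        failures.append("response time exceeded")
    if run_stats["error_rate_pct"] > THRESHOLDS["max_error_rate_pct"]:
        failures.append("error rate exceeded")
    if run_stats["hits_per_sec"] < THRESHOLDS["min_hits_per_sec"]:
        failures.append("hits per second dropped")
    if run_stats["cpu_pct"] > THRESHOLDS["max_cpu_pct"]:
        failures.append("CPU threshold exceeded")
    return (len(failures) == 0, failures)


run = {"max_response_time_s": 2.1, "error_rate_pct": 0.3,
       "hits_per_sec": 250.0, "cpu_pct": 91.0}
passed, reasons = evaluate(run)
print("PASS" if passed else "FAIL", reasons)
```

With definitions like these in place, the automation simply maps a failed gate to a failed build.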


4 – Agree on team roles and processes for results analysis

By integrating performance testing into DevOps, you're expanding the involvement of additional groups and individuals who were previously not part of load testing. For example, if a software build is not pushed to the live production system because of a performance test failure, you can be sure that stakeholders outside the performance test group will be interested in the details.

This means you should define the processes, tools, and mechanisms by which multiple groups can access test results, receive reports, and possibly drill down into the details. Performance tests involve a lot of data, so you'll want to define clear procedures for how to provide access and distribute information.

[Figure: WebLOAD Web Dashboard]

The WebLOAD Web Dashboard above allows multiple stakeholders to share predefined results views with warnings about performance thresholds.



Dudu has led R&D teams in several high-tech companies. Prior to RadView, Dudu was VP of R&D at Softlib and Brightinfo, R&D Manager at HP Software, and Director of R&D at Mercury Interactive. Dudu holds a BA magna cum laude in computer science and economics from Bar Ilan University and is a graduate of MAMRAM (the Israeli Army Computer Corps).