7 Performance Testing Mistakes You Want to Avoid

Performance testing is one of the last pre-production links in the development chain. This makes it tempting to cut corners, rush results, and just get testing underway so you can move forward to launch.

However, there’s no argument that performance testing is a highly complex undertaking. Plan incorrectly, build scenarios that don’t accurately simulate a real production environment, or overload your load generators – and at best you’ve lost the time, money and resources involved in re-running the test. At worst, you haven’t realistically load tested your system – and your long-anticipated launch can turn into a disaster.

Here’s a summary of some of the most common performance testing mistakes you should avoid. The full details are in our free ebook (see link at end of article).

Not defining clear goals

Without clearly defined, quantifiable testing goals, load testing is a guessing game, at best.

Load testing goals need to be clearly defined based on business requirements before you run and measure your test scenarios. Common testing goals include the number of users the system should be able to handle per scenario, the response time per activity, or the throughput (volume of data) you expect the system to handle.
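One way to make such goals concrete is to express them as machine-checkable thresholds rather than vague expectations. The sketch below is illustrative only – the metric names and limits are hypothetical, not taken from WebLOAD:

```python
# Hypothetical goal thresholds; the names and limits are illustrative.
GOALS = {
    "max_concurrent_users": 10_000,  # users the system must handle per scenario
    "max_response_time_s": 2.0,      # acceptable response time per activity
    "min_throughput_mbps": 50.0,     # volume of data the system should sustain
}

def check_results(results: dict, goals: dict = GOALS) -> list:
    """Return a list of failed goals for a finished test run."""
    failures = []
    if results["concurrent_users"] < goals["max_concurrent_users"]:
        failures.append("did not reach target user count")
    if results["p95_response_time_s"] > goals["max_response_time_s"]:
        failures.append("response time exceeded goal")
    if results["throughput_mbps"] < goals["min_throughput_mbps"]:
        failures.append("throughput below goal")
    return failures
```

With goals in this form, a test run either passes or produces a specific list of failed criteria, instead of a judgment call.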

Not creating a realistic test environment

A real-life production environment has nearly endless components – servers, databases, hardware, 3rd party tools, integrations, background processes that run periodically and much more. Because of this, a key load testing challenge is simply building a test environment that simulates the actual production environment. Without investing time and thought in creating a realistic environment, you can waste massive efforts testing something that is not real. Read our blog post about building a performance test environment.

Cutting corners

It’s very tempting to compromise when building load test scenarios. Even when facing budget, resource or time constraints – be careful where you cut corners, so as not to compromise the results of your load testing. Two common hazards to avoid are reducing the number of users and skimping on data randomization. If you need to test the system with 100K users, you cannot use 10K users instead. Similarly, if you create only 100 user profiles to simulate 10K user profiles, the system will never be under the same stress, because data caching will absorb the repeated requests.
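The data-randomization point can be sketched in a few lines: generate one distinct profile per simulated user, so caches cannot serve the same records over and over. This is a minimal, hypothetical example, not WebLOAD's data-generation API:

```python
import random
import string

def make_profiles(n: int, seed: int = 0) -> list:
    """Generate n distinct, randomized user profiles so that caching
    does not mask the real load of n unique users."""
    rng = random.Random(seed)  # seeded so the test data is reproducible
    profiles = []
    for i in range(n):
        name = "user_" + "".join(rng.choices(string.ascii_lowercase, k=8)) + str(i)
        profiles.append({"username": name, "email": name + "@example.com"})
    return profiles
```

The numeric suffix guarantees uniqueness even in the unlikely event of a random-string collision, so 10K simulated users really do hit the system as 10K distinct identities.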

Starting too large

The objective of load testing is to simulate a large number of users in a realistic environment. However, experienced load testers understand that beginning with the final load goal inevitably leads to failure. If you start your testing with your final target scenario – say, 10,000 users from five locations, across three device types, with 10 different usage scenarios – you are bound to fail, because it will be next to impossible to isolate errors when they arise.

Instead, start with one user, one location, and one device. Create a testing scenario that grows gradually, and closely monitor for errors at each stage.
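A gradual ramp can be as simple as starting from a single user and doubling the load at each stage until the target is reached. The helper below is a hypothetical sketch of that idea:

```python
def ramp_schedule(target_users: int) -> list:
    """Start from one user and double the load each step until the target
    is reached, so errors can be isolated at the stage they first appear."""
    schedule, users = [], 1
    while users < target_users:
        schedule.append(users)
        users *= 2
    schedule.append(target_users)  # finish exactly at the final load goal
    return schedule
```

For a 10,000-user target this yields 1, 2, 4, … users per stage; monitor for errors at each step before moving to the next.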

Overloading load generators

Load generator machines, while serving your testing goals, can also skew your test results. An overloaded load generator machine can create a situation where no load is generated at all, or load is generated but with skewed results. To detect whether load generators are overloaded, check machine resources such as CPU utilization, memory usage, context switches and transactions per second. Read the dedicated blog post about how to detect overloaded load generators.
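The check itself is straightforward once you sample the machine's resource metrics: flag any generator whose utilization crosses a safe limit. The thresholds below are illustrative, not recommendations – safe limits depend on your load generator hardware:

```python
# Illustrative thresholds; actual safe limits depend on your hardware.
THRESHOLDS = {"cpu_percent": 80.0, "memory_percent": 85.0}

def generator_overloaded(metrics: dict, thresholds: dict = THRESHOLDS) -> bool:
    """Flag a load generator whose sampled resource metrics exceed safe
    limits. The metrics would come from OS-level monitoring (CPU
    utilization, memory usage, etc.) sampled during the test run."""
    return any(metrics.get(k, 0.0) > limit for k, limit in thresholds.items())
```

Results gathered from an overloaded generator should be discarded, since the bottleneck was the test rig rather than the system under test.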

Ignoring system errors

Performance metrics and response time are understandably the key focus in load testing. But some system faults manifest themselves through errors that are not so obvious – rather than an outright crash or a spike in response time. To identify all load-related system vulnerabilities, pay attention to errors such as HTTP 500 responses, wrong data and suspicious behavior, even when response time seems perfect.
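A simple post-run scan can surface these quiet failures: flag every response with a server-error status, and every response whose body doesn't match the expected data. The field names here are hypothetical, standing in for whatever your tool records per request:

```python
def find_suspicious(responses: list) -> list:
    """Flag responses indicating hidden faults: HTTP 5xx status codes, or
    wrong data in the body even when the response time looks fine."""
    flagged = []
    for r in responses:
        if r["status"] >= 500:
            flagged.append((r["url"], "server error"))
        elif r.get("expected_body") is not None and r["body"] != r["expected_body"]:
            flagged.append((r["url"], "wrong data"))
    return flagged
```

Running a scan like this alongside the timing report catches faults that a response-time graph alone would never show.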

Not documenting test runs

Re-running scenarios and comparing results between runs is an integral part of load testing. But when done multiple times, while tweaking and adjusting different parameters, application versions and test settings – it can become a nightmare to track the changes made in each test execution. Make sure to document details such as scenario objectives, the exact settings of the system under test, test environment settings, and the results and conclusions of each scenario run.


To get the full details and tips on how to avoid these mistakes, read our ebook – 7 Load Testing Mistakes You Don’t Want to Make.
