Performance testing is one of the more complex types of software testing. There are many areas where you can err and waste time without getting the insights you’re after.
Here are five best practices from our performance testing experts.
1 – Set performance goals as early as possible
Regardless of the metrics or KPIs you use – response time, throughput, or something else – set specific, measurable performance goals as early as you can. Without concrete goals, your testing is undirected: you can't tell whether a given result is acceptable or a problem.
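As a minimal sketch of what "specific and measurable" looks like in practice, the snippet below checks a hypothetical set of response-time samples against a p95 goal. The sample values and the 500 ms threshold are illustrative, not a recommendation:

```python
# Hypothetical example: response times (ms) collected from one load test run.
response_times_ms = [120, 135, 110, 480, 150, 142, 128, 390, 133, 125]

# The goal, set up front: 95th percentile response time under 500 ms.
P95_GOAL_MS = 500

def p95(samples):
    """Return the 95th-percentile sample (nearest-rank method)."""
    ordered = sorted(samples)
    index = max(0, round(0.95 * len(ordered)) - 1)
    return ordered[index]

result = p95(response_times_ms)
print(f"p95 = {result} ms, goal = {P95_GOAL_MS} ms, "
      f"{'PASS' if result <= P95_GOAL_MS else 'FAIL'}")
```

Encoding the goal as a pass/fail check like this is what makes it possible to run it after every test and know immediately whether you regressed.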
2 – Have the right ratio between test and production hardware size
You can run a capacity test on hardware smaller than production, but production should be no more than three times the size of the largest test environment you used. Beyond that ratio, it's easy to miss problems that only show up at scale.
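The 3x rule above can be made into a simple planning check. This is a sketch; the node counts are hypothetical and "size" could equally be CPU cores, memory, or users:

```python
def extrapolation_ratio_ok(test_size, prod_size, max_ratio=3.0):
    """True if production is no more than max_ratio times the size
    of the largest test environment (the 3x rule of thumb)."""
    return prod_size <= max_ratio * test_size

# Hypothetical: tested on 4 nodes, production runs 10 nodes.
print(extrapolation_ratio_ok(4, 10))   # within the 3x rule
print(extrapolation_ratio_ok(4, 16))   # 16 > 3 * 4: test environment too small
```

If the check fails, the fix is to grow the test environment, not to extrapolate further.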
3 – Plan enough time for scripting
In the project budget, the hardest thing to estimate is how long it will take to get the scripts working correctly.
You're probably not going to bother testing the easy flows. The things you do test are the complicated ones – and they take longer to script and to get working. Plan extra time for scripting.
4 – Run key tests twice
While you're running your testing, if a test sits at a key decision point in a phase, run it twice.
You need to understand what the normal run-to-run variation is. It's easy to run a test once, reach a conclusion, and then discover that the results don't repeat.
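One simple way to quantify run-to-run variation is the relative difference between two identical runs. The throughput numbers and the 5% tolerance below are assumptions for illustration – pick a tolerance that matches your own system's noise:

```python
def relative_difference(a, b):
    """Relative difference between two measurements, as a fraction of the larger."""
    return abs(a - b) / max(a, b)

# Hypothetical throughput (requests/sec) from two identical runs.
run1, run2 = 1840.0, 1795.0

# Assumed rule of thumb: treat runs within ~5% of each other as repeatable.
TOLERANCE = 0.05

diff = relative_difference(run1, run2)
print(f"difference = {diff:.1%}, "
      f"{'repeatable' if diff <= TOLERANCE else 'investigate variance'}")
```

If two supposedly identical runs differ by more than your tolerance, that variance itself is the first thing to investigate before drawing any conclusion from either run.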
5 – Test with a full-size database
As tables get bigger, databases slow down.
Surprising things happen when a particular table gets one more row and functional behavior suddenly changes. This can relate to the way the database optimizes queries, or to a range of other causes. You simply won't detect such problems unless you're testing against a full-size database.
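A minimal sketch of the idea, using SQLite from Python's standard library: seed the test table to a production-like row count before measuring, then inspect the query plan. The table schema and the row count are hypothetical; substitute your real production numbers:

```python
import sqlite3

# Assumption: production holds about 100,000 rows in this table.
PROD_ROW_COUNT = 100_000

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
# Seed to full production size so the optimizer sees realistic data volumes.
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    ((i % 500, float(i)) for i in range(PROD_ROW_COUNT)),
)
conn.commit()

# ANALYZE refreshes the optimizer's statistics; plans can change once
# the table reaches realistic size.
conn.execute("ANALYZE")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = ?", (42,)
).fetchall()
print(plan)  # with no index on customer_id, expect a full table scan
```

The same principle applies to any database engine: the plan the optimizer picks against ten rows tells you little about the plan it will pick against ten million.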