Performance testing is a crucial part of testing any software application. It verifies the application's speed, stability and scalability. This blog will touch upon the various stages of the performance testing life cycle and the best practices for each stage, which will help testing professionals design and deliver efficient test reports.
A test plan is one of the crucial steps of performance testing, ensuring a smooth transition of all performance testing activities throughout the project life cycle. Deviation from the test plan can lead to conflicts in deadlines and deliverables, so it is important to have an effective test plan in place.
Test Planning:
- - Provide the test schedule for smoke tests and baseline/benchmark tests in the test document
- - Explicitly call out all statements that are derived from assumptions
- - Get the test plan reviewed by senior management and approved by the client before proceeding to testing
- - Set client expectations early on to avoid any confusion
Want to reduce scripting effort? Listed below are a few best practices:
- - Acquire user account details with the same permission level as the end users; testing with admin accounts or accounts with additional privileges may cause problems when validating the scripts against live accounts
- - Always keep a copy of the initial/raw version of the script to refer back to whenever needed.
- - Correlate all values that appear to be dynamic, such as Unix timestamps.
- - Parameterize all user input data in the flow; a .csv file format is recommended.
- - Declare the URL and ThinkTime as global variables, so that a change in the URL requires minimal scripting effort
- - Always implement context checks and error handling for every page of the application
- - Validate scripts for multiple iterations and multiple user accounts.
- - Always follow naming conventions in scripts for better readability.
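To make the scripting guidelines above concrete, here is a minimal Python sketch. The file name `users.csv`, the `session_id` field and the regex pattern are all hypothetical examples, not a specific tool's API; it simply illustrates CSV-driven parameterization, correlation of a dynamic value and randomized think time:

```python
import csv
import random
import re
import time

# Globals, per the guideline above: a URL change touches only one line.
BASE_URL = "https://app.example.com"   # hypothetical URL
THINK_TIME_RANGE = (2, 5)              # seconds

def load_users(path):
    """Parameterize user input data from a .csv file (one row per virtual user)."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def correlate_session_id(response_body):
    """Correlate a dynamic value: extract a server-generated token from a response."""
    match = re.search(r'name="session_id" value="([^"]+)"', response_body)
    return match.group(1) if match else None

def think():
    """Random think time to emulate realistic end-user pacing."""
    time.sleep(random.uniform(*THINK_TIME_RANGE))

# Example with a canned response body instead of a live request:
body = '<input name="session_id" value="abc123">'
print(correlate_session_id(body))  # abc123
```

The same pattern applies whichever tool you use: LoadRunner's web_reg_save_param and JMeter's Regular Expression Extractor are the tool-native equivalents of the correlation step shown here.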
Now that we have seen how to develop a test plan and design efficient scripts, it is time to execute them.
Test Execution:
- - Gather data requirements in advance and request expected number of accounts from the client.
- - Ensure sufficient privileges for the validated user accounts.
- - Use random ThinkTime in the script to emulate realistic end user behaviour
- - Disable logging during load tests to limit disk writes on load generators
- - Generate load from dedicated load generator machines rather than from the controller/master whenever possible, since the controller must collect results from the load generators and render run-time data during the test
- - Include details such as the project name, number of virtual users and date in the scenario names (Example: LoadTest01_ZenQ_50Users_01Jan14)
- - Validate load generator connectivity before starting the test
- - Conduct a smoke test before executing load tests to validate scripts for multiple users.
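The scenario-naming convention above is easy to enforce with a small helper. This is a sketch; the function name `scenario_name` and its parameters are illustrative, not part of any tool:

```python
from datetime import date

def scenario_name(test_id, project, users, run_date):
    """Build a scenario name encoding test id, project, virtual-user count and date."""
    return f"{test_id}_{project}_{users}Users_{run_date:%d%b%y}"

print(scenario_name("LoadTest01", "ZenQ", 50, date(2014, 1, 1)))
# LoadTest01_ZenQ_50Users_01Jan14
```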
Now it is time to generate reports based on the tests that you have run and present them to the client.
- - Save final scripts within a designated folder with additional back-up (VSS or SVN or Google Drive)
- - Folder structure for all project artefacts should be organized as below:
- -01_RawResults could have the raw results files (Example: .lrr for LoadRunner and .jtl for JMeter)
- -02_Reports could contain the test reports of that particular test
- -03_TestDocs could have the test plan document, user flow documents and final test reports
- - In load test reports, add legends to graphs for enhanced readability.
- - Ensure that all graphs show data points starting from zero and that the scale corresponds to the data collected.
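When building a report from raw results, the key figures can be computed directly from the result file. The sketch below assumes a simplified JMeter-style .jtl CSV with `elapsed` and `success` columns (standard JMeter CSV field names), and uses a simple nearest-rank percentile, not any tool's exact method:

```python
import csv
import statistics
from io import StringIO

def summarize(jtl_csv):
    """Compute average and 90th-percentile response times plus error count
    from JMeter-style CSV results."""
    rows = list(csv.DictReader(StringIO(jtl_csv)))
    elapsed = sorted(int(r["elapsed"]) for r in rows)
    avg = statistics.mean(elapsed)
    p90 = elapsed[int(0.9 * (len(elapsed) - 1))]  # nearest-rank approximation
    errors = sum(1 for r in rows if r["success"] != "true")
    return {"avg_ms": avg, "p90_ms": p90, "errors": errors}

sample = """timeStamp,elapsed,label,success
1,120,Login,true
2,300,Login,true
3,180,Search,false
"""
print(summarize(sample))
```

For real tests, use the analysis features of LoadRunner Analysis or the JMeter report dashboard; a helper like this is mainly useful for quick sanity checks on the raw results before preparing the final report.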
We have seen the various stages of the performance testing life cycle and the best practices for each stage.
Using the above best practices, we have been able to design efficient test cases, execute them and deliver high-quality test reports to the client.