In the days of Agile, where rapid cycles of application development, testing, fixing, and releasing are the norm, sharing a static report for a performance test no longer suffices. A more modern approach to reporting has taken shape. Project managers and business sponsors need more than a report to be presented: they need thorough analysis and conclusions drawn from it. They need to know what went wrong and which component should be examined to resolve the issues observed. They need data to back the test findings, and they expect full value from the tests.

At the same time, there are other stakeholders in the test: the technical team members, who play a critical role in further fine-tuning the application. To satisfy them, one needs to present not just response time numbers and transaction failure rates, but also trends of those metrics across earlier releases, the individual data points behind them, failure points, the data associated with those failures, and information on what was happening on the servers when the failures occurred. In this article, I try to highlight what matters in a performance test result report and how it helps the business, the technical team, and the customers.
Transaction-related items:
For each transaction, the following items should be included in the result report:
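As a minimal sketch of how such per-transaction items can be derived, the snippet below aggregates average response time, 90th percentile, and failure rate from raw samples. The sample data and the `summarize` helper are hypothetical, standing in for whatever results file your load-test tool produces:

```python
from statistics import mean, quantiles

# Hypothetical raw samples: (transaction_name, response_time_ms, passed),
# as they might be exported from a load-test tool's results file.
samples = [
    ("login", 120, True), ("login", 450, True), ("login", 900, False),
    ("search", 200, True), ("search", 260, True), ("search", 240, True),
]

def summarize(samples):
    """Aggregate the per-transaction metrics typically shown in a report."""
    by_txn = {}
    for name, rt, passed in samples:
        by_txn.setdefault(name, []).append((rt, passed))
    report = {}
    for name, rows in by_txn.items():
        times = [rt for rt, _ in rows]
        fails = sum(1 for _, ok in rows if not ok)
        report[name] = {
            "count": len(rows),
            "avg_ms": round(mean(times), 1),
            # last cut point of 10 quantiles = 90th percentile
            "p90_ms": quantiles(times, n=10, method="inclusive")[-1],
            "fail_rate_pct": round(100 * fails / len(rows), 1),
        }
    return report

print(summarize(samples))
```

In a real report these numbers would be tabulated per transaction, release by release.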
Comparison with earlier tests (for earlier releases or the same release):
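One simple way to make such a comparison actionable is to compute the percentage change per transaction and flag regressions beyond a threshold. The data, names, and the 10% threshold below are all illustrative assumptions, not output from any specific tool:

```python
# Hypothetical per-transaction average response times (ms) from two runs,
# e.g. the previous release vs. the current one.
previous = {"login": 480, "search": 250, "checkout": 900}
current = {"login": 520, "search": 210, "checkout": 1350}

def compare_runs(prev, curr, threshold_pct=10.0):
    """Flag transactions whose average response time regressed by more
    than threshold_pct relative to the earlier run."""
    deltas = {}
    for name in curr:
        if name not in prev:
            continue  # new transaction, nothing to compare against
        change = 100.0 * (curr[name] - prev[name]) / prev[name]
        deltas[name] = {
            "prev_ms": prev[name],
            "curr_ms": curr[name],
            "change_pct": round(change, 1),
            "regressed": change > threshold_pct,
        }
    return deltas

for name, delta in compare_runs(previous, current).items():
    print(name, delta)
```

Presenting the trend this way tells the tech team at a glance which transactions degraded between releases rather than leaving them to eyeball two tables.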
Client-side counters: The following client-side graphs, if included, help in the analysis:
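A typical client-side graph is hits per second. As a small sketch, assuming a list of request-completion timestamps exported from the load-test client (the data here is made up), bucketing them per second yields the points for such a graph:

```python
from collections import Counter

# Hypothetical request completion timestamps (epoch seconds) recorded
# on the client side during the test.
request_times = [0.2, 0.7, 1.1, 1.5, 1.9, 2.3]

def hits_per_second(times):
    """Count completed requests in each one-second bucket."""
    return dict(Counter(int(t) for t in times))

print(hits_per_second(request_times))
```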
Server-related counters:
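To connect server activity to observed failures, one approach is to pull the server counters recorded around each failure timestamp. The sketch below assumes CPU samples exported from a monitoring tool and a list of failure times; both datasets and the window size are illustrative:

```python
# Hypothetical monitoring export: (epoch_seconds, cpu_percent) samples,
# plus the timestamps at which transactions failed during the test.
cpu_samples = [(0, 35), (10, 40), (20, 92), (30, 95), (40, 45)]
failure_times = [22, 31]

def cpu_near_failure(samples, t_fail, window=10):
    """Return CPU readings within +/- window seconds of a failure,
    showing what the server was doing when the failure occurred."""
    return [cpu for t, cpu in samples if abs(t - t_fail) <= window]

for t in failure_times:
    print(f"failure at t={t}s, nearby CPU%: {cpu_near_failure(cpu_samples, t)}")
```

The same windowing idea applies to memory, disk queue length, or any other server counter worth correlating with failures.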
Things that matter to the business: