JMeter deviation vs. throughput


Can you give some explanation of how to interpret deviation vs. throughput? Does a deviation of 1000+ mean poor performance of the web application under test? And how can you tell that the application under test performs well? Is it based on the throughput result? How?

Also, which listener is best for tracking the load/performance of a thousand users?

Lastly, is it possible to check the CPU/RAM usage of the server while performing the test?


There is 1 answer

Zubair M Hamdani (best answer):
  • Standard deviation quantifies how much the response time varies around its mean (average). It is not advisable to judge system performance on standard deviation alone; what it really tells you is how much the system fluctuates. The deviation should be minimal, ideally less than 5% of the mean response time (see the sketch after this list).

  • Throughput is defined as the number of requests processed per second.

  • It is better to use throughput as the factor for judging system/application performance. Higher throughput generally means better performance, but this depends on your requirements: for some critical systems, low response time matters more than high throughput. Throughput only states how many concurrent transactions your system can process per second, which can still go hand in hand with high response times; if response time grows beyond a certain limit, the system becomes a candidate for performance tuning. The sketch after this list computes both metrics from the same set of samples.


  • You can use either the Summary Report or the Aggregate Report listener.

  • For CPU/RAM usage, you can use the "jp@gc - PerfMon Metrics Collector" listener from the JMeter Plugins project; note that it needs the matching ServerAgent running on the server being monitored. A bare-bones alternative is sketched below.
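To make the two metrics concrete, here is a minimal sketch in Python (not JMeter code; the response times and the 2-second measurement window are made-up values) showing how the mean, the standard deviation, the "less than 5%" rule of thumb, and the throughput all fall out of the same set of samples:

    import statistics

    # Hypothetical sample: response times (ms) of 10 requests that
    # completed within a 2-second measurement window.
    response_times_ms = [210, 195, 205, 400, 190, 215, 198, 202, 850, 207]
    elapsed_seconds = 2.0

    mean = statistics.mean(response_times_ms)
    # Population standard deviation; JMeter's listeners report a
    # comparable figure.
    stdev = statistics.pstdev(response_times_ms)
    # Throughput = completed requests / elapsed time.
    throughput = len(response_times_ms) / elapsed_seconds

    # Deviation as a percentage of the mean (the rule of thumb above).
    deviation_pct = stdev / mean * 100

    print(f"Mean response time: {mean:.1f} ms")
    print(f"Std deviation:      {stdev:.1f} ms ({deviation_pct:.0f}% of mean)")
    print(f"Throughput:         {throughput:.1f} req/s")

With these made-up numbers the deviation is far above 5% of the mean, driven almost entirely by the two outliers (400 ms and 850 ms), so by the rule of thumb above this run would be flagged for investigation even though the throughput figure alone looks healthy.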
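If installing the PerfMon agent is not an option, a rough stand-in is to poll the operating system yourself on the server while the test runs. Below is a minimal sketch that assumes the third-party psutil package is installed (pip install psutil); the 5-second interval is an arbitrary choice:

    import time
    import psutil

    INTERVAL_SECONDS = 5  # arbitrary sampling interval; stop with Ctrl-C

    while True:
        # cpu_percent(interval=N) blocks for N seconds and returns the
        # average CPU utilisation over that window.
        cpu_pct = psutil.cpu_percent(interval=INTERVAL_SECONDS)
        ram = psutil.virtual_memory()
        print(f"{time.strftime('%H:%M:%S')}  "
              f"CPU {cpu_pct:5.1f}%  "
              f"RAM {ram.percent:5.1f}% ({ram.used / 2**20:.0f} MiB used)")

Redirect the output to a file and you can line its timestamps up against the JMeter results afterwards.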

Hope this helps.