How is throughput calculated when using JMeter slaves?


Given I have a JMeter test script configured for a constant throughput of 200 transactions per minute, and I have two slaves controlled by the JMeter master that execute that script, would the resulting throughput be doubled, or would JMeter share the load between the slaves so that the total stays at 200 TPM?

Cheers, Kai


There are 2 answers

Kai (best answer)

I found the answer on http://jmeter.apache.org/usermanual/remote-test.html:

Note: The same test plan is run by all the servers. JMeter does not distribute the load between servers, each runs the full test plan. So if you set 1000 Threads and have 6 JMeter server, you end up injecting 6000 Threads.
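So the scaling is purely multiplicative. As a minimal sketch (hostnames `slave1`/`slave2` and `plan.jmx` are placeholders), starting a non-GUI distributed run and computing the resulting load would look like this:

```shell
# Hypothetical distributed run: the master pushes plan.jmx to both
# remote servers, and EACH of them executes the full test plan:
#   jmeter -n -t plan.jmx -R slave1,slave2

# Because JMeter does not split the plan, total load scales linearly
# with the number of slaves:
SLAVES=2
TPM_PER_PLAN=200
echo "Total TPM: $(( SLAVES * TPM_PER_PLAN ))"
```

With the 200 TPM plan from the question and two slaves, that works out to 400 TPM in total.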

Dmitri T

JMeter slaves are totally independent beasts: they don't know anything about each other, so each node will produce 200 TPM and you will get 400 TPM in total. Every extra node adds another 200 TPM.

Despite its name, the Constant Throughput Timer doesn't have to be "constant": you can define the target throughput using the __P() function, e.g. ${__P(TPS,200)}, and override the default when you start the test via the -G command-line option (which, unlike -J, passes the property to all remote servers):

jmeter -GTPS=100 -n -r -t ... 

Or even change it while your test is running using the Beanshell server.
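As a sketch of the Beanshell-server route (port 9000, the script file name, and the property name TPS are assumptions for illustration):

```shell
# Assumes JMeter was started with the Beanshell server enabled, e.g.:
#   jmeter -Jbeanshell.server.port=9000 \
#          -Jbeanshell.server.file=extras/startup.bsh -n -r -t plan.jmx
#
# startup.bsh defines a setprop() helper, so a one-line script
# (set_tps.bsh) can change the property while the test is running:
#   setprop("TPS","100");
#
# Send it to the running instance with the client bundled in JMeter's lib/:
java -jar lib/bshclient.jar localhost 9000 set_tps.bsh
```

With the ${__P(TPS,200)} expression above, the timer picks up the new value on its next evaluation.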

See the Apache JMeter Properties Customization Guide to learn more about JMeter properties.