How does NiftyNet handle multi-GPU training?

I'm using NiftyNet to train a CNN on 2 GPUs. As I understand it, each GPU trains independently, since I get two loss values per iteration. Are the results of both GPUs combined at inference time? I used to believe that using multiple GPUs reduces training time, but with NiftyNet that doesn't seem to be the case.

1 Answer

Answered by manza:

Yes, that's correct: each GPU computes its own loss on its own batch, which is why you see two loss values per iteration. In my case it does reduce training time. Note that the effective batch size doubles when you use two GPUs.
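For reference, a minimal config sketch of how two GPUs are typically enabled (this assumes the num_gpus and cuda_devices options in the [SYSTEM] section and batch_size in [NETWORK]; double-check the option names against your NiftyNet version, and treat highres3dnet as a placeholder network name):

    [SYSTEM]
    # make both GPUs visible and build one training tower per GPU
    cuda_devices = 0,1
    num_gpus = 2

    [NETWORK]
    # any network works here; highres3dnet is only a placeholder
    name = highres3dnet
    # per-GPU batch size; with 2 GPUs the effective batch per iteration is 4
    batch_size = 2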

For example, if you set batch size = 2 and train on two GPUs, each GPU processes its own batch of 2, so the effective batch size per iteration is 4.
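To make the mechanism concrete, below is a small NumPy sketch of the synchronous data-parallel pattern described above. It is only a conceptual illustration, not NiftyNet's actual code; the toy linear model, the loss_and_grad helper, and all the numbers are made up. Each "tower" (GPU) computes its own loss and gradient on its own batch, the gradients are averaged, and a single shared set of weights is updated.

    # Conceptual sketch of synchronous data-parallel training:
    # two towers ("GPUs"), one shared weight vector.
    import numpy as np

    rng = np.random.default_rng(0)
    true_w = np.array([1.0, -2.0, 0.5])   # toy target model
    w = np.zeros(3)                       # the single shared model
    lr = 0.1

    def loss_and_grad(w, x, y):
        # mean-squared-error loss and its gradient for one batch
        err = x @ w - y
        return float(np.mean(err ** 2)), 2.0 * x.T @ err / len(y)

    for step in range(3):
        tower_losses, tower_grads = [], []
        for gpu in range(2):              # one batch per "GPU"
            x = rng.normal(size=(2, 3))   # per-GPU batch of 2 -> effective batch 4
            y = x @ true_w
            loss, grad = loss_and_grad(w, x, y)
            tower_losses.append(loss)     # this is why one loss is printed per GPU
            tower_grads.append(grad)
        w -= lr * np.mean(tower_grads, axis=0)  # single update from averaged gradients
        print("step", step, "losses:", [round(l, 3) for l in tower_losses])

In this pattern there is only one shared set of weights, so nothing needs to be combined at inference time; the trained model is loaded as usual, and the multi-GPU setup only affects how gradients were computed during training.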