Calculating loss in the train_epoch function


In the train_epoch function we have three kinds of losses:

  1. loss
  2. batch_loss
  3. train_loss

As I understand it, loss is a tensor, batch_loss is the scalar value of that tensor, and train_loss is the cumulative sum of the batch_loss values. That part is clear to me.
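For orientation, here is a minimal sketch of how the three values typically relate. This is a plain PyTorch-style loop, not the actual AllenNLP source; model, data_loader, and optimizer are placeholder names:

```python
def train_epoch(model, data_loader, optimizer):
    train_loss = 0.0                    # cumulative Python float over the epoch
    for batch in data_loader:
        optimizer.zero_grad()
        loss = model(**batch)["loss"]   # a torch.Tensor attached to the autograd graph
        loss.backward()                 # backprop flows through the tensor
        optimizer.step()
        batch_loss = loss.item()        # detached scalar value of that tensor
        train_loss += batch_loss        # running sum of the batch_loss values
    return train_loss
```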

My question is: why does AllenNLP compute batch_loss per batch rather than the cumulative loss over the batch_group?

Also, I did not understand the need for a batch_group inside an epoch, and a batch inside a batch_group.

My understanding is this: inside an epoch we have batch_groups, and inside each batch_group we have batches. Why is the batch_loss calculated per batch and not per batch_group?


1 Answer

petew (best answer):

> My question is: why does AllenNLP compute batch_loss per batch rather than the cumulative loss over the batch_group?

This is actually a bug, so thanks for pointing that out! There is a PR open now to fix it: https://github.com/allenai/allennlp/pull/4706
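To make the shape of the issue concrete, here is a hypothetical sketch (my illustration, not code from AllenNLP or the PR; forward is a placeholder call) contrasting a per-batch value with per-group accumulation:

```python
batch_group_loss = 0.0
for batch in batch_group:
    loss = forward(batch)            # loss tensor for this batch (placeholder)
    batch_loss = loss.item()         # this value alone covers only a single batch
    batch_group_loss += batch_loss   # summing covers the whole batch_group
train_loss += batch_group_loss
```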

> Also, I did not understand the need for a batch_group inside an epoch, and a batch inside a batch_group.

A batch_group always consists of just a single batch unless num_gradient_accumulation_steps is greater than 1, i.e. unless you're using gradient accumulation, which is a method for getting a larger effective batch size.

See https://medium.com/ai2-blog/tutorial-training-on-larger-batches-with-less-memory-in-allennlp-1cd2047d92ad for an example.
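As a rough sketch of the idea (plain PyTorch, not AllenNLP's trainer; batch_groups and model are assumed names), gradient accumulation runs the forward and backward pass per batch but steps the optimizer once per batch_group:

```python
def train_epoch(model, batch_groups, optimizer):
    train_loss = 0.0
    for batch_group in batch_groups:        # one group = num_gradient_accumulation_steps batches
        optimizer.zero_grad()
        for batch in batch_group:
            # scale so the summed gradients match one large batch
            loss = model(**batch)["loss"] / len(batch_group)
            loss.backward()                 # gradients accumulate in the .grad buffers
            train_loss += loss.item()
        optimizer.step()                    # a single update for the whole group
    return train_loss
```

With, say, a batch size of 8 and num_gradient_accumulation_steps of 4, each optimizer step reflects 32 examples (the effective batch size) while only 8 examples occupy memory at a time.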