Mini-batch size during prediction


I am using an LSTM model. I understand what mini-batch size means with respect to training the model: it is related to updating the gradient once per batch rather than after every sample. But what does mini-batch size mean during the prediction phase? I can't understand the role of batch size during prediction. Can changing it impact my results?


There are 2 answers

Dr. Snoopy On

The concept of batch is more general than just computing gradients. Most neural network frameworks allow you to input a batch of images to your network, and they do this because it is more efficient and easily parallelizable to GPUs.

Increasing or decreasing the batch size for prediction generally affects only computational efficiency, not the results. Only in the case of a stateful model, such as an LSTM that carries its states across batches (not the normal LSTM), would the results change with the batch size.
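To illustrate why batch size doesn't change predictions for a stateless model, here is a minimal numpy sketch (a stand-in dense "model" with fixed weights, not an actual LSTM): prediction is just a forward pass, so splitting the same inputs into different batch sizes yields identical outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# A stand-in "trained model": fixed weights, so prediction is deterministic.
W = rng.normal(size=(8, 3))
b = rng.normal(size=3)

def predict(x):
    # Forward pass only -- no gradients, no weight updates.
    return np.tanh(x @ W + b)

X = rng.normal(size=(16, 8))  # 16 samples to predict

# Predict with different "batch sizes" by splitting the input.
out_full   = predict(X)                                   # batch size 16
out_halves = np.vstack([predict(X[:8]), predict(X[8:])])  # batch size 8
out_single = np.vstack([predict(x[None]) for x in X])     # batch size 1

# The results are identical regardless of how the samples are batched;
# only throughput differs (larger batches parallelize better on a GPU).
assert np.allclose(out_full, out_halves)
assert np.allclose(out_full, out_single)
print("identical across batch sizes:", np.allclose(out_full, out_single))
```

The same holds for a standard (stateless) LSTM, because each sample in the batch is processed with its own fresh hidden state.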

kerastf On

Batch size and similar hyperparameters are only related to learning. After your model has learned (trained), it just saves the weights. While testing or predicting, it simply uses the saved weights to make the prediction.

By default a vanilla LSTM resets the cell states after each batch, but you can change that: you can make it update states only after an epoch, or even maintain state across all batches.
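The state-resetting behavior above is why a stateful model's predictions can depend on batch boundaries. A minimal numpy sketch with a simple recurrent cell (an RNN rather than a full LSTM, kept small for clarity): resetting the state at each batch boundary changes the result, while carrying the state across batches matches a single full-sequence pass.

```python
import numpy as np

rng = np.random.default_rng(1)

# Minimal recurrent cell (a plain RNN, not a full LSTM, for brevity).
Wx = rng.normal(size=(4, 4)) * 0.5
Wh = rng.normal(size=(4, 4)) * 0.5

def run_sequence(seq, h):
    """Run the cell over the timesteps in seq, returning the final state."""
    for x in seq:
        h = np.tanh(x @ Wx + h @ Wh)
    return h

X = rng.normal(size=(6, 4))  # one long sequence of 6 timesteps

# "Stateless": the state is reset to zero at the start of each batch (chunk).
h_reset_1 = run_sequence(X[:3], np.zeros(4))  # first chunk
h_reset_2 = run_sequence(X[3:], np.zeros(4))  # second chunk: state was reset

# "Stateful": the final state of chunk 1 seeds chunk 2.
h_state_1 = run_sequence(X[:3], np.zeros(4))
h_state_2 = run_sequence(X[3:], h_state_1)    # state carried across batches

# Running the whole sequence in one pass matches the stateful chunking...
h_full = run_sequence(X, np.zeros(4))
assert np.allclose(h_full, h_state_2)
# ...but not the stateless one, whose output depends on where batches split.
assert not np.allclose(h_full, h_reset_2)
```

This is the sense in which batch size can change results for a stateful model: it moves the points at which the state is (or is not) reset.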