PyTorch: how to change the batch size during training?


I want to change the batch size during the training loop. I have tried re-instantiating a new DataLoader with a different `batch_size` parameter, but re-creating the loader each time costs noticeable time.
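For reference, this is roughly what I do now (`train_dataset` and `new_batchsize` are stand-ins for my real objects):

    from torch.utils.data import DataLoader

    # Throw away the old loader and build a new one just to change the
    # batch size. This also re-creates the worker processes, which is
    # where the time goes when num_workers > 0.
    dataloader = DataLoader(train_dataset, batch_size=new_batchsize, shuffle=True)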

Instead, how can I change the batch size inside the loop, like this:

    for batch, (X, y) in enumerate(dataloader):
        # do something here, e.g. run a training step
        dataloader.setBatchsize(new_batchsize)  # this is what I want to do

In the paper Semi-Dynamic Load Balancing: Efficient Distributed Learning in Non-Dedicated Environments, the authors say they change the batch size via a custom DataIter and BatchSampler, but I have no idea how to implement that.
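Based on my (possibly wrong) understanding of custom batch samplers, I sketched the following. `VariableBatchSampler` and its `set_batch_size` method are names I made up, not PyTorch APIs. Is this the right direction?

    import torch
    from torch.utils.data import DataLoader, Sampler

    class VariableBatchSampler(Sampler):
        """Yields lists of indices; the batch size can be changed mid-epoch."""

        def __init__(self, dataset_len, batch_size):
            self.dataset_len = dataset_len
            self.batch_size = batch_size

        def set_batch_size(self, batch_size):
            # My own method, not part of the PyTorch API.
            self.batch_size = batch_size

        def __iter__(self):
            indices = torch.randperm(self.dataset_len).tolist()
            start = 0
            while start < self.dataset_len:
                # self.batch_size is re-read on every step, so a call to
                # set_batch_size takes effect on the next yielded batch.
                yield indices[start:start + self.batch_size]
                start += self.batch_size

    sampler = VariableBatchSampler(len(train_dataset), batch_size=32)
    # batch_sampler replaces batch_size/shuffle/drop_last on the DataLoader.
    dataloader = DataLoader(train_dataset, batch_sampler=sampler, num_workers=0)

    for batch, (X, y) in enumerate(dataloader):
        # ... training step ...
        if batch == 10:
            sampler.set_batch_size(64)  # next batch should have 64 samples

With `num_workers=0` the sampler is consumed lazily, so I believe the change applies on the very next batch; I am not sure how prefetching with `num_workers > 0` would delay the effect.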

Any guidance from more experienced users would be much appreciated.
