Directly update the optimizer learning rate


I have a specific learning rate schedule in mind. It is based on the epoch, but it differs from the standard schedulers I am aware of, such as StepLR.

Is there something that would perform the equivalent to:

optimizer.set_lr(lr)

or

optimizer.set_param('lr', lr)

I would then simply invoke that method at the end of each epoch (or possibly even more frequently).
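For illustration, here is roughly the usage I have in mind (num_epochs, train_one_epoch and my_schedule are placeholders from my own training loop; set_lr is the method I wish existed):

    for epoch in range(num_epochs):
        train_one_epoch(model, optimizer)   # placeholder for my training loop
        lr = my_schedule(epoch)             # my custom epoch-based schedule
        optimizer.set_lr(lr)                # the kind of call I am looking for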

Context: I am using the Adam optimizer like so:

    optimizer = torch.optim.Adam(model.parameters(), lr=LrMax, weight_decay=decay) # , betas=(args.beta1, args.beta2)

Update: I found this information at https://discuss.pytorch.org/t/change-learning-rate-in-pytorch/14653:

for param_group in optimizer.param_groups:
    param_group['lr'] = lr

Is there a way to ascertain that the Adam optimizer being used is actually employing the new learning rate?
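For example, would a sanity check like the one below be reliable, i.e. does Adam actually read this value on each step?

    for param_group in optimizer.param_groups:
        assert param_group['lr'] == lr, "optimizer is not using the new learning rate"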


1 Answer

Accepted answer, by Sergii Dymchenko

You can do it this way:

for param_group in optimizer.param_groups:
    param_group['lr'] = lr
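Adam reads 'lr' from its param group on every call to step(), so the new value is used from the next step onward. As a quick, self-contained sketch (the toy model and the LrMax/decay values are just stand-ins for your setup):

    import torch
    import torch.nn as nn

    model = nn.Linear(10, 1)        # toy model, stands in for yours
    LrMax, decay = 1e-3, 1e-5       # stand-ins for your values
    optimizer = torch.optim.Adam(model.parameters(), lr=LrMax, weight_decay=decay)

    def set_lr(optimizer, lr):
        """Overwrite the learning rate on every param group."""
        for param_group in optimizer.param_groups:
            param_group['lr'] = lr

    set_lr(optimizer, 1e-4)
    print(optimizer.param_groups[0]['lr'])   # -> 0.0001, used by the next step()

If your schedule is a pure function of the epoch, torch.optim.lr_scheduler.LambdaLR is another option: it multiplies the initial lr by whatever your function returns, and you call scheduler.step() once per epoch instead of editing param_groups by hand.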