Get current learning rate when using ReduceLROnPlateau


I am using ReduceLROnPlateau to adjust the learning rate during training of a PyTorch model. ReduceLROnPlateau does not inherit from LRScheduler and does not implement the get_last_lr method, which is PyTorch's recommended way of getting the current learning rate when using a learning rate scheduler.

How can I get the learning rate when using ReduceLROnPlateau?

Currently I am doing the following, but I am not sure whether this is rigorous and correct:

lr = optimizer.state_dict()["param_groups"][0]["lr"]

1 Answer

Answered by Shai:

You can skip the state_dict of the optimizer and access the learning rate directly:

optimizer.param_groups[0]["lr"]
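A minimal sketch of this in a training loop (the model, optimizer settings, and the constant placeholder validation loss are illustrative assumptions, not from the question):

```python
import torch

# Illustrative setup: a tiny model and SGD optimizer.
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.5, patience=2
)

for epoch in range(10):
    val_loss = 1.0  # placeholder; normally computed from a validation pass
    scheduler.step(val_loss)
    # Read the current learning rate directly from the optimizer's
    # param_groups -- this is what ReduceLROnPlateau mutates in place.
    lr = optimizer.param_groups[0]["lr"]
    print(f"epoch {epoch}: lr={lr}")
```

Because ReduceLROnPlateau updates `optimizer.param_groups[*]["lr"]` in place, reading it this way always reflects the rate that will be used on the next optimizer step. Note that if the optimizer has multiple parameter groups, each group can carry its own learning rate, so index accordingly.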