Is there a way to monitor the optimizer's steps in PyTorch?


Consider that you are using a PyTorch optimizer such as torch.optim.Adam(model_parameters).

So in your training loop you will have something like:

optimizer = torch.optim.Adam(model_parameters)

# inside the training loop:
loss.backward()
optimizer.step()
optimizer.zero_grad()

Is there a way to monitor what steps your optimizer is taking? I want to make sure I am not sitting on a flat region of the loss surface, taking no steps because the gradients are null. Maybe checking the learning rate would be a solution?
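
For concreteness, here is a rough sketch of the kind of monitoring I have in mind (the tiny Linear model and the random data are just placeholders): snapshot each parameter before optimizer.step() and print how far it actually moved.

import torch

model = torch.nn.Linear(10, 1)                      # placeholder model
optimizer = torch.optim.Adam(model.parameters())
loss_fn = torch.nn.MSELoss()
x, y = torch.randn(32, 10), torch.randn(32, 1)      # placeholder data

optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()

# snapshot the parameters, take the step, then measure the actual update
before = [p.detach().clone() for p in model.parameters()]
optimizer.step()
for (name, p), old in zip(model.named_parameters(), before):
    print(name, "step size:", (p.detach() - old).norm().item())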


1 answer

Best answer, by White:

Answering my own question here.

The best practice in PyTorch is to check the gradients of the leaf tensors. If a tensor's grad is None while its is_leaf attribute is True, something is obviously buggy.
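
As a minimal sketch (assuming model is an ordinary nn.Module and loss.backward() has already been called), you can loop over the named parameters, which are all leaf tensors, and inspect their grad attribute:

for name, p in model.named_parameters():
    if p.grad is None:
        print(name, ": grad is None -- the parameter never received a gradient")
    elif torch.all(p.grad == 0):
        print(name, ": gradient is exactly zero at this point (flat region?)")
    else:
        print(name, ": grad norm =", p.grad.norm().item())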

torch.nn.Parameter(your_tensor_here) is especially confusing in this regard, as a tensor needs to be wrapped in torch.nn.Parameter to be successfully updated by the optimizer. I would advise against relying on tensor.requires_grad_(True), as such a tensor is not collected by module.parameters() and therefore never reaches the optimizer; only declare your parameters as above.
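
For illustration, a minimal sketch of that registration issue (the module and attribute names are made up): a tensor wrapped in torch.nn.Parameter shows up in module.parameters() and thus gets updated by the optimizer, while a plain tensor with requires_grad_(True) does not.

import torch

class Scaler(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # registered as a parameter -> returned by parameters(), updated by the optimizer
        self.scale = torch.nn.Parameter(torch.ones(1))
        # NOT registered: requires grad, but parameters() never sees it
        self.shift = torch.zeros(1).requires_grad_(True)

    def forward(self, x):
        return self.scale * x + self.shift

module = Scaler()
print([name for name, _ in module.named_parameters()])   # prints ['scale'] only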