How to get the learning rate of the AdamW optimizer (using MultiOptimizer)


I am using the AdamW optimizer with two different learning rates: one for the pre-trained layer and the other for the custom layers.

import tensorflow_addons as tfa
lr = 1e-3
wd = 1e-4 * lr
optimizers = [
    tfa.optimizers.AdamW(learning_rate=pre_trained_layer_lr, weight_decay=wd),
    tfa.optimizers.AdamW(learning_rate=lr, weight_decay=wd),
]
optimizers_and_layers = [
    (optimizers[0], base_model.layers[0]),
    (optimizers[1], base_model.layers[1:]),
]

optimizer = tfa.optimizers.MultiOptimizer(optimizers_and_layers)

Now I want to visualize these learning rates during model training. Below is the code that I am using:

from keras import backend as K
from keras.callbacks import TensorBoard

class LRTensorBoard(TensorBoard):
    # add other arguments to __init__ if you need
    def __init__(self, log_dir, **kwargs):
        super().__init__(log_dir=log_dir, **kwargs)

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        logs.update({'lr': K.eval(self.model.optimizer.lr)})
        super().on_epoch_end(epoch, logs)


# Using the callback class in model.fit
model.fit(...., callbacks=[LRTensorBoard(path)])

But model.optimizer.lr does not exist, since this attribute is not present on the optimizer above. I found some information about the optimizer using

model.optimizer.optimizer_specs[0]

but I am not able to find the different learning rates associated with this optimizer.

How can I get the learning rates for the pre-trained layer and the custom layers when using the AdamW optimizer?


1 Answer

Answered by Harry Potter:

model.optimizer.optimizer_specs is a list of dictionaries containing information for each of your optimizers. You can access your first optimizer object via model.optimizer.optimizer_specs[0]['optimizer']. From there, you can also access its learning rate via model.optimizer.optimizer_specs[0]['optimizer'].lr.
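
Building on that, here is a minimal sketch of the callback from the question adapted to log the learning rate of every sub-optimizer. The MultiLRTensorBoard name and the lr_0/lr_1 log keys are placeholders of my own choosing; the sketch only assumes the optimizer_specs structure described above.

from tensorflow.keras import backend as K
from tensorflow.keras.callbacks import TensorBoard

class MultiLRTensorBoard(TensorBoard):
    """Logs the learning rate of each sub-optimizer wrapped by tfa's MultiOptimizer."""

    def __init__(self, log_dir, **kwargs):
        super().__init__(log_dir=log_dir, **kwargs)

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        # optimizer_specs is a list of dicts, one per sub-optimizer;
        # the AdamW instance itself sits under the 'optimizer' key.
        for i, spec in enumerate(self.model.optimizer.optimizer_specs):
            logs[f'lr_{i}'] = K.eval(spec['optimizer'].lr)
        super().on_epoch_end(epoch, logs)

# Usage, analogous to the original callback:
# model.fit(...., callbacks=[MultiLRTensorBoard(path)])

With the two AdamW optimizers set up in the question, lr_0 would then show the pre-trained-layer rate and lr_1 the custom-layer rate in TensorBoard.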