TensorFlow: getting gradients, batch-norm and bias


I am training a NN implemented in TensorFlow Keras.

    model.fit(..., callbacks=[..., CustomCallback()])

My objective - during the training get:

  1. Gradients: the parameter gradients used to update the weights, and the gradients propagated through the layers.
  2. Bias
  3. BatchNorm: running mean, running var, Gamma, Beta
  4. The input and target of each batch

I am using keras.callbacks.Callback:

class CustomCallback(keras.callbacks.Callback):
    ...
    def on_train_batch_end(self, batch, logs=None):
        # 1.) Get gradients
        # 2.) Get bias
        # 3.) Get BN: running mean, running var, Gamma, Beta
        # 4.) Input and target

I found how to get weights from self.model. How do I get all the tensors I have mentioned?
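The bias and BatchNorm statistics (items 2 and 3 above) can in fact be read directly from `self.model` inside the callback when running eagerly. A minimal sketch, assuming tf.keras 2.x in eager mode; `InspectCallback` and the `captured` dict are illustrative names, not Keras API:

```python
from tensorflow import keras

class InspectCallback(keras.callbacks.Callback):
    """Illustrative sketch: records bias and BatchNorm stats after each batch."""

    def __init__(self):
        super().__init__()
        self.captured = {}

    def on_train_batch_end(self, batch, logs=None):
        for layer in self.model.layers:
            # 2.) Bias: layers that have one expose it as `layer.bias`
            if getattr(layer, "bias", None) is not None:
                self.captured[layer.name + "/bias"] = layer.bias.numpy()
            # 3.) BatchNorm: running statistics plus the learned scale/shift
            if isinstance(layer, keras.layers.BatchNormalization):
                self.captured[layer.name + "/moving_mean"] = layer.moving_mean.numpy()
                self.captured[layer.name + "/moving_variance"] = layer.moving_variance.numpy()
                self.captured[layer.name + "/gamma"] = layer.gamma.numpy()
                self.captured[layer.name + "/beta"] = layer.beta.numpy()
```

It plugs into training the same way as any callback, e.g. `model.fit(x, y, callbacks=[InspectCallback()])`.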


1 answer

Answered by David (accepted answer):

You can get all the tensor values in the following manner:

class CustomCallback(keras.callbacks.Callback):
    ...
    def on_train_batch_end(self, batch, logs=None):
        # 1.) Append gradient tensor names to fetches_names
        # 2.) Append bias tensor names to fetches_names
        # 3.) Append running mean, running var, Gamma, Beta to fetches_names
        output = session.run(fetches_names, feeds)

fetches_names is a list of strings holding the names of the tensors we wish to read. Reading only the gradients can be done by calling the gradients function from the Keras backend:

from keras import backend as K
grads = K.gradients(loss, input_img)

The caveat with this function is that it modifies the graph.
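In TF 2.x eager mode the same information is available without session fetches or graph edits: a manual training step built on tf.GradientTape yields the parameter gradients (item 1), and since you drive the loop yourself you also hold the batch input and target (item 4). A sketch under those assumptions; `train_step` here is an illustrative helper, not a Keras API:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

def train_step(model, optimizer, loss_fn, x, y):
    """Illustrative manual training step exposing the gradients."""
    with tf.GradientTape() as tape:
        y_pred = model(x, training=True)
        loss = loss_fn(y, y_pred)
    # 1.) Gradients of the loss w.r.t. every trainable variable
    #     (kernels, biases, BatchNorm gamma/beta)
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss, grads

model = keras.Sequential([
    keras.layers.Dense(4, activation="relu"),
    keras.layers.Dense(1),
])
optimizer = keras.optimizers.SGD()
loss_fn = keras.losses.MeanSquaredError()

# 4.) The input and target are in your hands on every step
x = np.random.rand(8, 3).astype("float32")
y = np.random.rand(8, 1).astype("float32")
loss, grads = train_step(model, optimizer, loss_fn, x, y)
```

Unlike K.gradients, the tape is created and discarded per step, so nothing is added to a global graph.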