How to calculate gradients on tensorflow_probability layers?


I would like to calculate the gradients on tensorflow_probability layers using tf.GradientTape(). This is straightforward with a standard layer, e.g., Dense:

import tensorflow as tf

inp = tf.random.normal((2, 5))
layer = tf.keras.layers.Dense(10)

with tf.GradientTape() as tape:
    out = layer(inp)
    loss = tf.reduce_mean(1-out)
grads = tape.gradient(loss, layer.trainable_variables)
print(grads)
[<tf.Tensor: shape=(5, 10), dtype=float32, numpy=
 array([[ 0.04086879,  0.04086879, -0.02974391,  0.04086879,  0.04086879,
          0.04086879, -0.02974391,  0.04086879, -0.02974391, -0.07061271],
        [ 0.01167339,  0.01167339, -0.02681615,  0.01167339,  0.01167339,
          0.01167339, -0.02681615,  0.01167339, -0.02681615, -0.03848954],
        [ 0.00476769,  0.00476769, -0.00492069,  0.00476769,  0.00476769,
          0.00476769, -0.00492069,  0.00476769, -0.00492069, -0.00968838],
        [-0.00462376, -0.00462376,  0.05914849, -0.00462376, -0.00462376,
         -0.00462376,  0.05914849, -0.00462376,  0.05914849,  0.06377225],
        [-0.11682947, -0.11682947, -0.06357963, -0.11682947, -0.11682947,
         -0.11682947, -0.06357963, -0.11682947, -0.06357963,  0.05324984]],
       dtype=float32)>,
 <tf.Tensor: shape=(10,), dtype=float32, numpy=
 array([-0.05, -0.05, -0.1 , -0.05, -0.05, -0.05, -0.1 , -0.05, -0.1 ,
        -0.05], dtype=float32)>]

But if I do the same with tfp.layers.DenseReparameterization, the gradients come back as None:

import tensorflow_probability as tfp

inp = tf.random.normal((2, 5))
layer = tfp.layers.DenseReparameterization(10)

with tf.GradientTape() as tape:
    out = layer(inp)
    loss = tf.reduce_mean(1-out)
grads = tape.gradient(loss, layer.trainable_variables)
print(grads)
[None, None, None]
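
For context, with the default posterior and prior settings the three None entries line up one-to-one with the layer's three trainable variables (the kernel posterior's loc and untransformed scale, plus the bias), which you can confirm with:

# List the variables the tape should be producing gradients for
# (assumes the default mean-field posterior settings).
for v in layer.trainable_variables:
    print(v.name, v.shape)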

Can anyone tell me how to fix this so that the gradients are actually recorded by the tape?


1 Answer

jlapin:

Aha, that's it! I am using tf v2.1.0. Apparently that version does not play well with tensorflow_probability. I will upgrade ASAP. Thank you, gobrewers14.
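
For anyone landing here later, a minimal sketch of the fix under the assumption that upgrading resolves it: on a newer TF release (2.2 or later) paired with the matching tensorflow_probability release (check the TFP release notes for the officially supported pairing), the original snippet should yield real gradients instead of None.

import tensorflow as tf
import tensorflow_probability as tfp

# Verify the installed versions are a supported pairing (assumption:
# TF >= 2.2 with the matching TFP release fixes the None gradients).
print(tf.__version__, tfp.__version__)

inp = tf.random.normal((2, 5))
layer = tfp.layers.DenseReparameterization(10)

with tf.GradientTape() as tape:
    out = layer(inp)
    loss = tf.reduce_mean(1 - out)

# Expect one gradient per trainable variable rather than [None, None, None].
grads = tape.gradient(loss, layer.trainable_variables)
print(grads)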