I need to calculate the gradient of a stored TensorFlow model. I can restore the graph and weights using:
model1 = tf.train.import_meta_graph("models/model.meta")
model1.restore(sess, tf.train.latest_checkpoint("models/"))
# note: do not call sess.run(tf.global_variables_initializer()) after restore –
# it would overwrite the restored weights with fresh initial values
graph = tf.get_default_graph()
weights = graph.get_tensor_by_name("weights:0")
biases = graph.get_tensor_by_name("biases:0")
I have also named my loss function in the original graph, so I can restore it with:
loss = graph.get_operation_by_name("loss")  # for the operation
loss = graph.get_tensor_by_name("loss:0")   # for the tensor
Basically, I want to get the gradient of the loss with respect to a certain input value using tf.gradients(...). My loss is specifically nce_loss (https://www.tensorflow.org/api_docs/python/tf/nn/nce_loss), and I want its gradient with respect to the inputs argument. Concretely, I plug in a new embedding and want the gradient of the loss at that input. However, I can't seem to define my input successfully. If I use:
grads = tf.gradients(loss, loss.inputs)  # here `loss` is the tensor
I get:

ValueError: Name 'loss:0' appears to refer to a Tensor, not a Operation.
How do I define my gradient here?
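For reference, here is a minimal sketch of the pattern I believe should work: both arguments to tf.gradients must be Tensors, so the input has to be fetched by its tensor name (e.g. "input_x:0") rather than taken from a Tensor attribute like .inputs (only Operations have .inputs). The tensor names and tiny graph below are hypothetical stand-ins for the restored model:

```python
import tensorflow.compat.v1 as tf  # TF1-style graph API, also works on TF2 installs
tf.disable_eager_execution()

# Hypothetical minimal graph standing in for the restored model;
# in the real case these tensors come from graph.get_tensor_by_name(...)
# after tf.train.import_meta_graph + restore.
x = tf.placeholder(tf.float32, shape=[None, 2], name="input_x")
w = tf.Variable([[1.0], [2.0]], name="weights")
loss = tf.reduce_sum(tf.matmul(x, w) ** 2, name="loss")

graph = tf.get_default_graph()
loss_t = graph.get_tensor_by_name("loss:0")       # the Tensor, not the Operation
input_t = graph.get_tensor_by_name("input_x:0")   # the input the loss depends on

# tf.gradients wants Tensors on both sides: ys=loss tensor, xs=input tensor
grads = tf.gradients(loss_t, [input_t])

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # feed the "new embedding" as the input value and evaluate the gradient
    g = sess.run(grads, feed_dict={input_t: [[1.0, 1.0]]})
```

Here `loss = (x @ w)^2`, so for x = [1, 1] the gradient is 2 * (x @ w) * w^T = [6, 12]; the same pattern should apply when the loss tensor is the restored nce_loss.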