I've been reading through the TensorFlow documentation and have come to a section describing some of the differences between eager and graph execution. Graph execution is defined (described?) as being non-strict, in that:
"Graph execution only executes the operations necessary to produce the observable effects, which include:
The return value of the function Documented well-known side-effects such as: Input/output operations, like tf.print Debugging operations, such as the assert functions in tf.debugging Mutations of tf.Variable
This behavior is usually known as "Non-strict execution", and differs from eager execution, which steps through all of the program operations, needed or not."
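If I read that list correctly, an op like tf.print should survive the pruning even when nothing consumes its output, while an unconsumed tf.gather should not. A minimal sketch of what I take that to mean (my own example, not from the docs):

import tensorflow as tf

@tf.function
def side_effects_graph(x):
    tf.print("x is", x)  # documented side effect: I'd expect this to be kept
    tf.gather(x, [20])   # result unused, no side effect: I'd expect this to be pruned
    return x

# If I understand the docs, this prints "x is [5]" and raises no error.
print(side_effects_graph(tf.constant([5.0])))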
The example given shows how an "unneeded" operation will not throw an error:
@tf.function
def unused_return_graph(x):
    tf.gather(x, [20])  # unused
    return x

# Only needed operations are run during graph execution. The error is not raised.
print(unused_return_graph(tf.constant([5.0])))
Returns:
tf.Tensor([5.], shape=(1,), dtype=float32)
But,
#@tf.function
def unused_return_graph(x):
    tf.gather(x, [20])  # unused
    return x

# With @tf.function commented out this runs eagerly, so every operation
# executes and the error is raised.
print(unused_return_graph(tf.constant([5.0])))
Raises:
InvalidArgumentError: {{function_node __wrapped__GatherV2_device_/job:localhost/replica:0/task:0/device:CPU:0}} indices[0] = 20 is not in [0, 1) [Op:GatherV2]
tf.gather(x, [20])
looks to me like it is needed to produce "the return value of the function", which in this case would be undefined. Why is this not so? What would be a case where tf.gather was deemed necessary? My guess is sketched below. I suspect that if I understood denotational semantics I wouldn't have this question, but I'm having trouble slogging through it all and am hoping a succinct example might make things clearer.
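For instance, my guess is that "necessary" means the op's output feeds into one of the observable effects, such as the return value. A sketch of what I imagine would count (the function name is mine, not from the docs):

@tf.function
def used_return_graph(x):
    y = tf.gather(x, [20])  # the gather result now flows into the return value
    return y

# If my reading is right, the gather is now "necessary", so I'd expect the
# out-of-range index to raise InvalidArgumentError even under graph execution.
print(used_return_graph(tf.constant([5.0])))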
Later thought: could "unused" mean that no variable values were changed or created? If so, the sketch below is how I'd test it.
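If that's the right reading, then routing the gather result into a tf.Variable mutation should make it "used". Here is the test I have in mind (an untested sketch; the variable lives outside the function so it isn't re-created on each trace):

v = tf.Variable([0.0])

@tf.function
def gather_into_variable(x):
    v.assign(tf.gather(x, [20]))  # gather now feeds a tf.Variable mutation
    return x

# If variable mutations count as observable effects, I'd expect this to raise.
print(gather_into_variable(tf.constant([5.0])))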