tensorflow.py_function fails to temporarily switch to eager execution while in graph mode


I'm not sure if this is a Tensorflow bug or my misunderstanding about what this function is supposed to do, but I can't get tf.py_function to return an EagerTensor while in graph mode. Consequently, calling .numpy() on the output of this function fails.

The issue can be reproduced using the exact example given in the official documentation (https://www.tensorflow.org/api_docs/python/tf/py_function):

import tensorflow as tf

tf.compat.v1.disable_eager_execution()

def log_huber(x, m):
  if tf.abs(x) <= m:
    return x**2
  else:
    return m**2 * (1 - 2 * tf.math.log(m) + tf.math.log(x**2))

x = tf.constant(1.0)
m = tf.constant(2.0)

with tf.GradientTape() as t:
  t.watch([x, m])
  y = tf.py_function(func=log_huber, inp=[x, m], Tout=tf.float32)

dy_dx = t.gradient(y, x)
assert dy_dx.numpy() == 2.0

This generates the following error:

Traceback (most recent call last):
  File "<input>", line 17, in <module>
  File "C:\Users\...\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\framework\ops.py", line 446, in __getattr__
    self.__getattribute__(name)
AttributeError: 'Tensor' object has no attribute 'numpy'

About version

I am running Python 3.8 and Tensorflow v2.9.1.

Any help would be greatly appreciated!


1 Answer

Answered by ClaudiaR

Solution 1 (with eager execution):

In Tensorflow 2, eager execution is enabled by default. If you simply remove the tf.compat.v1.disable_eager_execution() call, the example works as written.

I ran the exact code from the Tensorflow documentation without any problem (the assertion passes). I used Colab with Tensorflow 2.8.2 and Python 3.7.13.

If you still have problems, you could try calling tf.config.run_functions_eagerly(True), but the example should work even without it.
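As a quick sanity check (a minimal sketch of my own, not part of the original answer), you can confirm which execution mode is active and verify that .numpy() works once functions run eagerly:

```python
import tensorflow as tf

# True by default in TF 2.x, unless
# tf.compat.v1.disable_eager_execution() was called earlier.
print(tf.executing_eagerly())

# Force tf.function-decorated code to run eagerly
# (a debugging aid only; it disables graph optimizations).
tf.config.run_functions_eagerly(True)

@tf.function
def square(x):
    # With run_functions_eagerly(True), x arrives as an EagerTensor,
    # so .numpy() is available even inside a tf.function.
    return x ** 2

y = square(tf.constant(3.0))
print(y.numpy())  # 9.0
```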

Solution 2 (without eager execution):

If you want to keep eager execution disabled, you can work with sessions (more info about sessions). Instead of calling .numpy(), call .eval() on your tensor and wrap the evaluation in a session. That's it.

import tensorflow as tf

tf.compat.v1.disable_eager_execution()

def log_huber(x, m):
  if tf.abs(x) <= m:
    return x**2
  else:
    return m**2 * (1 - 2 * tf.math.log(m) + tf.math.log(x**2))

x = tf.constant(1.0)
m = tf.constant(2.0)

# Launch the graph in a session.
sess = tf.compat.v1.Session()

with tf.GradientTape() as t:
  t.watch([x, m])
  y = tf.py_function(func=log_huber, inp=[x, m], Tout=tf.float32)

with sess.as_default():
  dy_dx = t.gradient(y, x)
  assert dy_dx.eval() == 2.0
  print(dy_dx.eval())

sess.close()