I have followed the translation Colab notebook tutorial suggested by Google's tensor2tensor repository. After exporting the model and uploading it to Google's AI Platform for online prediction, I am having trouble making requests to the model.
I believe the input to the translation model should be a tensor of the encoded source text, but I am receiving this error:
TypeError: Object of type 'EagerTensor' is not JSON serializable
def encode(input_str, output_str=None):
    """Input str to features dict, ready for inference"""
    inputs = encoders["inputs"].encode(input_str) + [1]  # add EOS id
    batch_inputs = tf.reshape(inputs, [1, -1, 1])  # Make it 3D.
    return {"inputs": batch_inputs}

enfr_problem = problems.problem(PROBLEM)
encoders = enfr_problem.feature_encoders(DATA_DIR)

encoded_inputs = encode("Some text")
model_output = predict_json('project_name', 'model_name', encoded_inputs, 'version_1')["outputs"]
I've tried converting the tensor to a numpy array, but still no luck. Could someone point me in the right direction?
The problem is that TensorFlow returns an EagerTensor when you do:
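    batch_inputs = tf.reshape(inputs, [1, -1, 1])  # Make it 3D.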
An EagerTensor cannot be converted to JSON. Unfortunately, a 3D numpy array cannot be converted to JSON either, but numpy arrays can easily be converted to plain Python lists with tolist(), and those are JSON serializable. A minimal, self-contained example:
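    import json
    import numpy as np

    arr = np.zeros((1, 3, 1))      # any 3D array, same shape class as your batch_inputs
    # json.dumps(arr) raises: TypeError: Object of type ndarray is not JSON serializable
    json.dumps(arr.tolist())       # works: '[[[0.0], [0.0], [0.0]]]'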
I can't give a tested example for your exact case because your code snippet is not complete enough, but something along these lines
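    # assuming eager execution (which the EagerTensor in your error implies),
    # convert the tensor to a nested Python list before returning it:
    return {"inputs": tf.reshape(inputs, [1, -1, 1]).numpy().tolist()}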
should do the job.