Error 404 Not Found in Postman when calling the predict method on a tensorflow-hub model


I was testing a TensorFlow model in Postman that uses https://tfhub.dev/google/universal-sentence-encoder-multilingual/3 from tensorflow-hub. It works perfectly in a Jupyter notebook without any error, but after sending a POST request that calls the predict method I get this error in Postman:

 "error": "{{function_node __inference_signature_wrapper_133703}} {{function_node __inference_signature_wrapper_133703}} {{function_node __inference__wrapped_model_95698}} {{function_node __inference__wrapped_model_95698}} {{function_node __inference_restored_function_body_51031}} {{function_node __inference_restored_function_body_51031}} [_Derived_]{{function_node __inference___call___6286}} {{function_node __inference___call___6286}} Op type not registered \'SentencepieceOp\' in binary running on 329ddc874964. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.\n\t [[{{node StatefulPartitionedCall}}]]\n\t [[StatefulPartitionedCall]]\n\t [[sequential/keras_layer/StatefulPartitionedCall]]\n\t [[StatefulPartitionedCall]]\n\t [[StatefulPartitionedCall]]"

with a status of 404 Not Found. This is my model:

import tensorflow as tf
import numpy as np
import tensorflow_text  # importing this registers the SentencePiece ops used by the multilingual USE model
import pandas as pd
import random
import tensorflow_hub as hub
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras import losses
from tensorflow.keras import preprocessing
from tensorflow.keras.layers.experimental.preprocessing import TextVectorization
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense, Dropout
from tensorflow.keras.layers import SpatialDropout1D
from tensorflow.keras.layers import Embedding
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

module_url = "https://tfhub.dev/google/universal-sentence-encoder-multilingual/3"
# Import the Universal Sentence Encoder's TF Hub module
hub_layer = hub.KerasLayer(module_url, input_shape=[], dtype=tf.string, trainable=True)
#some data preprocessing
opt = keras.optimizers.Adam(learning_rate=0.001)

model = tf.keras.Sequential()
model.add(hub_layer)
model.add(tf.keras.layers.Dense(32, activation='relu'))
model.add(tf.keras.layers.Dense(16, activation='relu'))
model.add(tf.keras.layers.Dense(1, activation='sigmoid'))
model.layers[0].trainable = False
model.compile(loss='binary_crossentropy', optimizer=opt, metrics=['accuracy'])
model.summary()
history = model.fit(np.array(tweet), np.array(sentiment),
                    validation_split=0.2, epochs=5, batch_size=32)
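
The SavedModel being served was exported from this model; a rough sketch of such an export follows (the export directory and version number are placeholders, not my exact setup):

# Hypothetical export of the trained Keras model as a SavedModel for serving;
# the path and version number below are illustrative placeholders.
export_path = "models/arabtextclasstfhubtest/1"
model.save(export_path, save_format="tf")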

This is the request body in Postman, sent to localhost:.../arabtextclasstfhubtest:predict :

{
    "signature_name": "serving_default",
    "inputs": {
        "keras_layer_input": ["كلامك جميل ورائع"]
    }
}
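
For reference, the same predict call can be reproduced outside Postman, for example with the Python requests library (the port and full model path below are placeholders, since the real URL is truncated above):

import requests

payload = {
    "signature_name": "serving_default",
    "inputs": {"keras_layer_input": ["كلامك جميل ورائع"]},
}
# Host, port, and model name are placeholders for the real serving URL.
resp = requests.post(
    "http://localhost:8501/v1/models/arabtextclasstfhubtest:predict",
    json=payload,
)
print(resp.status_code, resp.text)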

I would like to know whether this is a bug in tensorflow-hub or how to fix the problem. Thank you!


1 Answer

Answered by arnoegw:

It appears you are exporting a SavedModel and loading it into a server binary that answers the Postman requests. That server binary needs to link in the 'SentencepieceOp' from tensorflow_text, because your SavedModel uses it (as it should).
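
The stock TensorFlow Serving binary does not ship the tensorflow_text ops, so the usual options are either to build a serving binary that links them in, or to serve the SavedModel from a Python process, where importing tensorflow_text registers the op before the model is loaded. A minimal sketch of the latter, assuming an illustrative model path:

import tensorflow as tf
import tensorflow_text  # noqa: F401 -- importing this registers SentencepieceOp

# The path is an illustrative assumption; point it at your exported SavedModel.
loaded = tf.saved_model.load("models/arabtextclasstfhubtest/1")
infer = loaded.signatures["serving_default"]

# The input name matches the "keras_layer_input" key from the Postman request.
print(infer(keras_layer_input=tf.constant(["كلامك جميل ورائع"])))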