I have a deployed model on Google AI Platform, as evidenced by this picture:
This model was built with Keras and saved using the save_model command with default options.
When I go to test the model in the console to see if it works, I put in a sample JSON request like so:
{"instances": [
{"values": ["This is my first sentence"], "key": 1}
]}
I'm following the examples given at this URL: https://cloud.google.com/ai-platform/prediction/docs/online-predict?hl=en_US#formatting_your_input_for_online_prediction
When I input the sample JSON request into the evaluator, I get the following error message:
{"error": "{\n \"error\": \"Failed to process element: 0 key: values of 'instances' list. Error: Invalid argument: JSON object: does not have named input: values\"\n}"}
After some digging, it looks like the issue is with TF Serving, because the JSON input is expecting a key other than "values" to make its prediction.
My problem is I don't know how to access what that should be.
My best attempt was to reload the model locally and call the get_config() method to see if there was anything useful there.
That returned the following dictionary:
{'name': 'functional_1',
'layers': [{'class_name': 'InputLayer',
'config': {'batch_input_shape': (None, 1),
'dtype': 'string',
'sparse': False,
'ragged': False,
'name': 'input_1'},
'name': 'input_1',
'inbound_nodes': []},
{'class_name': 'TextVectorization',
'config': {'name': 'text_vectorization',
'trainable': True,
'dtype': 'string',
'max_tokens': 12500,
'standardize': 'lower_and_strip_punctuation',
'split': 'whitespace',
'ngrams': None,
'output_mode': 'int',
'output_sequence_length': 250,
'pad_to_max_tokens': True},
'name': 'text_vectorization',
'inbound_nodes': [[['input_1', 0, 0, {}]]]},
{'class_name': 'Embedding',
'config': {'name': 'embedding',
'trainable': True,
'batch_input_shape': (None, None),
'dtype': 'float32',
'input_dim': 12500,
'output_dim': 25,
'embeddings_initializer': {'class_name': 'RandomUniform',
'config': {'minval': -0.05, 'maxval': 0.05, 'seed': None}},
'embeddings_regularizer': None,
'activity_regularizer': None,
'embeddings_constraint': None,
'mask_zero': False,
'input_length': None},
'name': 'embedding',
'inbound_nodes': [[['text_vectorization', 0, 0, {}]]]},
{'class_name': 'Flatten',
'config': {'name': 'flatten',
'trainable': True,
'dtype': 'float32',
'data_format': 'channels_last'},
'name': 'flatten',
'inbound_nodes': [[['embedding', 0, 0, {}]]]},
{'class_name': 'Dense',
'config': {'name': 'dense',
'trainable': True,
'dtype': 'float32',
'units': 50,
'activation': 'relu',
'use_bias': True,
'kernel_initializer': {'class_name': 'GlorotUniform',
'config': {'seed': None}},
'bias_initializer': {'class_name': 'Zeros', 'config': {}},
'kernel_regularizer': None,
'bias_regularizer': None,
'activity_regularizer': None,
'kernel_constraint': None,
'bias_constraint': None},
'name': 'dense',
'inbound_nodes': [[['flatten', 0, 0, {}]]]},
{'class_name': 'Dense',
'config': {'name': 'dense_1',
'trainable': True,
'dtype': 'float32',
'units': 50,
'activation': 'relu',
'use_bias': True,
'kernel_initializer': {'class_name': 'GlorotUniform',
'config': {'seed': None}},
'bias_initializer': {'class_name': 'Zeros', 'config': {}},
'kernel_regularizer': None,
'bias_regularizer': None,
'activity_regularizer': None,
'kernel_constraint': None,
'bias_constraint': None},
'name': 'dense_1',
'inbound_nodes': [[['dense', 0, 0, {}]]]},
{'class_name': 'Dense',
'config': {'name': 'dense_2',
'trainable': True,
'dtype': 'float32',
'units': 1,
'activation': 'sigmoid',
'use_bias': True,
'kernel_initializer': {'class_name': 'GlorotUniform',
'config': {'seed': None}},
'bias_initializer': {'class_name': 'Zeros', 'config': {}},
'kernel_regularizer': None,
'bias_regularizer': None,
'activity_regularizer': None,
'kernel_constraint': None,
'bias_constraint': None},
'name': 'dense_2',
'inbound_nodes': [[['dense_1', 0, 0, {}]]]}],
'input_layers': [['input_1', 0, 0]],
'output_layers': [['dense_2', 0, 0]]}
I was hoping some of the info I was looking for would be contained here, and I've tried things like 'functional_1' and 'input_1' as the keys to use, but with no success.
I've also tried the original column used for X in the dataset, but that did not work.
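For context, this is roughly how I pulled candidate key names like input_1 out of that config dict (a minimal sketch against a trimmed copy of the get_config() output above; whether the serving signature actually uses the same name is exactly what I haven't been able to verify):

```python
# Trimmed copy of the get_config() output shown above -- just the
# fields needed to pull out the graph's input layer names.
config = {
    "name": "functional_1",
    "input_layers": [["input_1", 0, 0]],
    "output_layers": [["dense_2", 0, 0]],
}

# Each 'input_layers' entry is [layer_name, node_index, tensor_index];
# the layer name itself is the first element.
input_names = [entry[0] for entry in config["input_layers"]]
print(input_names)  # ['input_1']
```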
How do I access the metadata for tf.Serving to know what to put into my JSON request?
The format information for your input must be contained within the definition of your Keras model.
For example, in the Quickstart of Training and prediction with Keras, a model is created and trained using information from the United States Census Income Dataset. The code for this example is in this GitHub repo.
Inside util.py, the dataset information is prepared, and only certain columns are used. Then, the preprocessing leaves the data like this (for the training stage):
-0.48454606533050537, 3.0, -0.030303768813610077, 2.0, 6.0, 0.0, 4.0, -0.1447920799255371, -0.2171318680047989, 0.6115681529045105, 38.0
Therefore, the input for the prediction must comply with the same format. Also note that, if you want to use the test feature from the UI of your model within the Cloud Console, it is necessary to wrap the input in a JSON object under an "instances" key.
However, according to this documentation, since each data instance is a vector of floating-point values (in this case), the JSON must look like this:
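The original screenshot is not available here, but based on the preprocessed row shown above, the request body would presumably look like this, with each instance being a bare vector of floats rather than a named object:

```json
{"instances": [
  [-0.48454606533050537, 3.0, -0.030303768813610077, 2.0, 6.0, 0.0, 4.0, -0.1447920799255371, -0.2171318680047989, 0.6115681529045105, 38.0]
]}
```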