TensorFlow Serving POST API calling


How to handle this error: "Failed to get input map for signature: serving_default"

This error typically occurs when there is a mismatch between the format of the input data provided to the TensorFlow Serving API and the expected input format defined in the model's signature. To resolve this issue, follow these steps:

  1. Check Model Signature: Ensure that you are providing the input data in the correct format as expected by the model's signature. The input format (e.g., data type, shape) should match the signature's input requirements.

  2. Data Preprocessing: If the model expects specific preprocessing for input data (e.g., resizing, normalization), make sure you preprocess the image accordingly before sending it to the model.

  3. JSON Request Format: Ensure that the JSON request you're sending to TensorFlow Serving is properly formatted. The instances field should contain a list of inputs, where each input matches the expected format of the model's input signature.

  4. Check Model Version: If you're using a specific version of the model, make sure you specify the correct version in the API request (url variable).
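Step 3 can be sketched concretely. Below is a minimal payload builder; the input key "input_1" and the tiny 4x4x3 shape are assumptions for illustration, so adjust both to whatever your model's signature actually declares:

```python
import json
import numpy as np

# A tiny stand-in for a preprocessed image (shape and values are
# placeholders; a real model might expect e.g. 256x256x3 floats)
image = np.zeros((4, 4, 3), dtype=np.float32)

# The REST API expects numeric tensors as nested JSON lists; each entry
# in "instances" is one example (no batch dimension), keyed by the
# input name from the model's signature
payload = {
    "signature_name": "serving_default",
    "instances": [{"input_1": image.tolist()}],
}

body = json.dumps(payload)  # this is the request body to POST
```

Note that `tolist()` converts the NumPy array into plain Python lists, which is what makes the payload JSON-serializable.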

Here's an example of properly formatted code to send an image for prediction using TensorFlow Serving:

import requests
import json
import numpy as np
import cv2

# Replace this with the actual image path you want to test
image_path = 'H_L_.jpg'

# Read and preprocess the image to match the model's expected input
image = cv2.imread(image_path)
image = cv2.resize(image, (256, 256))
image = image.astype(np.float32) / 255.0

# TensorFlow Serving's REST API expects numeric tensors as nested JSON
# lists, not base64-encoded raw bytes. Each entry in "instances" is one
# example, so no batch dimension is needed here.
data = {
    "signature_name": "serving_default",
    "instances": [{"input_1": image.tolist()}]  # Adjust the input key based on your model's signature
}
# Replace these labels with your actual labels
labels = ['Potato___Early_blight', 'Potato___Late_blight', 'Potato___healthy']

# Send the inference request to TensorFlow Serving
url = 'http://localhost:8501/v1/models/model:predict'  # Replace 'model' with your model name; insert '/versions/N' before ':predict' to pin a version
headers = {"content-type": "application/json"}
response = requests.post(url, data=json.dumps(data), headers=headers)

# Process the response
if response.status_code == 200:
    predictions = response.json()['predictions'][0]
    predicted_class_idx = np.argmax(predictions)
    predicted_label = labels[predicted_class_idx]
    print("Predicted Label:", predicted_label)
    print("Class Probabilities:", predictions)
else:
    print("Error: Unable to get predictions. Status code:", response.status_code)
    print("Response content:", response.content)

There is 1 answer

zzachimonde:

It seems that your model doesn't have a signature called serving_default. Could you please share the output of saved_model_cli show --dir /path/to/your/model --all?
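Besides saved_model_cli, TensorFlow Serving exposes the same signature information over its REST metadata endpoint. A minimal sketch, assuming the default REST port 8501 and a model named "model":

```python
def metadata_url(host, model, version=None):
    """Build the TensorFlow Serving REST metadata URL; a GET request to
    it returns the model's signature defs (input names, dtypes, shapes)."""
    base = f"http://{host}:8501/v1/models/{model}"
    if version is not None:
        base += f"/versions/{version}"
    return base + "/metadata"

# With a running server you could then do (using the requests library):
# signatures = requests.get(metadata_url("localhost", "model")).json()
print(metadata_url("localhost", "model"))
```

Checking the signature_def in that response tells you both the exact signature name and the input key to use in the "instances" payload.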