How to obtain gcloud predictions by passing a base64 image to a retrained inception model?


I am trying to obtain a prediction with gcloud by passing a base64-encoded image to a retrained Inception model, using an approach similar to the one adopted by Davide Biraghi in this post. When using 'DecodeJpeg/contents:0' as the input I also get the same error when requesting predictions, so I adopted a slightly different approach.

Following rhaertel80's suggestions in his answer to this post, I have created a graph that takes a jpeg image as input at 'B64Connector/input', preprocesses it, and feeds it to the Inception model at 'ResizeBilinear:0'.

The prediction returns values, albeit the wrong ones (I am trying to find a solution in another post), but at least it doesn't fail. The placeholder I use as input is

images_placeholder = tf.placeholder(dtype=tf.string, shape=(None,), name='B64Connector/input')

And I add it to the model inputs with

inputs = {"b64_bytes": 'B64Connector/input:0'}
tf.add_to_collection("inputs", json.dumps(inputs))
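
For completeness, the matching "outputs" collection can be registered in the same way. This is only a sketch: both the key "prediction" and the tensor name 'final_result:0' are assumed example names, not taken from the original model.

outputs = {"prediction": 'final_result:0'}  # tensor name is an assumed example
tf.add_to_collection("outputs", json.dumps(outputs))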

Like Davide, I am following the suggestions found in these posts: here, here and here, and I am trying to get predictions with

    gcloud beta ml predict --json-instances=request.json --model=MODEL

where the file request.json has been obtained with this code

import base64
import json

# Python 2 style: under Python 3, append .decode('utf-8') to b64encode()
# so that json.dumps receives a str rather than bytes.
jpgtxt = base64.b64encode(open(imagefile, "rb").read())

with open(outputfile, 'w') as f:
  f.write(json.dumps({"b64_bytes": {"b64": jpgtxt}}))
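
Each line of request.json then contains one instance, keyed by the input name registered above, for example (base64 payload truncated):

{"b64_bytes": {"b64": "/9j/4AAQSkZJRg..."}}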

I would like to know why the prediction fails when I use 'DecodeJpeg/contents:0' as the input but not when I use this different approach, since the two look almost identical to me: I use the same script to generate the instances (changing only the input key) and the same command line to request predictions.

Is there a way to pass the instance fed to 'B64Connector/input:0' to 'DecodeJpeg/contents:0' in order to get the right predictions?

1 Answer

Answered by EffePi

Here I describe my approach in more detail, and how I use the images_placeholder.

I define a function that resizes the image:

def decode_and_resize(image_str_tensor):
  """Decodes a jpeg string, resizes it and returns a uint8 tensor."""

  image = tf.image.decode_jpeg(image_str_tensor, channels=MODEL_INPUT_DEPTH)

  # Note: resize expects a batch dimension, but tf.map_fn suppresses that
  # index, so we have to expand then squeeze. Resize returns float32 in the
  # range [0, uint8_max].
  image = tf.expand_dims(image, 0)
  image = tf.image.resize_bilinear(
      image, [MODEL_INPUT_HEIGHT, MODEL_INPUT_WIDTH], align_corners=False)
  image = tf.squeeze(image, squeeze_dims=[0])
  image = tf.cast(image, dtype=tf.uint8)
  return image
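
As a quick sanity check (not part of the original post), decode_and_resize can be exercised on its own in a session. The input-size constants and the test image path below are assumptions:

import tensorflow as tf

MODEL_INPUT_HEIGHT = 299  # assumed Inception v3 input size
MODEL_INPUT_WIDTH = 299
MODEL_INPUT_DEPTH = 3

with tf.Graph().as_default(), tf.Session() as sess:
  jpeg_data = tf.placeholder(dtype=tf.string, shape=())
  resized = decode_and_resize(jpeg_data)
  out = sess.run(resized, feed_dict={jpeg_data: open('test.jpg', 'rb').read()})
  print(out.shape)  # expected: (299, 299, 3)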

I also define a function that generates the definition of the graph in which the resizing takes place, and where images_placeholder is defined and used:

def create_b64_graph() :
  with tf.Graph().as_default() as b64_graph:

    images_placeholder = tf.placeholder(dtype=tf.string, shape=(None,),
                                     name='B64Connector/input')
    decoded_images = tf.map_fn(
        decode_and_resize, images_placeholder, back_prop=False, dtype=tf.uint8)

    # convert_image_dtype, also scales [0, uint8_max] -> [0, 1).
    images = tf.image.convert_image_dtype(decoded_images, dtype=tf.float32)

    # Finally, rescale to [-1, 1] instead of [0, 1).
    # (tf.sub and tf.mul were renamed tf.subtract and tf.multiply in TF >= 1.0.)
    images = tf.sub(images, 0.5)
    images = tf.mul(images, 2.0)

    # NOTE: using identity to get a known name for the output tensor.
    output = tf.identity(images, name='B64Connector/output')

    b64_graph_def = b64_graph.as_graph_def()

    return b64_graph_def

Moreover, I am using the following code to merge the resizing graph with the Inception graph. Can I use a similar approach to link images_placeholder directly to 'DecodeJpeg/contents:0'?

import os
import tensorflow as tf
from tensorflow.python.platform import gfile

def concatenate_to_inception_graph(b64_graph_def):

  model_dir = INPUT_MODEL_PATH
  model_filename = os.path.join(
      model_dir, 'classify_image_graph_def.pb')

  with tf.Session() as sess:

    # Import the b64_graph and get its output tensor
    resized_b64_tensor, = (tf.import_graph_def(b64_graph_def, name='',
                             return_elements=['B64Connector/output:0']))

    with gfile.FastGFile(model_filename, 'rb') as f:
      inception_graph_def = tf.GraphDef()
      inception_graph_def.ParseFromString(f.read())

      # Concatenate b64_graph and inception_graph: consumers of
      # 'ResizeBilinear:0' are rewired to read the resized b64 images instead.
      tf.import_graph_def(inception_graph_def, name='inception',
               input_map={'ResizeBilinear:0': resized_b64_tensor})

    return sess.graph
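
For comparison, a direct remap onto 'DecodeJpeg/contents:0' would look like the sketch below (the function name is hypothetical, and it assumes the same imports as above). Note that 'DecodeJpeg/contents:0' takes a single jpeg string at a time, so the batched placeholder of shape (None,) used above cannot be mapped onto it as-is; a scalar placeholder would be needed instead:

def remap_to_decode_jpeg(model_filename):
  """Hypothetical sketch: feed a jpeg string straight into the Inception graph."""
  with tf.Session() as sess:
    # 'DecodeJpeg/contents:0' expects a scalar string (shape ()),
    # not a batch of strings (shape (None,)).
    image_str = tf.placeholder(dtype=tf.string, shape=(),
                               name='B64Connector/input')

    with gfile.FastGFile(model_filename, 'rb') as f:
      inception_graph_def = tf.GraphDef()
      inception_graph_def.ParseFromString(f.read())

    tf.import_graph_def(inception_graph_def, name='inception',
                        input_map={'DecodeJpeg/contents:0': image_str})
    return sess.graph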