(TensorFlow 1.14) Unable to convert saved_model.pb and variables into frozen_inference_graph.pb


I am using tensorflow==1.14. During periodic evaluation I save the best model with tf.estimator.BestExporter (see the sketch below). I have the following questions.
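For context, the exporter is wired into the EvalSpec roughly as below. This is a minimal sketch, not my exact code: the receiver fn only illustrates the serialized-Example receiver that the Object Detection API's predict input fn builds, and the actual parsing is omitted.

    import tensorflow as tf  # tensorflow==1.14

    def serving_input_receiver_fn():
        # Scalar DT_STRING placeholder for a serialized tf.train.Example --
        # the kind of receiver the Object Detection API's predict input fn
        # builds, which is what produces the string input signature below.
        serialized = tf.placeholder(tf.string, shape=[], name='tf_example')
        features = {'serialized_example': serialized}  # real fn parses this
        return tf.estimator.export.ServingInputReceiver(
            features, {'serialized_example': serialized})

    best_exporter = tf.estimator.BestExporter(
        name='best_exporter',
        serving_input_receiver_fn=serving_input_receiver_fn,
        exports_to_keep=1)

    # Attached to the EvalSpec passed to tf.estimator.train_and_evaluate:
    # eval_spec = tf.estimator.EvalSpec(input_fn=eval_input_fn,
    #                                   exporters=[best_exporter])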

1) When I try to convert the saved_model.pb stored by BestExporter during training into a frozen graph with the freeze_graph() function, the usual input/output node names ("image_tensor" / ['detection_boxes', 'detection_classes', 'detection_scores', 'num_detections']) are not present in the SavedModel. When I inspect it with saved_model_cli, the input/output names are completely different from those in the SavedModel produced by export_inference_graph.py from the checkpoint and graph with pipeline.config.
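Concretely, the freeze call looks roughly like this (the paths are placeholders). It fails because none of these output node names exist in the BestExporter graph:

    from tensorflow.python.tools import freeze_graph  # tensorflow==1.14

    # Freeze directly from the SavedModel directory. This call fails for the
    # BestExporter export, since that graph has no nodes named
    # detection_boxes / detection_classes / detection_scores / num_detections.
    freeze_graph.freeze_graph(
        input_graph='', input_saver='', input_binary=False,
        input_checkpoint='',
        output_node_names=('detection_boxes,detection_classes,'
                           'detection_scores,num_detections'),
        restore_op_name='', filename_tensor_name='',
        output_graph='frozen_inference_graph.pb',
        clear_devices=True, initializer_nodes='',
        input_saved_model_dir='/path/to/export/best_exporter/<timestamp>',
        saved_model_tags='serve')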


BestExporter's SavedModel SignatureDef

signature_def['serving_default']:                                                                                                                                 
  The given SavedModel SignatureDef contains the following input(s):                                                                                              
    inputs['serialized_example'] tensor_info:                                                                                                                     
        dtype: DT_STRING                                                                                                                                          
        shape: ()                                                                                                                                                 
        name: tf_example:0                                                                                                                                        
  The given SavedModel SignatureDef contains the following output(s):                                                                                             
    outputs['detection_boxes'] tensor_info:                                                                                                                       
        dtype: DT_FLOAT                                                                                                                                           
        shape: (1, 150, 4)                                                                                                                                        
        name: Postprocessor/BatchMultiClassNonMaxSuppression/stack_4:0                                                                                            
    outputs['detection_classes'] tensor_info:                                                                                                                     
        dtype: DT_FLOAT                                                                                                                                           
        shape: (1, 150)                                                                                                                                           
        name: Postprocessor/BatchMultiClassNonMaxSuppression/stack_6:0                                                                                            
    outputs['detection_scores'] tensor_info:                                                                                                                      
        dtype: DT_FLOAT                                                                                                                                           
        shape: (1, 150)                                                                                                                                           
        name: Postprocessor/BatchMultiClassNonMaxSuppression/stack_5:0                                                                                            
    outputs['num_detections'] tensor_info:
        dtype: DT_FLOAT
        shape: (1)
        name: Postprocessor/ToFloat_3:0
  Method name is: tensorflow/serving/predict


**********************************

export_inference_graph's SavedModel SignatureDef

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['inputs'] tensor_info:
        dtype: DT_UINT8
        shape: (-1, -1, -1, 3)
        name: image_tensor:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['detection_boxes'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 150, 4)
        name: detection_boxes:0
    outputs['detection_classes'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 150)
        name: detection_classes:0
    outputs['detection_scores'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 150)
        name: detection_scores:0
    outputs['num_detections'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1)
        name: num_detections:0
  Method name is: tensorflow/serving/predict

As you can see, the two exports have completely different input/output tensor names.
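If I understand freeze_graph correctly, I could instead freeze against the raw node names the CLI reports for the BestExporter model (the tensor names with the :0 suffix stripped), something like:

    from tensorflow.python.tools import freeze_graph  # tensorflow==1.14

    # Output node names copied from the BestExporter signature above, with
    # ':0' stripped; the SavedModel path is a placeholder.
    freeze_graph.freeze_graph(
        input_graph='', input_saver='', input_binary=False,
        input_checkpoint='',
        output_node_names=(
            'Postprocessor/BatchMultiClassNonMaxSuppression/stack_4,'
            'Postprocessor/BatchMultiClassNonMaxSuppression/stack_5,'
            'Postprocessor/BatchMultiClassNonMaxSuppression/stack_6,'
            'Postprocessor/ToFloat_3'),
        restore_op_name='', filename_tensor_name='',
        output_graph='frozen_inference_graph.pb',
        clear_devices=True, initializer_nodes='',
        input_saved_model_dir='/path/to/export/best_exporter/<timestamp>',
        saved_model_tags='serve')

But even if that works, the frozen graph's entry point is still the tf_example:0 string placeholder rather than image_tensor, so my existing inference code cannot feed it a numpy image.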

2) As an alternate approach, I tried using the saved_model.pb saved by BestExporter directly for inference. Inspecting the .pb file with saved_model_cli shows that its input is a scalar DT_STRING with no dimensions (see the first listing above), which again blocks this approach: when I passed a numpy image for inference, it raised a shape-mismatch error.
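Presumably the model has to be fed a serialized tf.train.Example instead of a raw image. The sketch below is my understanding of the minimal way to do that, assuming the export's decoder expects the Object Detection API's usual 'image/encoded' / 'image/format' feature keys (I have not confirmed this for my export):

    import cv2  # used only to JPEG-encode the numpy image
    import tensorflow as tf  # tensorflow==1.14
    from tensorflow.contrib import predictor

    # Placeholder path to the BestExporter export directory.
    predict_fn = predictor.from_saved_model(
        '/path/to/export/best_exporter/<timestamp>')

    image = cv2.imread('test.jpg')      # HxWx3 uint8 numpy array
    ok, jpeg = cv2.imencode('.jpg', image)

    # Wrap the encoded bytes in a tf.train.Example using the feature keys
    # the OD API's tf.Example decoder usually expects.
    example = tf.train.Example(features=tf.train.Features(feature={
        'image/encoded': tf.train.Feature(
            bytes_list=tf.train.BytesList(value=[jpeg.tobytes()])),
        'image/format': tf.train.Feature(
            bytes_list=tf.train.BytesList(value=[b'jpeg'])),
    }))

    # 'serialized_example' is the input key from the signature above;
    # the placeholder is a scalar DT_STRING.
    outputs = predict_fn({'serialized_example': example.SerializeToString()})
    print(outputs['detection_scores'].shape)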

Can someone help me out: how can I use the SavedModel from BestExporter for inference, or convert it to a frozen graph with the correct input/output names so it can be used for inference?

Let me know if you need more information. Thank you.
