How to create TensorFlow serving_input_receiver_fn with multiple features?


In the TF guide on saving models there is a section on serving_input_receiver_fn that says it can also apply preprocessing logic. I'm trying to normalize the input data for a DNNRegressor. The guide's code for the function looks like this:

feature_spec = {'foo': tf.FixedLenFeature(...),
                'bar': tf.VarLenFeature(...)}

def serving_input_receiver_fn():
  """An input receiver that expects a serialized tf.Example."""
  serialized_tf_example = tf.placeholder(dtype=tf.string,
                                         shape=[default_batch_size],
                                         name='input_example_tensor')
  receiver_tensors = {'examples': serialized_tf_example}
  features = tf.parse_example(serialized_tf_example, feature_spec)
  return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)

My code looks like this:

feat_cols = [
    tf.feature_column.numeric_column(key="FEATURE1"),
    tf.feature_column.numeric_column(key="FEATURE2")
]

def serving_input_receiver_fn():
    feature_spec = tf.feature_column.make_parse_example_spec(feat_cols)

    default_batch_size = 1

    serialized_tf_example = tf.placeholder(dtype=tf.string, shape=[default_batch_size], name='tf_example')

    receiver_tensors = { 'examples': serialized_tf_example}

    features = tf.parse_example(serialized_tf_example, feature_spec)

    fn_norm1 = lambda FEATURE1: normalize_input_data('FEATURE1', FEATURE1)
    fn_norm2 = lambda FEATURE2: normalize_input_data('FEATURE2', FEATURE2)
    features['FEATURE1'] = tf.map_fn(fn_norm1, features['FEATURE1'], dtype=tf.float32)
    features['FEATURE2'] = tf.map_fn(fn_norm2, features['FEATURE2'], dtype=tf.float32)

    return tf.estimator.export.ServingInputReceiver(features, receiver_tensors)

After exporting, the saved model has none of my features in its graph. I'm trying to figure out how this works when you have more than one feature to pass.
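For reference, normalize_input_data isn't shown above; a minimal sketch of what such a helper could look like (the mean/std values are placeholders, assuming per-feature statistics precomputed from the training set) is:

# Hypothetical helper, not part of the original question: standardizes one
# scalar feature using (mean, std) statistics assumed to come from training data.
NORM_STATS = {'FEATURE1': (0.0, 1.0), 'FEATURE2': (0.0, 1.0)}  # placeholder values

def normalize_input_data(name, value):
    mean, std = NORM_STATS[name]
    return (tf.cast(value, tf.float32) - mean) / std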

I created an example using the Keras MPG data. It is located here:


1 Answer

Siyuan Ren:

The features argument of ServingInputReceiver is passed directly to your model function. What you want is receiver_tensors or receiver_tensors_alternatives, that is, the second and third arguments to the ServingInputReceiver constructor.

For example, you may do this:

def serving_input_receiver_fn():
    # feature_spec, default_batch_size and normalize_input_data are the same
    # objects defined in the question.
    serialized_tf_example = tf.placeholder(dtype=tf.string,
                                           shape=[default_batch_size],
                                           name='tf_example')
    receiver_tensors = {'examples': serialized_tf_example}

    raw_features = tf.parse_example(serialized_tf_example, feature_spec)

    fn_norm1 = lambda FEATURE1: normalize_input_data('FEATURE1', FEATURE1)
    fn_norm2 = lambda FEATURE2: normalize_input_data('FEATURE2', FEATURE2)
    features = {}
    features['FEATURE1'] = tf.map_fn(fn_norm1, raw_features['FEATURE1'], dtype=tf.float32)
    features['FEATURE2'] = tf.map_fn(fn_norm2, raw_features['FEATURE2'], dtype=tf.float32)

    return tf.estimator.export.ServingInputReceiver(
        features=features,
        receiver_tensors=receiver_tensors,
        receiver_tensors_alternatives={'SOME_KEY': raw_features})

If you never need to feed the network a serialized Example proto, you can skip the parsing step entirely:

raw_features = {'FEATURE1': tf.placeholder(...), 'FEATURE2': tf.placeholder(...)}
features = preprocess(raw_features)
return tf.estimator.export.ServingInputReceiver(features, raw_features)
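For the two numeric features from the question, a fleshed-out sketch of that second approach might look like this (normalize_input_data is the asker's helper, and the placeholder shapes are assumptions):

def serving_input_receiver_fn():
    # One float placeholder per feature; clients feed raw values directly,
    # so no tf.Example serialization/parsing happens at serving time.
    raw_features = {
        'FEATURE1': tf.placeholder(dtype=tf.float32, shape=[None], name='FEATURE1'),
        'FEATURE2': tf.placeholder(dtype=tf.float32, shape=[None], name='FEATURE2'),
    }
    features = {
        key: normalize_input_data(key, tensor)  # asker's normalization helper
        for key, tensor in raw_features.items()
    }
    return tf.estimator.export.ServingInputReceiver(features, raw_features)

The exported SavedModel then exposes FEATURE1 and FEATURE2 directly as inputs of the serving signature instead of a single serialized-Example string.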