Serving tensorflow model trained with files


Curious if anyone has a similar use case to mine:

My TensorFlow model is trained with TFRecord files and a queue runner, so the graph does not use placeholders.

Now how can I save the model and serve it online? During serving we need to feed the request data into the graph, but if there are no placeholders, there is nowhere to feed it.

Thanks!

1 Answer

Tianjin Gu

TensorFlow actually accepts feeding any tensor via `feed_dict`, not just placeholders. For example:

import tensorflow as tf  # TensorFlow 1.x graph mode

q = tf.FIFOQueue(10, dtypes=tf.int32)
a = q.dequeue()            # tensor the queue runner would normally supply
w = tf.constant(2)
c = a * w

sess = tf.Session()
# Feed the dequeued tensor directly, bypassing the queue entirely.
print(sess.run(c, feed_dict={a: 1}))  # prints 2

So the input does not have to be a placeholder when you export the model; you can use any tensor downstream of the dequeue op as the input of your Serving signature.
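To make that concrete, here is a minimal sketch of exporting such a graph as a SavedModel, with the dequeued tensor declared as the serving input. It uses the TF 1.x SavedModel builder API (written against `tf.compat.v1` so it also runs on TF 2.x installs); the export directory and the `'input'`/`'output'` signature names are illustrative choices, not anything from the original question.

```python
import os
import tempfile

import tensorflow.compat.v1 as tf  # TF 1.x-style API

tf.disable_eager_execution()

# Rebuild the toy graph from the answer above.
q = tf.FIFOQueue(10, dtypes=tf.int32)
a = q.dequeue()            # tensor the queue runner would normally supply
w = tf.constant(2)
c = a * w

# SavedModelBuilder requires that the export directory not exist yet.
export_dir = os.path.join(tempfile.mkdtemp(), '1')

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())

    # Declare the dequeued tensor `a` as the input of the serving
    # signature and `c` as the output; Serving will feed `a` with
    # request data, just like feed_dict does.
    signature = tf.saved_model.signature_def_utils.predict_signature_def(
        inputs={'input': a}, outputs={'output': c})

    builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
    builder.add_meta_graph_and_variables(
        sess,
        [tf.saved_model.tag_constants.SERVING],
        signature_def_map={
            tf.saved_model.signature_constants
              .DEFAULT_SERVING_SIGNATURE_DEF_KEY: signature,
        })
    builder.save()

print(os.listdir(export_dir))  # contains saved_model.pb
```

The exported directory can then be pointed at by a TensorFlow Serving model server, which will feed incoming requests into the `a` tensor through the declared signature.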