Curious if anyone has a similar use case to mine:
My TensorFlow model is trained with TFRecord files and a queue runner, so the graph does not use placeholders.
Now, how can I save the model and serve it online? During serving we need to feed the request data into the graph, and if there is no placeholder, there is nowhere to feed it.
Thanks!
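For context, the kind of queue-runner input pipeline I mean looks roughly like this (a minimal TF 1.x sketch; the file name, feature spec, and batch size are illustrative, not my real setup):

```python
import tensorflow as tf

# Hypothetical TFRecord pipeline: file name and feature spec are made up.
filename_queue = tf.train.string_input_producer(["train.tfrecord"])
reader = tf.TFRecordReader()
_, serialized = reader.read(filename_queue)
features = tf.parse_single_example(
    serialized,
    features={"x": tf.FixedLenFeature([10], tf.float32)})

# The model consumes this dequeued batch directly -- no tf.placeholder anywhere.
batched = tf.train.batch(features, batch_size=32)
dequeued_examples = batched["x"]  # shape [32, 10]
```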
Actually, TensorFlow can accept a `Tensor` and use it as a `placeholder`, so the input does not have to be a `placeholder`. When exporting the model, you can make any tensor after the dequeue op the input for your Serving.
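For example, here is a minimal sketch of one way to do this with `tf.placeholder_with_default` and a SavedModel export (assuming TF 1.x; `dequeued_examples` stands in for the batch coming out of your queue, and the shapes, names, and export path below are illustrative):

```python
import tensorflow as tf

# Stand-in for the tensor coming out of your dequeue op; in the real graph
# this would be the output of tf.train.batch (or similar).
dequeued_examples = tf.random_uniform([32, 10])

# placeholder_with_default behaves like the dequeued tensor during training,
# but it can also be fed at serving time, just like a regular placeholder.
model_input = tf.placeholder_with_default(
    dequeued_examples, shape=[None, 10], name="model_input")

# ... build the model on top of `model_input` ...
logits = tf.layers.dense(model_input, 2, name="logits")

# When exporting, point the serving signature's input at `model_input`
# (or at any other tensor downstream of the dequeue).
builder = tf.saved_model.builder.SavedModelBuilder("/tmp/exported_model")
signature = tf.saved_model.signature_def_utils.predict_signature_def(
    inputs={"input": model_input}, outputs={"logits": logits})

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    builder.add_meta_graph_and_variables(
        sess,
        [tf.saved_model.tag_constants.SERVING],
        signature_def_map={"serving_default": signature})
builder.save()
```

Alternatively, you can skip `placeholder_with_default` and point the signature's input directly at the dequeued tensor itself, since Serving (like `feed_dict`) can feed any feedable tensor in the graph, not just placeholders.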