I'm just a newbie. I have a problem when serving a TensorFlow model in this case:
I. I trained the model following http://opennmt.net/OpenNMT-tf/quickstart.html.
II. I served the model with the following steps:
- Create the Docker image:
docker build --pull -t $USER/tensorflow-serving-devel -f tensorflow_serving/tools/docker/Dockerfile.devel .
- Run the Docker container:
docker run --name=tf_container -it $USER/tensorflow-serving-devel
- Serve the model:
tensorflow_model_server --port=9000 --model_name=model_name --model_base_path=/model_file &> result_log &
III. The result_log file content:
```
2019-10-21 02:46:12.840258: I tensorflow_serving/core/loader_harness.cc:155] Encountered an error for servable version {name: ente version: 1569320347}: Not found: Op type not registered 'GatherTree' in binary running on 1b79e5fb3ac4. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
2019-10-21 02:46:12.840280: E tensorflow_serving/core/aspired_versions_manager.cc:359] Servable {name: ente version: 1569320347} cannot be loaded: Not found: Op type not registered 'GatherTree' in binary running on 1b79e5fb3ac4. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
2019-10-21 02:46:13.664569: I tensorflow_serving/core/basic_manager.cc:280] Unload all remaining servables in the manager.
Failed to start server. Error: Unknown: 1 servable(s) did not become available: {{{name: ente version: 1569320347} due to error: Not found: Op type not registered 'GatherTree' in binary running on 1b79e5fb3ac4. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.}, }
```
I have searched Google and tried updating some services, but the problem is still there. Does anyone have any idea?
Thanks so much for any suggestions!
With the transition to TensorFlow 2.0, the `GatherTree` op that is used in beam search is currently not available in TensorFlow Serving. There are two cases:
- The exported model uses `GatherTree` from `tf.contrib` (TensorFlow 1.x exports), which was removed in recent versions of TensorFlow Serving. You should use a previous version of TensorFlow Serving, such as 1.15.0 (see the sketch below).
- The exported model uses `Addons>GatherTree` from TensorFlow Addons (TensorFlow 2.x exports), which is presently not integrated in TensorFlow Serving. This is a work in progress. There are currently 2 workarounds; one is to use `opennmt/tensorflow-serving:2.1.0`, which is a custom Serving build that includes this op (see the sketch below).
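
A minimal sketch of both options, assuming the export lives in `./model_file` on the host and the model name is `model_name` (taken from the commands in the question); the exact image entrypoints, ports, and mount paths are assumptions and may need adjusting for your setup:
```
# Workaround 1: TensorFlow 1.x export -> serve with TensorFlow Serving 1.15.0.
# The official image's default entrypoint serves /models/${MODEL_NAME}
# over gRPC on container port 8500.
docker run --rm -p 9000:8500 \
  -v "$PWD/model_file:/models/model_name" \
  -e MODEL_NAME=model_name \
  tensorflow/serving:1.15.0

# Workaround 2: TensorFlow 2.x export -> serve with the custom OpenNMT build
# that registers Addons>GatherTree. The entrypoint is overridden explicitly
# so the usual tensorflow_model_server flags can be passed.
docker run --rm -p 9000:9000 \
  -v "$PWD/model_file:/models/model_name" \
  --entrypoint tensorflow_model_server \
  opennmt/tensorflow-serving:2.1.0 \
  --port=9000 --model_name=model_name --model_base_path=/models/model_name
```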