I'm trying to execute this normal tf_serving command (which works correctly) with the Docker version of tf_serving, but I'm not sure why it's not working. Any suggestions? I'm new to Docker!

Normal tf_serving command:

tensorflow_model_server \
--model_config_file=/opt/tf_serving/model_config.conf \
--port=6006

Here is what my model_config.conf looks like:

model_config_list: {
  config: {
    name: "model_1",
    base_path: "/opt/tf_serving/model_1",
    model_platform: "tensorflow",
  },
  config: {
    name: "model_2",
    base_path: "/opt/tf_serving/model_2",
    model_platform: "tensorflow",
  },
}

Docker version of the command that I'm trying, which is not working:

docker run --runtime=nvidia \
-p 6006:6006 \
--mount type=bind,source=/opt/tf_serving/model_1,target=/models/model_1/ \
--mount type=bind,source=/opt/tf_serving/model_2,target=/models/model_2/ \
--mount type=bind,source=/opt/tf_serving/model_config.conf,target=/config/model_config.conf \
-t tensorflow/serving:latest-gpu --model_config_file=/config/model_config.conf

Error:

2019-04-13 19:41:00.838340: E tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:369] FileSystemStoragePathSource encountered a file-system access error: Could not find base path /opt/tf_serving/model_1 for servable model_1

1 Answer

Answer from Snehal:

Found the issue! You have to change the model paths in model_config.conf as follows, and the above docker command will work and load both models:

model_config_list: {
  config: {
    name: "model_1",
    base_path: "/models/model_1",
    model_platform: "tensorflow",
  },
  config: {
    name: "model_2",
    base_path: "/models/model_1",
    model_platform: "tensorflow",
  },
}
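
For context: the --mount flags bind the host directories /opt/tf_serving/model_1 and /opt/tf_serving/model_2 to /models/model_1 and /models/model_2 inside the container, and tensorflow_model_server runs inside the container, so base_path has to use those container-side paths rather than the host paths. After rerunning the docker command above with the corrected config, a rough way to sanity-check the setup from the host (the <container_id> placeholder is just whatever docker ps reports for your run, not something from the question):

# Find the ID of the running TF Serving container
docker ps

# Confirm the bind-mounted model directories are visible inside the container
docker exec -it <container_id> ls /models/model_1 /models/model_2

# Follow the server logs; both model_1 and model_2 should be reported as loaded
docker logs -f <container_id>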