emacski / tensorflow-serving-arm

TensorFlow Serving ARM - A project for cross-compiling TensorFlow Serving targeting popular ARM cores
Apache License 2.0

Start Tensorflow-serving with model and copy model into container #2

Closed NickHauptvogel closed 4 years ago

NickHauptvogel commented 4 years ago

Hi,

If I start your container with a saved model bound to /models/, I receive the error:

E external/tf_serving/tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:362] FileSystemStoragePathSource encountered a filesystem access error: Could not find base path /models/model for servable model

This happens even though I did not name my model "model" and did not specify the path /models/model. If I do name my model that way, it works, but only as a workaround.

How do I copy a saved model into the /models folder of the container? It does not let me start the container without binding a folder to /models/model, because it cannot find any resource there. If I bind the folder first, copy the model in afterwards and then create an image from the container, it does not work either:

W external/tf_serving/tensorflow_serving/sources/storage_path/file_system_storage_path_source.cc:267] No versions of servable model found under base path /models/model

emacski commented 4 years ago

The name "model" is used in a couple default options to the model server startup command.

If your run command looks similar to this:

docker run --rm --init -v my_saved_model:/models/my_saved_model emacski/tensorflow-serving:1.14.0

Then you should be able to override the defaults like so:

docker run --rm --init -v my_saved_model:/models/my_saved_model emacski/tensorflow-serving:1.14.0 --model_name=my_saved_model --model_base_path=/models/my_saved_model
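Alternatively, you can keep the defaults and mount your model at the default base path instead. Note that the left-hand side of -v must be an absolute path on your host (a bare name like my_saved_model is treated as a named Docker volume, which starts out empty), and TensorFlow Serving expects at least one numeric version subdirectory under the base path. A minimal sketch, assuming your model lives at the hypothetical host path /path/to/my_saved_model with a version subdirectory such as /path/to/my_saved_model/1/saved_model.pb:

# host path on the left of -v, container path on the right;
# no --model_name/--model_base_path overrides needed when mounting to /models/model
docker run --rm --init -p 8501:8501 \
    -v /path/to/my_saved_model:/models/model \
    emacski/tensorflow-serving:1.14.0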
NickHauptvogel commented 4 years ago

If I run it like this:

docker run --rm --init -v my_saved_model:/models/my_saved_model emacski/tensorflow-serving:1.14.0 --model_name=my_saved_model --model_base_path=/models/my_saved_model

I just get an error similar to the first one:

FileSystemStoragePathSource encountered a filesystem access error: Could not find base path /models/my_saved_model for servable my_saved_model

Which path do I have to pass to -v? Is it the path on my local machine or the path inside the container? If it is the path inside the container, how do I get my model there in the first place (otherwise I get the "No servable found" error)?

Regards

NickHauptvogel commented 4 years ago

So far the only way it works is the following, but this mounts the model from the host; I need to get it inside the container:

docker run -p 8501:8501 --mount type=bind,source=/tmp/tfserving/serving/tensorflow_serving/servables/tensorflow/testdata/saved_model_half_plus_two_cpu,target=/models/model -e MODEL_NAME=model -t emacski/tensorflow-serving:latest-linux_arm &

NickHauptvogel commented 4 years ago

To be precise, I only need to copy the model into the container instead of mounting it via a local path. How do I achieve this?

emacski commented 4 years ago

To me, the easiest and most straightforward way is to bake the model into a new image that uses emacski/tensorflow-serving as its parent image, keeping the default paths.

(I opt for stateless and ephemeral containers whenever possible which is why I focus on images)

Create a Dockerfile with the following contents. (Docker's COPY can only read from the build context, so first copy the saved_model_half_plus_two_cpu directory from /tmp/tfserving/serving/tensorflow_serving/servables/tensorflow/testdata/ into the same directory as the Dockerfile.)

FROM emacski/tensorflow-serving
COPY saved_model_half_plus_two_cpu /models/model

and then run something like this from the same directory as the Dockerfile:

docker build -t halfplus2cpu .

and now you can use your halfplus2cpu image just like the emacski/tensorflow-serving image without any additional volume mapping.
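For example, to serve the baked-in model and query it over the REST API (a sketch, assuming you want the REST port published on 8501; half_plus_two returns x/2 + 2 for each instance):

# run the new image with the default model name "model"
docker run --rm --init -p 8501:8501 halfplus2cpu

# from another shell, send a predict request
curl -d '{"instances": [1.0, 2.0, 5.0]}' \
    http://localhost:8501/v1/models/model:predict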

Other Approaches / Examples

Custom Naming in Image

Same steps as above, but with the following contents in the Dockerfile (again assuming the saved_model_half_plus_two_cpu directory sits next to the Dockerfile so COPY can find it in the build context):

FROM emacski/tensorflow-serving
COPY saved_model_half_plus_two_cpu /saved_model_half_plus_two_cpu
CMD ["--model_name=saved_model_half_plus_two_cpu", "--model_base_path=/saved_model_half_plus_two_cpu"]

Copy to Existing Container

You can actually copy files and folders into an existing container if this is what you really want. Just note that these steps need to be repeated every time a new container is created, since the filesystem modification only exists in that container, not in the image.

docker create --name tfserve -p 8501:8501 --init emacski/tensorflow-serving --model_name=saved_model_half_plus_two_cpu --model_base_path=/saved_model_half_plus_two_cpu
docker cp /tmp/tfserving/serving/tensorflow_serving/servables/tensorflow/testdata/saved_model_half_plus_two_cpu tfserve:/saved_model_half_plus_two_cpu
# start in background
docker start tfserve
# then tail logs
docker logs tfserve

# start in foreground
docker start -a tfserve
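Once the server is up, you can verify that the model loaded via the model status endpoint of the REST API (a sketch, assuming port 8501 was published as above):

curl http://localhost:8501/v1/models/saved_model_half_plus_two_cpu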
NickHauptvogel commented 4 years ago

Hi,

docker cp did work for me, although not with the /models/ folder, as it does not exist unless it is created at startup. As a quick workaround I therefore stored my model under /home/. A custom Dockerfile would solve this for sure!
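For reference, a sketch of that workaround, with a hypothetical model directory ./my_saved_model copied to /home/my_saved_model and matching startup arguments:

docker create --name tfserve -p 8501:8501 --init emacski/tensorflow-serving \
    --model_name=my_saved_model --model_base_path=/home/my_saved_model
docker cp ./my_saved_model tfserve:/home/my_saved_model
docker start tfserve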

Thank you and regards,

Nick