Closed jacobmalmberg closed 1 year ago
The prepacked Tensorflow server with the Seldon protocol uses a proxy.
I think we would need to extend this to ensure the metadata is handled by the proxy. It may not presently have access to the downloaded artifacts.
Access to artifacts may be one thing, but we should also extend the TfServingProxy class to implement init_metadata, like we have here for example: https://github.com/SeldonIO/seldon-core/blob/master/servers/sklearnserver/sklearnserver/SKLearnServer.py#L53-L66
The SKLearnServer does get model_uri as one of its parameters; the TfServingProxy does not.
Closing
Describe the bug
Placing a metadata.yaml file with metadata about the model in the model's S3 bucket does not work when using the prepackaged TensorFlow server and the Seldon protocol. When executing
curl service:/api/v1.0/metadata | jq .
this metadata (see "To reproduce" below for the exact YAML) should be presented, but instead I get different output. The metadata is not imported from metadata.yaml but is seemingly taken from the image name of the model container (seldonio/tfserving-proxy:1.12.0-dev). According to https://docs.seldon.io/projects/seldon-core/en/latest/referenceapis/metadata.html#prepackaged-model-servers, the metadata presented should come from metadata.yaml.
To reproduce
I run the mnist example from https://docs.seldon.io/projects/seldon-core/en/latest/servers/tensorflow.html with an extra metadata.yaml file in the bucket.
Metadata.yaml
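The exact file contents are not reproduced here; a minimal metadata.yaml along the lines documented for prepackaged servers might look like the following (the name and shapes are illustrative, not the reporter's actual file):

```yaml
name: mnist-model
versions: [ v1 ]
platform: seldon
inputs:
- messagetype: ndarray
  schema:
    shape: [ 784 ]
outputs:
- messagetype: ndarray
  schema:
    shape: [ 10 ]
```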
Expected behaviour
curl mnist-model-default:8000/api/v1.0/metadata | jq . should yield
Environment
Seldon 1.12.0-dev
value: docker.io/seldonio/engine:1.12.0-dev
value: seldonio/seldon-core-executor:1.12.0-dev
image: seldonio/seldon-core-operator:1.12.0-dev
Model Details