Closed: cceasy closed this issue 5 days ago
I think the problem is caused by this line:

```python
model_name = Path(__file__).parent.parent.parent.stem
```

The service name is derived dynamically from the parent directory name, but that directory is not packed into the Docker image. When containerized, the service name is therefore unlikely to match the one frozen in the bento, which explains the error.

You can close this issue if there are no other concerns.
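To illustrate (with hypothetical paths, not taken from this thread): the same `Path(__file__).parent.parent.parent.stem` expression resolves to different names depending on where the service file sits, so a name computed on the host at build time will not match the one computed inside the container:

```python
from pathlib import Path

# Hypothetical host layout: the third parent is the project directory.
host = Path("/home/user/projects/my-test-model/src/services/service.py")
print(host.parent.parent.parent.stem)  # my-test-model

# Hypothetical layout inside the image: the bento is unpacked under a
# different tree, so the same expression yields a different name.
image = Path("/home/bentoml/bento/src/services/service.py")
print(image.parent.parent.parent.stem)  # bento
```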
Thanks for the quick response @frostming
Describe the bug
Hi there, I have encountered a problem running the containerized image, although I can run the same service successfully with bentoml serve.
To reproduce
I can serve the model below with:

```shell
bentoml serve service:ModelService
```

and build the Docker image with:

```shell
bentoml containerize --opt platform=linux/amd64 xxx
```

but running it fails:

```shell
docker run --rm --platform linux/amd64 xxx
```

Here is the error message while doing docker run:

Files as below:
```yaml
description: "A Bento Service for my keras bert model."
python:
  requirements_txt: "requirements.txt"
  lock_packages: false
include:
```
```python
import os
from pathlib import Path

import bentoml
from pydantic import RootModel
from starlette.responses import JSONResponse
from starlette.status import HTTP_500_INTERNAL_SERVER_ERROR

model_name = Path(__file__).parent.parent.parent.stem


class Input(RootModel[list]):
    pass


@bentoml.service(
    name=model_name,
    traffic={"timeout": 10},
)
class ModelService:
    def __init__(self):
        super().__init__()
        model_tag = "{}:{}".format(
            os.environ.get("MODEL_NAME"), os.environ.get("MODEL_VERSION")
        )
        self.model = bentoml.keras.load_model(model_tag)
```
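As a side note, if `MODEL_NAME` or `MODEL_VERSION` is unset, `os.environ.get` returns `None` and the formatted tag becomes the literal string `None:None`. A small sketch of a guarded version (the default values here are assumptions, not from the issue):

```python
import os


def build_model_tag(default_name="my-test-model", default_version="v0.0.1"):
    # Fall back to explicit defaults instead of silently formatting None
    # into the tag when the environment variables are missing.
    name = os.environ.get("MODEL_NAME") or default_name
    version = os.environ.get("MODEL_VERSION") or default_version
    return f"{name}:{version}"
```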
```python
import os

import bentoml

os.environ["KERAS_BACKEND"] = "jax"  # Or "tensorflow" or "torch"

import keras_nlp
import tensorflow_datasets as tfds

imdb_train, imdb_test = tfds.load(
    "imdb_reviews",
    split=["train", "test"],
    as_supervised=True,
    batch_size=16,
)

# Load a model.
classifier = keras_nlp.models.BertClassifier.from_preset(
    "bert_tiny_en_uncased",
    num_classes=2,
    activation="softmax",
)
classifier.fit(imdb_train.take(250), validation_data=imdb_test.take(250))

# Predict new examples.
classifier.predict(["What an amazing movie!", "A total waste of my time."])

bentoml.keras.save_model("my-test-model:v0.0.1", classifier)
```