bentoml / BentoML

The easiest way to serve AI apps and models - Build Model Inference APIs, Job queues, LLM apps, Multi-model pipelines, and more!
https://bentoml.com
Apache License 2.0
7.15k stars 792 forks

bug: Server doesn't load in Docker, but it works in a native terminal #3758

Open enmanuelmag opened 1 year ago

enmanuelmag commented 1 year ago

Describe the bug

I am trying to start a server with Docker, but I am getting the following error:

[bentoml-cli] `serve` failed: Failed loading Bento from directory /home/bentoml/bento: Failed to import module "application.src.serve_model": libGL.so.1: cannot open shared object file: No such file or directory

However, if I run the server outside Docker, that is, `bentoml serve` from my native terminal in the project folder, it does work.

Of course, I had previously executed the commands `bentoml build` and `bentoml containerize`.

This is the entire error output in Docker:

Error: [bentoml-cli] `serve` failed: Failed loading Bento from directory /home/bentoml/bento: Failed to import module "application.src.serve_model": libGL.so.1: cannot open shared object file: No such file or directory
Traceback (most recent call last):
  File "/usr/local/bin/bentoml", line 8, in <module>
    sys.exit(cli())
  File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1130, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1055, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1657, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/local/lib/python3.9/site-packages/click/core.py", line 1404, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.9/site-packages/click/core.py", line 760, in invoke
    return __callback(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/bentoml_cli/utils.py", line 339, in wrapper
    raise err from None
  File "/usr/local/lib/python3.9/site-packages/bentoml_cli/utils.py", line 334, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/bentoml_cli/utils.py", line 305, in wrapper
    return_value = func(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/bentoml_cli/utils.py", line 262, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/bentoml_cli/env_manager.py", line 122, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/bentoml_cli/serve.py", line 195, in serve
    serve_http_production(
  File "/usr/local/lib/python3.9/site-packages/simple_di/__init__.py", line 139, in _
    return func(*_inject_args(bind.args), **_inject_kwargs(bind.kwargs))
  File "/usr/local/lib/python3.9/site-packages/bentoml/serve.py", line 322, in serve_http_production
    svc = load(bento_identifier, working_dir=working_dir, standalone_load=True)
  File "/usr/local/lib/python3.9/site-packages/bentoml/_internal/service/loader.py", line 334, in load
    raise BentoMLException(
bentoml.exceptions.BentoMLException: Failed loading Bento from directory /home/bentoml/bento: Failed to import module "application.src.serve_model": libGL.so.1: cannot open shared object file: No such file or directory

To reproduce

No response

Expected behavior

No response

Environment

bentoml==1.0.17, python==3.9, platform: Windows 11 (Docker on WSL 2)

parano commented 1 year ago

Hi @enmanuelmag, could you share more about your project structure, the content of your bentofile.yaml, and especially what the `service` field is set to?

enmanuelmag commented 1 year ago

@parano this is my bentofile.yaml

service: 'application/src/create_service.py:service'
include:
  - config
  - application/src/
  - Procfile
python:
  packages:
    - bentoml==1.0.17
    - hydra-core==1.3.2
    - numpy==1.23.5
    - patsy==0.5.3
    - pydantic==1.10.7
    - opencv-python==4.7.0.72
    - tensorflow==2.10.1
    - pytesseract==0.3.10

And this is the project structure:

Project-folder/
┣ application/
┃ ┣ src/
┃ ┃ ┣ create_service.py
┃ ┃ ┗ __init__.py
┃ ┣ tests/
┃ ┃ ┣ input.png
┃ ┃ ┣ test_create_service.py
┃ ┃ ┣ __init__.py
┃ ┃ ┗ __init__.cpython-39.pyc
┃ ┣ requirements.txt
┃ ┗ __init__.py
┣ config/
┃ ┗ main.yaml
┣ .dvcignore
┣ .flake8
┣ .gitignore
┣ .pre-commit-config.yaml
┣ bentofile.yaml
┣ data.dvc
┣ dev-requirements.txt
┣ Makefile
┣ metrics.csv
┣ models.dvc
┣ outputs.dvc
┣ params.yml
┣ poetry.lock
┣ pyproject.toml
┣ README.md
┣ requirements.txt
┗ server.bat

enmanuelmag commented 1 year ago

@parano In addition, how could I handle multiple images? I have been trying it this way, but the server only receives one image:

import numpy as np

from bentoml.io import Multipart
from bentoml.io import Image
from bentoml.io import NumpyNdarray, JSON

INPUT_SPEC = Multipart(images=Image())
OUTPUT_SPEC = JSON()

@service.api(input=INPUT_SPEC, output=OUTPUT_SPEC)
def predict(images: np.ndarray):
    """Transform the data then make predictions"""
    print(images)
    # ...rest of the predictor code

I send the images this way:

import requests

from requests_toolbelt.multipart.encoder import MultipartEncoder

m = MultipartEncoder(
    fields={
        "field1": ("filename", open("./multi/input_1.png", "rb"), "image/png"),
        "field2": ("filename", open("./multi/input_2.png", "rb"), "image/png"),
        "field3": ("filename", open("./multi/input_3.png", "rb"), "image/png"),
        "field4": ("filename", open("./multi/input_4.png", "rb"), "image/png"),
    }
)

response = requests.post(
    "http://localhost:3000/predict",
    data=m,
    headers={"Content-Type": m.content_type},
)

print(response.json())

The response is an error, because what the handler receives is not an iterable of images.

I was searching for an example with multiple images in your repo, but only found ones with a single image.
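For context, note that the `Multipart(images=Image())` spec above declares exactly one field named `images`, while the client posts `field1` through `field4`. A stdlib-only sketch of that mismatch (this is a hypothetical simplification for illustration, not BentoML's actual dispatch code):

```python
# Hypothetical simplification: a Multipart IO descriptor maps each uploaded
# form field to the handler keyword argument of the same name; fields whose
# names are not declared in the spec never reach the handler.
def dispatch(spec_fields, uploaded):
    """Keep only uploads whose field name appears in the spec."""
    return {name: data for name, data in uploaded.items() if name in spec_fields}

spec = {"images"}  # Multipart(images=Image()) declares a single field, "images"
uploaded = {
    "field1": b"png-bytes-1",
    "field2": b"png-bytes-2",
    "field3": b"png-bytes-3",
    "field4": b"png-bytes-4",
}

print(dispatch(spec, uploaded))  # {} -- no client field name matches the spec
```

One thing worth trying is declaring one named `Image()` field per expected upload (e.g. `Multipart(field1=Image(), field2=Image(), ...)`) so the client's field names line up with the handler's keyword arguments.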

ghunkins commented 1 year ago

@enmanuelmag Did you ever figure out a solution to this? Running into the same thing with bentoml==1.0.25.

frostming commented 4 months ago

You need to install the missing libraries in the container, via a custom Docker template or a setup.sh script:

https://stackoverflow.com/a/63377623
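For reference, the missing system libraries can also be declared in `bentofile.yaml` via the `docker.system_packages` option. A minimal sketch, assuming the missing `libGL.so.1` comes from `opencv-python` and a Debian/Ubuntu base image (exact package names depend on the distro):

```yaml
docker:
  system_packages:
    # libGL.so.1 and libglib are the usual missing dependencies of opencv-python
    - libgl1
    - libglib2.0-0
```

After adding this, rebuild with `bentoml build` and `bentoml containerize` so the new image picks up the packages.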