Open enmanuelmag opened 1 year ago
Hi @enmanuelmag, could you share more about your project structure, the content of your bentofile.yaml, and especially what the `service` field in there is?
@parano this is my bentofile.yaml:

```yaml
service: 'application/src/create_service.py:service'
include:
  - config
  - application/src/
  - Procfile
python:
  packages:
    - bentoml==1.0.17
    - hydra-core==1.3.2
    - numpy==1.23.5
    - patsy==0.5.3
    - pydantic==1.10.7
    - opencv-python==4.7.0.72
    - tensorflow==2.10.1
    - pytesseract==0.3.10
```
And this is the project structure:
```
Project-folder/
┣ application/
┃ ┣ src/
┃ ┃ ┣ create_service.py
┃ ┃ ┗ __init__.py
┃ ┣ tests/
┃ ┃ ┣ input.png
┃ ┃ ┣ test_create_service.py
┃ ┃ ┗ __init__.py
┃ ┃ ┗ __init__.cpython-39.pyc
┃ ┣ requirements.txt
┃ ┗ __init__.py
┣ config/
┃ ┗ main.yaml
┣ .dvcignore
┣ .flake8
┣ .gitignore
┣ .pre-commit-config.yaml
┣ bentofile.yaml
┣ data.dvc
┣ dev-requirements.txt
┣ Makefile
┣ metrics.csv
┣ models.dvc
┣ outputs.dvc
┣ params.yml
┣ poetry.lock
┣ pyproject.toml
┣ README.md
┣ requirements.txt
┗ server.bat
```
@parano In addition, how could I handle multiple images? I have been trying it this way, but the server only receives one image:
```python
import numpy as np

from bentoml.io import Multipart
from bentoml.io import Image
from bentoml.io import NumpyNdarray, JSON

INPUT_SPEC = Multipart(images=Image())
OUTPUT_SPEC = JSON()

@service.api(input=INPUT_SPEC, output=OUTPUT_SPEC)
def predict(images: np.ndarray):
    """Transform the data then make predictions"""
    print(images)
    # ...rest of the predictor code
```
I sent the images this way:
```python
import requests
from requests_toolbelt.multipart.encoder import MultipartEncoder

m = MultipartEncoder(
    fields={
        "field1": ("filename", open("./multi/input_1.png", "rb"), "image/png"),
        "field2": ("filename", open("./multi/input_2.png", "rb"), "image/png"),
        "field3": ("filename", open("./multi/input_3.png", "rb"), "image/png"),
        "field4": ("filename", open("./multi/input_4.png", "rb"), "image/png"),
    }
)

response = requests.post(
    "http://localhost:3000/predict",
    data=m,
    headers={"Content-Type": m.content_type},
)
print(response.json())
```
The response is this, because the input is not iterable. I searched for an example with multiple images in your repo, but only found one with a single image.
@enmanuelmag Did you ever figure out a solution to this? Running into the same thing with bentoml==1.0.25.
You need to install the missing libs in the container, via a custom Docker template or a setup.sh script:
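For example, since the bentofile above pulls in `opencv-python` and `pytesseract`, the libs commonly missing inside a container are libGL and the tesseract binary. A sketch of the `docker` section of bentofile.yaml (assuming a Debian-based base image; the exact packages depend on the actual error output, which isn't shown here):

```yaml
docker:
  system_packages:          # installed via apt in the generated Dockerfile
    - libgl1                # opencv-python needs libGL.so.1
    - tesseract-ocr         # pytesseract shells out to the tesseract binary
  setup_script: "setup.sh"  # optional: run arbitrary extra install commands
```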
Describe the bug
I am trying to start a server with Docker. The problem is that I am getting the following error:

However, if I run the server without Docker, that is, run `bentoml serve` from my native terminal in my project folder, the server does work. Clearly, I have previously executed the commands `bentoml build` and `bentoml containerize`. This is the entire output error on Docker:
To reproduce
No response
Expected behavior
No response
Environment
bentoml==1.0.17 python==3.9 platform=Windows 11 (Docker on WSL 2.0)