petteriTeikari / minivess_mlops

Charissa Poon, Petteri Teikari et al. (2023): "A dataset of rodent cerebrovasculature from in vivo multiphoton fluorescence microscopy imaging", Scientific Data 10, 141 doi: 10.1038/s41597-023-02048-8
http://doi.org/10.1038/s41597-023-02048-8

BentoML using custom project structure #13

Closed: petteriTeikari closed this issue 12 months ago

petteriTeikari commented 12 months ago

Getting the following error:

Error: [bentoml-cli] `build` failed: Failed to import module "service": No module named 'service'

This happens when running a bento build with a custom build context as follows (from the project root):

bentoml build -f deployment/bentoml/bentofile.yaml ./src/

With the following project structure:

├── deployment
│   └── bentoml
│       ├── bentofile.yaml
│       ├── service.py
├── src
│   └── inference
│       ├── __init__.py
├── __init__.py
├── README.md
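Given this layout, a hedged illustration (not BentoML's actual internals) of why the build fails: the positional ./src argument becomes the build context, and the service: "service:svc" reference in the bentofile is resolved relative to it, so service.py under deployment/bentoml/ is never found. Roughly:

import importlib
import sys

# Stand-in for the custom build context passed as ./src
sys.path.insert(0, "./src")

# There is no service.py inside ./src, so this raises
# ModuleNotFoundError: No module named 'service'
importlib.import_module("service")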

The bento build itself seems to work, followed by bentoml serve service.py:svc:

[screenshot: terminal output of the service being served successfully]

There is no ModuleNotFoundError: No module named 'src.inference' at this point; that error only comes up during containerization (bentoml containerize minivess-segmentor:a6bgb6d3nob4nj6o --no-cache), because the build context is not correct.

service.py

import os
import sys
import bentoml

# service.py lives in deployment/bentoml/, so walk two directory levels up
# and put the project root on sys.path so that project-internal imports
# (e.g. src.inference) resolve when running locally
bentoml_path = os.path.dirname(os.path.abspath(__file__))
project_path = os.path.split(os.path.split(bentoml_path)[0])[0]
sys.path.insert(0, project_path)

# Hard-coded at the moment in bentoml_utils.py for testing purposes
MODEL_NAME = 'mlflow_model'

# Fetch the latest MLflow-imported model from the local BentoML model store
# and wrap it in a runner
minivess_runner = bentoml.mlflow.get(f'{MODEL_NAME}:latest').to_runner()

svc = bentoml.Service('minivess-segmentor', runners=[minivess_runner])

# Expect float32 volumes of shape (batch, 1 channel, any three spatial dims)
input_spec = bentoml.io.NumpyNdarray(
    dtype="float32",
    shape=(-1, 1, -1, -1, -1),
    enforce_shape=True,
    enforce_dtype=True,
)

@svc.api(input=input_spec, output=bentoml.io.NumpyNdarray())
def predict(input_arr):
    # Delegate to the MLflow runner's predict signature
    return minivess_runner.predict.run(input_arr)
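
As a usage sketch (not from the repo), the predict API above can be called over HTTP once the service is running locally, e.g. via bentoml serve service.py:svc. The endpoint name and port are BentoML defaults; the dummy volume's spatial dimensions are illustrative assumptions only:

import numpy as np
import requests

# (batch, channel, z, y, x) matching the (-1, 1, -1, -1, -1) input_spec above;
# the 8x64x64 spatial size is an arbitrary placeholder
dummy_volume = np.zeros((1, 1, 8, 64, 64), dtype="float32")

resp = requests.post(
    "http://localhost:3000/predict",  # default port; endpoint named after predict()
    json=dummy_volume.tolist(),       # NumpyNdarray descriptor accepts a JSON array
)
resp.raise_for_status()
prediction = np.asarray(resp.json())
print(prediction.shape)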

bentofile.yaml

# https://docs.bentoml.org/en/latest/concepts/bento.html#build-a-bento
service: "service:svc"
description: "file: ./README.md"
# https://docs.bentoml.org/en/latest/guides/containerization.html#custom-base-image
docker:
  base_image: "petteriteikari/minivess-mlops-env:latest"
labels:
  owner: petteri
  stage: dev
include:
  # https://docs.bentoml.org/en/latest/concepts/bento.html#files-to-include
  - "**/*.py"
exclude:
  - "data/"
#conda:
#  environment_yml: "./conda.yaml"
#python:
#  requirements_txt: "../requirements.txt"
models: # The model to be used for building the Bento.
  - mlflow_model:latest
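
For context, the mlflow_model:latest tag referenced both here and in service.py presumably comes from importing an MLflow model into the local BentoML model store, roughly along these lines (the MLflow model URI below is a placeholder, not the project's actual one):

import bentoml

bento_model = bentoml.mlflow.import_model(
    "mlflow_model",                 # name resolved by bentoml.mlflow.get() and bentofile.yaml
    "runs:/<mlflow_run_id>/model",  # placeholder MLflow model URI
)
print(bento_model.tag)              # generated tag, e.g. mlflow_model:<version>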
petteriTeikari commented 12 months ago

See an example of the paths (with bentofile.yaml placed at the project root): https://github.com/ocelotconsulting/customer_segmentation (and the accompanying write-up)

[screenshot: example file layout from the linked repository]
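
A minimal sketch of that root-level variant, assuming service.py also sits next to bentofile.yaml at the project root; the src.inference import below is a hypothetical placeholder for whatever the project actually exposes:

import bentoml
from src.inference import utils  # hypothetical module; resolves because the build context is the repo root

minivess_runner = bentoml.mlflow.get("mlflow_model:latest").to_runner()
svc = bentoml.Service("minivess-segmentor", runners=[minivess_runner])
# ...same predict API as in service.py above, but without any sys.path manipulation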