bentoml / BentoML

The easiest way to serve AI apps and models - Build Model Inference APIs, Job queues, LLM apps, Multi-model pipelines, and more!

YOLOv5 Torch Hub Model No module named models #2386

Closed sarmientoj24 closed 2 years ago

sarmientoj24 commented 2 years ago

I'm getting this error when using a custom YOLOv5 model that I trained, loaded via Torch Hub, and then saved with BentoML.

Error

anaconda3/envs/yolov5/lib/python3.7/site-packages/torch/serialization.py:875 in find_class

    872 |        # This is useful for maintaining BC if we change a module path that tensor instantiation relies on.
    873 |        def find_class(self, mod_name, name):
    874 |            mod_name = load_module_mapping.get(mod_name, mod_name)
  ❱ 875 |            return super().find_class(mod_name, name)
    876 |
    877 |    # Load the data (which may in turn use `persistent_load` to load tensors)
    878 |    data_file = io.BytesIO(zip_file.get_record(pickle_file))

ModuleNotFoundError: No module named 'models'
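
For context, the error itself comes from unpickling: bentoml.pytorch.save in this pre-1.0 release appears to serialize the whole model object, and pickle records classes by import path. The classes of a Torch Hub YOLOv5 model live in the models package of the hub-downloaded repo, so a fresh process that has never imported that package cannot resolve models.* and the unpickler fails exactly as shown above. A minimal sketch of the same mechanism with plain torch.save/torch.load (file names here are illustrative):

import torch

# Process A: torch.hub.load imports the hub repo's `models` package and returns
# a model whose classes (e.g. models.yolo.*) live in that package.
model = torch.hub.load("ultralytics/yolov5", "custom", path="./my_model.pt")
torch.save(model, "full_model.pt")  # pickles class references by module path

# Process B (fresh interpreter, hub repo never imported):
#   torch.load("full_model.pt")
#   -> ModuleNotFoundError: No module named 'models'
# because the unpickler cannot import the `models` package to rebuild the objects.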

Packaging the model

import torch
import bentoml

# Load the custom weights through Torch Hub, then save the model with BentoML
model = torch.hub.load("ultralytics/yolov5", 'custom', path='./my_model.pt', autoshape=False, force_reload=True)
tag = bentoml.pytorch.save("yolov5", model)

Friday, 01 April, 2022 08:06:24 PM  INFO  [cli] Successfully saved Model(tag="yolov5:e3e5ljfrwsq6ig3j", path="/home/james/bentoml/models/yolov5/e3e5ljfrwsq6ig3j/")

Loading the model

import bentoml
from bentoml.io import JSON

# convert_b64_str_to_np, transform, non_max_suppression, and convert_predictions
# are user-defined helpers (not shown here).

runner = bentoml.pytorch.load_runner("yolov5:latest")
service = bentoml.Service("yolov5", runners=[runner])

@service.api(input=JSON(), output=JSON())
def predict(input_dict):
    # Retrieve data from dictionary
    filename = input_dict['filename']
    b64_encoding = input_dict['image_content']

    # Convert base64 string to numpy image
    img = convert_b64_str_to_np(b64_encoding, filename)

    # Convert to Tensor
    img = transform(img)

    # Forward prop
    predictions = runner.run_batch(img.unsqueeze(0))

    # Get boxes
    outputs = non_max_suppression(predictions)
    outputs = outputs[0]
    outputs = [convert_predictions(output.numpy()) for output in outputs]

    json_output = {"num_preds": len(outputs), "bboxes": outputs, "filename": filename}

    return json_output
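
For anyone trying this end to end, here is a rough sketch of how the service above might be exercised once it is running. The file name service.py, the /predict endpoint path (derived from the handler name), and port 3000 are assumptions based on BentoML defaults, not something confirmed in this thread:

# Start the HTTP server (assuming the snippet above lives in service.py):
#   bentoml serve service.py:service

# Hypothetical client; payload keys mirror the handler's input_dict.
import base64
import requests

with open("dog.jpg", "rb") as f:
    payload = {
        "filename": "dog.jpg",
        "image_content": base64.b64encode(f.read()).decode("utf-8"),
    }

resp = requests.post("http://127.0.0.1:3000/predict", json=payload)
print(resp.json())
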
timliubentoml commented 2 years ago

Thanks! Yes, I'm able to reproduce this issue with the following code. We will be looking into it.

import torch
import bentoml

model = torch.hub.load('ultralytics/yolov5', 'yolov5s', pretrained=True)
tag = bentoml.pytorch.save("yolov5", model)
model = bentoml.pytorch.load("yolov5:latest")

throws exception: ModuleNotFoundError: No module named 'models'
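
In case it helps anyone hitting this before upgrading: since the failure is purely an import-resolution problem, one workaround sketch is to make the hub-downloaded models package importable before loading. The cache directory name below follows torch.hub's usual <owner>_<repo>_<branch> layout and is an assumption that may differ on your machine; this is not the fix that later landed in the release:

import os
import sys

import torch
import bentoml

# Put the cached ultralytics/yolov5 checkout on sys.path so the pickled
# `models.*` class references can be resolved (directory name is assumed).
hub_repo = os.path.join(torch.hub.get_dir(), "ultralytics_yolov5_master")
sys.path.insert(0, hub_repo)

model = bentoml.pytorch.load("yolov5:latest")  # `models` is now importable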

timliubentoml commented 2 years ago

Hi @sarmientoj24! Not sure if you got past this issue or not, but I think @larme looked at this and we were able to get the fix into the latest release. With a7, I was not able to reproduce the issue anymore. Can you confirm?

timliubentoml commented 2 years ago

@sarmientoj24 Were you able to confirm if this works?

sarmientoj24 commented 2 years ago

I actually haven't tried this, since we moved away from Torch Hub models to just using ONNX.
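
For reference, the ONNX route avoids the problem entirely, since an exported graph carries no Python class references to unpickle. A rough export sketch, assuming the non-autoshaped model and a fixed 640x640 input; the yolov5 repo's own export.py is generally the more robust path:

import torch

# Export the raw detection model (autoshape=False) to ONNX; input size,
# opset, and tensor names here are assumptions for illustration.
model = torch.hub.load("ultralytics/yolov5", "custom",
                       path="./my_model.pt", autoshape=False, force_reload=True)
model.eval()

dummy = torch.zeros(1, 3, 640, 640)
torch.onnx.export(model, dummy, "my_model.onnx", opset_version=12,
                  input_names=["images"], output_names=["output"])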

seyeonseanpark commented 2 years ago

I'm facing the same issue

aarnphm commented 2 years ago

@seyeonseanpark can you try this out https://github.com/bentoml/BentoML/issues/2602#issuecomment-1165259006

seyeonseanpark commented 2 years ago

Thank you! Will this be supported in later releases?

aarnphm commented 2 years ago

We need some more rounds of discussion within the team about supporting torch hub. But would love to hear ideas from the community as well.

All progress will now be tracked in #2704.