openvinotoolkit / anomalib

An anomaly detection library comprising state-of-the-art algorithms and features such as experiment management, hyper-parameter optimization, and edge inference.
https://anomalib.readthedocs.io/en/latest/

Inference Bug - Too many variables to unpack #56

Closed: marvision-ai closed this issue 2 years ago

marvision-ai commented 2 years ago

Hello, thank you for the great repo!

Describe the bug
I cannot run inference.

To Reproduce
Steps to reproduce the behavior:

conda install openvino-ie4py-ubuntu20 -c intel
conda install pytorch==1.8.0 torchvision==0.9.0 torchaudio==0.8.0 cudatoolkit=11.1 -c pytorch -c conda-forge

python tools/inference.py \
    --model_config_path anomalib/models/padim/config.yaml \
    --weight_path results/padim/mvtec/leather/weights/model.ckpt \
    --image_path datasets/MVTec/leather/test/color/000.png
Traceback (most recent call last):
  File "tools/inference.py", line 90, in <module>
    infer()
  File "tools/inference.py", line 78, in infer
    output = inference.predict(image=args.image_path, superimpose=True)
  File "anomalib/anomalib/core/model/inference.py", line 90, in predict
    anomaly_map, pred_score = self.post_process(predictions, meta_data=meta_data)
ValueError: too many values to unpack (expected 2)
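
For context, this traceback is the standard Python unpacking mismatch: the caller in predict unpacks two values, but post_process evidently returns more. A minimal, self-contained sketch of the failure mode (the third return value here is hypothetical, purely for illustration, not anomalib's actual signature):

import numpy as np

def post_process(predictions):
    """Hypothetical post-processing that grew a third return value."""
    anomaly_map = predictions
    pred_score = float(anomaly_map.max())
    pred_label = pred_score > 0.5  # hypothetical extra return value
    return anomaly_map, pred_score, pred_label  # now a 3-tuple

predictions = np.array([[0.1, 0.9], [0.2, 0.4]])

try:
    # The caller still expects the old 2-tuple contract:
    anomaly_map, pred_score = post_process(predictions)
except ValueError as err:
    print(err)  # too many values to unpack (expected 2)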

Hardware and Software Configuration

Additional context
I also noticed that when I run inference from a .ckpt file, it still imports OpenVINO. I expected it not to, since inference should run purely in PyTorch.

samet-akcay commented 2 years ago

Thanks for spotting this @marvision-ai! There have been some changes in the OpenVinoInferencer, and these changes should also be applied to the TorchInferencer's post_process method. Any thoughts, @djdameln, @ashwinvaidya17?
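
For reference, one way to keep two backends from drifting apart like this is to pin the return contract in a shared abstract base class. A rough sketch, with class and method shapes assumed rather than taken from anomalib's code:

from abc import ABC, abstractmethod
from typing import Any, Dict, Tuple

import numpy as np


class BaseInferencer(ABC):
    """Hypothetical base class pinning one post-processing contract."""

    @abstractmethod
    def post_process(
        self, predictions: np.ndarray, meta_data: Dict[str, Any]
    ) -> Tuple[np.ndarray, float]:
        """Must return exactly (anomaly_map, pred_score), so any caller
        that unpacks two values works with every backend."""


class TorchInferencer(BaseInferencer):
    def post_process(self, predictions, meta_data):
        anomaly_map = predictions.squeeze()
        pred_score = float(anomaly_map.max())
        return anomaly_map, pred_score  # honors the shared contract

With the contract in one place, a signature change in one backend shows up as a type-checker or review-time mismatch instead of a runtime ValueError in the other.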

ashwinvaidya17 commented 2 years ago

@samet-akcay I have fixed this in PR https://github.com/openvinotoolkit/anomalib/pull/17 (see https://github.com/openvinotoolkit/anomalib/blob/feature/ashwin/benchmarking_tools/anomalib/core/model/inference.py#L185).

xxl007 commented 2 years ago

Thanks for the great repo. Reporting the same issue.

marvision-ai commented 2 years ago

Thank you @samet-akcay and @ashwinvaidya17 for the very fast turnaround time!

Is there a reason why loading and running inference from a .ckpt file still needs to import OpenVINO? The documentation states that this should run purely in PyTorch:

If the specified weight path points to a PyTorch Lightning checkpoint file (.ckpt), inference will run in PyTorch. If the path points to an ONNX graph (.onnx) or OpenVINO IR (.bin or .xml), inference will run in OpenVINO.
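
A plausible explanation for the eager import is a module-level import somewhere on the script's import path, which runs regardless of which branch is taken. Deferring the import into the OpenVINO branch avoids this. A minimal sketch of the documented dispatch with a lazy import (the helper name and structure are assumptions, not anomalib's actual code):

from pathlib import Path

def get_inferencer(weight_path: str):
    """Hypothetical dispatcher mirroring the documented behavior."""
    extension = Path(weight_path).suffix.lower()
    if extension == ".ckpt":
        # Pure-PyTorch path: openvino is never imported here.
        import torch
        return torch.load(weight_path, map_location="cpu")
    if extension in {".onnx", ".bin", ".xml"}:
        # Deferred import: openvino is only required on this branch.
        from openvino.inference_engine import IECore
        return IECore()  # a real implementation would load the model here
    raise ValueError(f"Unsupported weight file extension: {extension}")

With this layout, a .ckpt run never touches openvino; a top-of-file import of the OpenVINO inferencer, by contrast, pulls the package in even for PyTorch-only runs, which matches the behavior reported above.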