Alibaba-MIIL / ML_Decoder

Official PyTorch implementation of "ML-Decoder: Scalable and Versatile Classification Head" (2021)
MIT License
317 stars · 53 forks

Issue during inference #71

Open Debjyoti-Adhikary opened 1 year ago

Debjyoti-Adhikary commented 1 year ago

I tried to run infer.py after installing all the dependencies. I used the model "tresnet_l_COCO__448_90_0.pth" with a 448 image size (the default value) on the sample image provided with the code. For some reason, I am not getting any classes in the output. It turns out that the model is returning a vector of "nan" values. Please let me know if I am missing anything here.
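One way to narrow down an all-NaN output is to check whether the loaded weights themselves contain NaN/Inf values (which happens when a checkpoint is corrupted or loaded into mismatched layers). The helper below is a generic sketch, not part of the repository; the toy `nn.Linear` stands in for the actual TResNet model.

```python
import torch
import torch.nn as nn

def find_nan_tensors(model: nn.Module) -> list:
    """Return names of parameters/buffers containing NaN or Inf values.

    A non-empty result means the loaded weights are already bad, which
    would explain an all-NaN output vector at inference time.
    """
    bad = []
    for name, tensor in list(model.named_parameters()) + list(model.named_buffers()):
        if not torch.isfinite(tensor).all():
            bad.append(name)
    return bad

# Minimal demonstration with a toy model (stand-in for the real network):
toy = nn.Linear(4, 2)
with torch.no_grad():
    toy.weight[0, 0] = float("nan")  # simulate one corrupted weight
print(find_nan_tensors(toy))  # -> ['weight']
```

If this reports nothing, the NaNs are produced during the forward pass instead, and the usual next suspects are half-precision inference or a preprocessing mismatch.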

yangyangtiaoguo commented 1 year ago

The inference example provided by the author does not seem to work: the checkpoint's parameters do not match the model definition either. I tried the example checkpoint tresnet_l_COCO__448_90_0.pth.
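When a checkpoint's parameters "do not match the model", it helps to load with `strict=False` and inspect exactly which keys are missing or unexpected. The sketch below assumes nothing about this repo's checkpoint layout; the `module.`/`model.` prefixes it strips are common wrapper prefixes (e.g. from `DataParallel`), so inspect your own checkpoint's keys to confirm which, if any, apply.

```python
import torch
import torch.nn as nn

def report_state_dict_mismatch(model: nn.Module, state_dict: dict):
    """Strip common wrapper prefixes, load leniently, and report mismatches."""
    cleaned = {
        k.removeprefix("module.").removeprefix("model."): v
        for k, v in state_dict.items()
    }
    result = model.load_state_dict(cleaned, strict=False)
    if result.missing_keys:
        print("missing from checkpoint:", result.missing_keys)
    if result.unexpected_keys:
        print("unexpected in checkpoint:", result.unexpected_keys)
    return result

# Toy demonstration: a checkpoint saved under a 'module.' prefix.
net = nn.Linear(3, 3)
ckpt = {"module." + k: v for k, v in net.state_dict().items()}
res = report_state_dict_mismatch(net, ckpt)
# After prefix stripping, every key matches: no missing/unexpected keys.
```

The `missing_keys`/`unexpected_keys` lists usually point directly at the layer whose shape or name differs between the released checkpoint and the model being built.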

idj3tboy commented 11 months ago

> The inference example provided by the author does not seem to work. The loading parameters also do not match the model. I tried the example tresnet_l_COCO__448_90_0.pth.

Exactly! Did you find any solution to it?

Sondosmohamed1 commented 11 months ago

I tested the inference code with Stanford and it did not work. Has anyone tested it?