yasenh / libtorch-yolov5

A LibTorch inference implementation of YOLOv5
MIT License
372 stars · 114 forks

Does anyone run GPU inference successfully? #19

Closed Jelly123456 closed 3 years ago

Jelly123456 commented 3 years ago

I could not run inference with GPU enabled. I followed the instructions and modified the export.py code to export the TorchScript model on GPU, but when inferring with LibTorch it cannot load the weights.

Does anyone know how to solve it?

My OS is Windows 10, and I am able to run the CPU TorchScript model.

Thanks in advance.

yasenh commented 3 years ago

Hi @Jelly123456, what is the error message? I tested a few days ago (on Ubuntu) and it works fine. What is your PyTorch version, and did you pull the latest yolov5 Python code?

Jelly123456 commented 3 years ago

@yasenh Thanks very much for your reply. There is no error message. This is the output when running with GPU:

[screenshot: console output from the GPU run]

My PyTorch version is v1.6, and I am using the latest yolov5 version.

yasenh commented 3 years ago

@Jelly123456 I tested locally without any issue. It seems to crash during `module_.forward(inputs)`; not sure if it is related to Windows. Maybe you can follow the PyTorch official example and first make sure LibTorch works on your GPU.

yasenh commented 3 years ago

Feel free to share your experience.

Jelly123456 commented 3 years ago

@yasenh Thanks very much for your testing.

I did some internet searching and found that this could be a bug in LibTorch on Windows: https://github.com/pytorch/pytorch/issues/23217 https://github.com/yf225/pytorch-cpp-issue-tracker/issues/378
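One workaround that comes up in those Windows threads (and in PyTorch's own Windows notes) is forcing the MSVC linker to keep the CUDA part of LibTorch; otherwise unreferenced GPU symbols can be stripped and CUDA calls fail at runtime. A CMake sketch, where the target name `yolov5-app` is a placeholder for this project's actual target:

```cmake
# Placeholder target; adapt to the project's CMakeLists.txt.
add_executable(yolov5-app main.cpp)
target_link_libraries(yolov5-app "${TORCH_LIBRARIES}")

# MSVC-only workaround: force the linker to reference a torch_cuda
# symbol so the CUDA library is not dropped from the final binary.
if (MSVC)
  target_link_options(yolov5-app PRIVATE "/INCLUDE:?warp_size@cuda@at@@YAHXZ")
endif()
```

Whether this fixes the forward() crash specifically depends on the root cause, but it is a cheap thing to rule out before digging deeper.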