chequanghuy / TwinLiteNet

MIT License

Is it convenient to make a CPU inference? #2

Closed waveng closed 1 year ago

waveng commented 1 year ago

Hello. Is it possible to run inference on the CPU? I want to do an inference test on the CPU, because I don't have GPU hardware.

chequanghuy commented 1 year ago

Sure, you can remove the GPU (CUDA) calls during model loading and inference: remove `model = model.cuda()`, and change `img = img.cuda().float() / 255.0` to `img = img / 255.0`.
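A more portable alternative to deleting the `.cuda()` calls is to select the device at runtime, so the same script runs on both CPU-only and GPU machines. This is a minimal sketch, not the repository's actual code: the `torch.nn.Identity()` model is a stand-in for the real TwinLiteNet network, and the tensor shape is illustrative.

```python
import torch

# Pick CUDA when available, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Stand-in for the TwinLiteNet model (illustrative only).
model = torch.nn.Identity()
model = model.to(device).eval()   # replaces the hard-coded model.cuda()

# Dummy uint8 image batch (N, C, H, W); shape is illustrative.
img = torch.randint(0, 256, (1, 3, 360, 640), dtype=torch.uint8)
img = img.to(device).float() / 255.0   # replaces img.cuda().float() / 255.0

with torch.no_grad():  # no gradient tracking needed for inference
    out = model(img)

print(out.shape, out.device.type)
```

With this pattern, nothing needs to be removed for a CPU-only test; `torch.cuda.is_available()` simply resolves to `False` and everything stays on the CPU.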