Hello. Is it possible to run inference on the CPU? I want to do an inference test on the CPU because I don't have GPU hardware.
Sure, you can remove the GPU (CUDA) calls during model loading and inference: delete "model = model.cuda()", and change "img = img.cuda().float() / 255.0" to "img = img.float() / 255.0" (keep the .float() cast, only drop the .cuda() call).
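A minimal sketch of CPU-only inference, assuming a PyTorch checkpoint; the file names "weights.pt" and "test.jpg" are placeholders, and the checkpoint layout may differ in this repo:

```python
import cv2
import torch

# Load the checkpoint onto the CPU instead of calling model.cuda()
ckpt = torch.load("weights.pt", map_location="cpu")
model = ckpt["model"] if isinstance(ckpt, dict) and "model" in ckpt else ckpt
model = model.float().eval()

# Preprocess on the CPU: BGR -> RGB, HWC -> CHW, add batch dim, scale to [0, 1]
img = cv2.imread("test.jpg")
img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
img = torch.from_numpy(img).permute(2, 0, 1).unsqueeze(0)
img = img.float() / 255.0  # keep .float(), drop .cuda()

with torch.no_grad():
    pred = model(img)
```

Expect CPU inference to be noticeably slower than GPU, but it is fine for a functional test.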