Closed by blair0011 1 year ago
Hi @blair0011, can you check with galai version 1.1.0? You can load the model to RAM with `load_model(..., num_gpus=0)`.
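To make the suggestion above concrete, here is a minimal sketch of loading the `mini` model on CPU only. It assumes galai >= 1.1.0 is installed; note that the call downloads the model weights on first run, so it needs network access and several GB of disk and RAM.

```python
import galai as gal

# num_gpus=0 keeps the weights in system RAM and runs inference on CPU,
# so no CUDA-capable NVIDIA GPU is required.
model = gal.load_model("mini", num_gpus=0)

# Generate a short continuation to confirm the model works on CPU.
print(model.generate("The Transformer architecture"))
```

CPU inference is considerably slower than GPU inference, but it is enough to try out the smaller checkpoints on machines without a compatible GPU.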
Hi @mkardas, I could load the 'mini' model perfectly by setting `num_gpus=0`. Thanks!
I found the issue: WSL needed an update and my VM had to be converted to WSL 2. Now everything works. Thanks!
Please remove the CUDA requirement, or export the model in a device-agnostic format like ONNX. Some people just want to try out your models, but their systems may not have a compatible NVIDIA GPU.