Closed — Naugustogi closed this issue 1 year ago
I had the same issue. The quickstart seems to install the CPU-only version of PyTorch by default, but you need the CUDA-enabled version. Use pip/conda to uninstall the version of PyTorch you have, then install the CUDA version using the instructions here.
Before downloading, double-check which version of CUDA you have installed so you pick the right torch build. You can do this by running `nvcc --version` from the command line.
Good luck!
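To illustrate matching the CUDA version to a PyTorch wheel tag, here is a small sketch that parses the relevant line of `nvcc --version` output. The `cuda_wheel_tag` helper is hypothetical and for illustration only; the actual fix is picking the matching install command from the PyTorch website.

```python
import re

def cuda_wheel_tag(nvcc_output: str) -> str:
    """Map `nvcc --version` output to a PyTorch wheel tag like 'cu117'.

    Hypothetical helper, for illustration only.
    """
    match = re.search(r"release (\d+)\.(\d+)", nvcc_output)
    if match is None:
        raise ValueError("could not find a CUDA release in nvcc output")
    major, minor = match.groups()
    return f"cu{major}{minor}"

# Example line from `nvcc --version` on a CUDA 11.7 install
sample = "Cuda compilation tools, release 11.7, V11.7.64"
print(cuda_wheel_tag(sample))  # cu117
```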
I have the same issue on a MacBook Pro with an AMD graphics card. I don't think installing a CUDA-enabled version of PyTorch is an option in my case.
CUDA 11.7 with a GPU is already installed. I could use an Anaconda environment, but I don't have much experience with that. It still doesn't work.
Has anyone found a workaround to this?
Is the CPU-only version also installed? If so, try uninstalling it. Otherwise, it sounds like an environment issue, and I would make a new conda/venv environment. Both are relatively easy to set up; here's a good place to start.
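For the venv route, a minimal sketch using Python's standard-library `venv` module (the `galai-env` name is just an example; the commented-out install line is where galai and a CUDA build of torch would go):

```shell
# Minimal sketch: start from a clean virtual environment
python3 -m venv galai-env        # create a fresh, isolated environment
. galai-env/bin/activate         # activate it (Windows: galai-env\Scripts\activate)
python -m pip --version          # pip now resolves inside galai-env
# pip install galai              # then install galai into the clean environment
```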
Another attempt with the Hugging Face transformers library worked. It was maybe a bit complicated, and I also had to use a CPU version of a package.
Hi @Naugustogi, can you check if you still experience the issue with galai version 1.1.0? You should be able to use the model on CPU with `load_model(..., num_gpus=0)`.
`num_gpus=0` doesn't work either:
AssertionError: Torch not compiled with CUDA enabled
@Naugustogi any chance you can provide the full stack trace?
It happened after I started the program normally with inference:
import galai as gal
model = gal.load_model(name='mini', num_gpus=0)
model.generate("Scaled dot product attention:\n\n\[")
I just use the CPU version.
┌─────────────────────────────── Traceback (most recent call last) ────────────────────────────────┐
│ F:\galai-1.0.0\start.py:2 in
Thanks @Naugustogi. The traceback shows galai 1.0.0. Can you try with 1.1.2?
I'm not sure where to get that; in this repo, it's just version 1.0.0 (3 weeks ago).
@Naugustogi You can install it with pip, or clone the main git branch (currently at 1.1.2; you can verify by inspecting the setup.py file in your installation).
Alright, 1.1.2 doesn't work either. It won't even show me any error; after starting, it just returns the main folder.
> it returns the main folder
What do you mean? If you are running it as a script, you need to wrap the last line in `print()`.
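The difference is easiest to see with a stand-in: the `generate` function below is a hypothetical placeholder for `model.generate`, not galai's API, used only to show why the bare call prints nothing in a script.

```python
# In the interactive interpreter, a bare expression's value is echoed back;
# in a script, that value is silently discarded. Wrap it in print() to see it.

def generate(prompt):
    # stand-in for model.generate(); hypothetical, for illustration only
    return prompt + " -> [generated text]"

generate("Scaled dot product attention")         # produces no output in a script
print(generate("Scaled dot product attention"))  # prints the returned string
```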
OK, it worked.
Unless there's something special about your setup, the normal quickstart install just doesn't work.