kuki2008 opened 2 weeks ago
It doesn't look like there's an error (?). It seems to be interrupted in the middle of downloading adapter_model.bin.
Any ideas why it's interrupting?
Hm, potentially the notebook is timing out. Not sure, but it might be on Colab's side.
Okay, maybe I should try running it on my local machine, but I don't have an Nvidia GPU, so I have a question: will it run without CUDA?
Hm, worth a shot. Nothing in the library is CUDA-specific, but it's totally possible PyTorch issues pop up due to not having it.
Traceback (most recent call last):
File "c:\Users\Kuki\Documents\VS-Projects\python\newra_7.1\multi_token\scripts\serve_model.py", line 32, in <module>
model, tokenizer = load_trained_lora_model(
^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\Kuki\Documents\VS-Projects\python\newra_7.1\multi_token\scripts\multi_token\inference.py", line 52, in load_trained_lora_model
model = model_cls.from_pretrained(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Kuki\AppData\Local\Programs\Python\Python311\Lib\site-packages\transformers\modeling_utils.py", line 3030, in from_pretrained
raise RuntimeError("No GPU found. A GPU is needed for quantization.")
RuntimeError: No GPU found. A GPU is needed for quantization.
Looks like I need to have a GPU.
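That RuntimeError is raised by transformers when a bitsandbytes-quantized load (e.g. load_in_4bit / load_in_8bit) is requested on a machine with no CUDA device. A hedged workaround sketch, assuming the load in multi_token's inference.py is a standard from_pretrained call (I haven't checked its exact signature): build the kwargs conditionally so quantization is only requested when a GPU is actually present, and fall back to a plain CPU load otherwise.

```python
# Sketch: choose from_pretrained kwargs based on CUDA availability,
# so the bitsandbytes quantized path is only used when a GPU exists.
import torch

def build_load_kwargs():
    if torch.cuda.is_available():
        # GPU present: 4-bit bitsandbytes quantization is allowed.
        from transformers import BitsAndBytesConfig
        return {
            "quantization_config": BitsAndBytesConfig(load_in_4bit=True),
            "device_map": "auto",
        }
    # No GPU: skip quantization entirely (requesting it here is what
    # triggers "No GPU found. A GPU is needed for quantization.").
    return {"torch_dtype": torch.float32, "device_map": {"": "cpu"}}

kwargs = build_load_kwargs()
# The model name below is a placeholder, not necessarily what multi_token uses:
# model = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1", **kwargs)
```

Note that a 7B model in float32 on CPU needs roughly 28 GB of RAM and will be slow, so even if the load succeeds, inference on a machine without a GPU may not be practical.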
So, I was trying to run this in Google Colab, and then I got this. Here is the full log:
@sshh12