Erland366 closed this 1 week ago
Confirmed Mistral working (GGUF conversion takes ages!)
Here's an example Colab to try: https://colab.research.google.com/drive/1Ac3rwXoNYGeS8xnBri4k6oapyeuH7ui0?usp=sharing
unsloth/Llama-3.2-1B-Instruct-bnb-4bit
confirmed working
@Erland366 I think you added a function to move it to /tmp on Kaggle - is there a way to force it to use /tmp for all Kaggle machines?
Fixed this issue by setting environment variables before importing unsloth. Tried setting them inside `save_gguf`, but that doesn't work .-. Where I found the solution -> https://github.com/protocolbuffers/protobuf/issues/3002#issuecomment-325459597
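For reference, a minimal sketch of the fix described above. The linked protobuf issue suggests forcing the pure-Python protobuf backend via `PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION`; that this is the exact variable unsloth needs is my assumption from the link, and the key point is that it must be set *before* the import, since protobuf reads it at import time:

```python
import os

# Must run before unsloth (or anything else that pulls in protobuf)
# is imported -- protobuf picks its backend once, at import time.
# Forcing the pure-Python backend is the workaround from the linked
# protobuf issue; the variable name unsloth needs is an assumption here.
os.environ["PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION"] = "python"

# import unsloth  # <- only import after the environment is configured
```

This is why putting the same assignment inside `save_gguf` is too late: protobuf has already been imported by then, so the variable is never read.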
Will evaluate on other models first, then will open the PR.
Need to create a separate PR for testing on Kaggle, since even `saving_to_gguf` doesn't work on Kaggle because of limited space (we haven't moved that function to /tmp).
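A possible sketch for the "force /tmp on all Kaggle machines" idea discussed above. Detecting Kaggle via the `KAGGLE_KERNEL_RUN_TYPE` environment variable is a common heuristic, not something unsloth does today, and `running_on_kaggle` is a hypothetical helper name:

```python
import os
import tempfile

def running_on_kaggle() -> bool:
    # Kaggle kernels expose this variable; checking for it is a
    # heuristic, not an official API guarantee.
    return "KAGGLE_KERNEL_RUN_TYPE" in os.environ

# Redirect temporary files (e.g. intermediate GGUF artifacts) to /tmp,
# which on Kaggle typically has more scratch space than the working dir.
if running_on_kaggle():
    os.environ["TMPDIR"] = "/tmp"
    tempfile.tempdir = "/tmp"
```

Run once at startup, before any GGUF conversion, so every `tempfile` call in the save path lands on /tmp instead of the space-limited working directory.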