JoesSattes opened 3 days ago
Thank you for the impressive work! On consumer GPUs, it’s tough to run larger models without quantization. Could you please provide quantized versions (7B, 70B)? Sharing these on Hugging Face would be greatly appreciated.