Open lifelongeeek opened 1 month ago
Thanks for sharing your work on LLM quantization and ONNX export.
I followed the script in the 'Convert to onnx model' section and got the error below. Do you know any possible reason for it?

Hi @lifelongeeek, thanks for reporting the issue.
I have reproduced the problem and will work out a solution.
Could you work around it for now by setting groupsize to 128 or 256? At the moment, groupsize must be less than 2048.
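In case it helps while the fix is pending: groupsize is the number of consecutive weights that share one quantization scale, so 128 or 256 just means finer groups than whatever value hit the limit. The snippet below is a minimal, illustrative sketch of group-wise quantization, not the repo's implementation; the function name, shapes, and bit width are assumptions made only to show what the parameter controls.

```python
# Illustrative sketch of group-wise weight quantization (NOT the repo's code).
# Each contiguous group of `group_size` weights along the input dimension gets
# its own scale; smaller groups (128/256) mean more scales but usually better
# accuracy, and they stay within the current "< 2048" limit.
import numpy as np

def quantize_groupwise(w: np.ndarray, group_size: int = 128, bits: int = 4):
    """Symmetric per-group quantization of a 2-D weight matrix [out, in]."""
    out_dim, in_dim = w.shape
    assert in_dim % group_size == 0, "in_dim must be divisible by group_size"
    qmax = 2 ** (bits - 1) - 1
    groups = w.reshape(out_dim, in_dim // group_size, group_size)
    # One scale per group; epsilon guards against all-zero groups.
    scales = np.maximum(np.abs(groups).max(axis=-1, keepdims=True) / qmax, 1e-8)
    q = np.clip(np.round(groups / scales), -qmax - 1, qmax)  # int codes
    return q.reshape(out_dim, in_dim), scales.squeeze(-1)

w = np.random.randn(32, 1024).astype(np.float32)
q, s = quantize_groupwise(w, group_size=128)   # 1024 / 128 = 8 groups per row
print(q.shape, s.shape)                        # (32, 1024) (32, 8)
```

In the repo itself, the workaround should just be passing 128 or 256 wherever the quantization config or command line accepts the groupsize argument.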