nlpxucan / WizardLM

LLMs built upon Evol-Instruct: WizardLM, WizardCoder, WizardMath

Quantizing with auto_gptq gives avg loss: nan #237

Open hgcdanniel opened 5 months ago

hgcdanniel commented 5 months ago

Hi, when I try to use https://github.com/PanQiWei/AutoGPTQ to quantize the model fine-tuned from StarCoder, every layer reported during quantization (Quantizing attn.c_attn in layer 1/40, 2/40, ...) shows avg loss: nan, no matter whether I use 8-bit or 4-bit. Could you share your quantization script?

Thanks
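For reference, a minimal sketch of an AutoGPTQ quantization script, following the library's documented `AutoGPTQForCausalLM` / `BaseQuantizeConfig` API. The model path, config values, and calibration text below are illustrative assumptions, not the WizardCoder authors' actual script or settings:

```python
# Hedged sketch of AutoGPTQ quantization; model_path and all
# config values are hypothetical placeholders, not the authors' setup.
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM, BaseQuantizeConfig

model_path = "path/to/finetuned-starcoder"  # hypothetical local checkpoint

quantize_config = BaseQuantizeConfig(
    bits=4,          # 4 or 8; the issue reports nan at both
    group_size=128,
    desc_act=False,
)

tokenizer = AutoTokenizer.from_pretrained(model_path, use_fast=True)
model = AutoGPTQForCausalLM.from_pretrained(model_path, quantize_config)

# Calibration examples drive the per-layer reconstruction loss that
# AutoGPTQ prints as "avg loss"; empty or degenerate examples are a
# plausible source of nan values there.
examples = [
    tokenizer("def hello_world():\n    print('hello')", return_tensors="pt")
]

model.quantize(examples)
model.save_quantized("starcoder-4bit-gptq")
```

This cannot answer why the loss is nan in your run, but checking that the calibration batch tokenizes to non-empty `input_ids`, and that the fine-tuned weights themselves contain no nan/inf values before quantization, would narrow it down.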