nlpxucan / WizardLM

LLMs built upon Evol Instruct: WizardLM, WizardCoder, WizardMath

Why 30B delta is 122GB? #91

Open dittops opened 1 year ago

dittops commented 1 year ago

The base LLaMA 30B is around 61 GB, but the WizardLM delta is 122 GB. Any thoughts on this?

young-chao commented 1 year ago

I think it's because the original model was saved in fp16 and now it's saved in fp32.
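The doubling is consistent with a simple bytes-per-value calculation: a checkpoint stores roughly one value per parameter, so its size scales with the storage width, and fp32 (4 bytes) takes twice the space of fp16 (2 bytes). A minimal sketch using the sizes reported in this issue:

```python
# Rough size check: checkpoint size scales with bytes per stored value.
# fp16 uses 2 bytes per value, fp32 uses 4.
fp16_size_gb = 61            # base LLaMA 30B checkpoint in fp16 (from this issue)
bytes_fp16, bytes_fp32 = 2, 4

fp32_size_gb = fp16_size_gb * bytes_fp32 / bytes_fp16
print(fp32_size_gb)  # 122.0 -- matches the reported delta size
```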