datawhalechina / self-llm

《开源大模型食用指南》 (A Hands-On Guide to Open-Source LLMs): quick deployment of open-source large language models on Linux, a deployment tutorial tailored for users in China
Apache License 2.0

LoRA fine-tuning: loss.backward() raises an error during backpropagation #189

Open meteoryet opened 1 week ago

meteoryet commented 1 week ago

When fine-tuning a model with LoRA, calling the backward pass loss.backward() inside the Trainer raises the error "element 0 of tensors does not require grad and does not have a grad_fn". Some tutorials suggest adding loss.requires_grad_(True) before that line; doing so does make the error go away, but then the parameters no longer update. Is there a way to fix this?
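
For context (not part of the original thread): calling loss.requires_grad_(True) turns the loss into a leaf tensor detached from the model's parameters, so backward() runs but produces no gradients for them, which is why nothing updates. The error itself typically means no trainable parameter is connected to the loss, e.g. gradient checkpointing is enabled on a frozen base model without enabling input gradients, or the LoRA adapters were never attached. A minimal sketch of a Transformers + PEFT setup that avoids this, assuming a causal-LM model and a hypothetical model path:

```python
# Minimal sketch (assumptions: Hugging Face Transformers + PEFT,
# a causal LM, and a hypothetical model path).
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

model = AutoModelForCausalLM.from_pretrained("model_name_or_path")  # hypothetical

# If training with gradient_checkpointing=True, the inputs flowing through
# the frozen base layers must require grads; otherwise loss.backward() fails
# with "element 0 of tensors does not require grad and does not have a grad_fn".
model.enable_input_require_grads()

lora_config = LoraConfig(
    r=8,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],  # module names vary by architecture
    lora_dropout=0.1,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # sanity check: LoRA params should be trainable
```

If print_trainable_parameters() reports zero trainable parameters, the adapters were not attached and the error above is expected.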

KMnO4-zx commented 1 week ago

Please specify which model and which tutorial you were following so we can reproduce the issue, and attach a screenshot of the error.

meteoryet commented 1 week ago

Screenshot of the issue: (image attached)

KMnO4-zx commented 1 week ago

Hmm, this doesn't appear to be a tutorial from this repository. Please file the issue in the corresponding repo instead.