BlinkDL / RWKV-LM

RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable). So it combines the best of RNN and transformer: great performance, fast inference, low VRAM use, fast training, "infinite" ctx_len, and free sentence embedding.
Apache License 2.0

Catastrophic forgetting with LoRA fine-tuning #141

Closed reckzhou closed 1 year ago

reckzhou commented 1 year ago

I fine-tuned entirely on samples from a vertical (domain-specific) corpus, and the model forgot all of its general-domain skills. I'm not sure how to solve this problem with this model?

Triang-jyed-driung commented 1 year ago

Please ask over at https://github.com/Blealtan/RWKV-LM-LoRA/issues instead.

BlinkDL commented 1 year ago

Mix in some general-domain data, set the learning rate correctly, and don't train for too many epochs.
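A minimal sketch of the data-mixing part of this advice, assuming the training samples are stored as JSONL (one JSON object per line). The file names, mixing ratio, and the hyperparameter values in the comments are illustrative assumptions, not the repository's actual interface:

```python
import json
import random

# Hypothetical input files (assumptions, not part of RWKV-LM / RWKV-LM-LoRA):
DOMAIN_FILE = "domain_samples.jsonl"    # vertical-domain fine-tuning samples
GENERAL_FILE = "general_samples.jsonl"  # general-domain corpus, same format
OUTPUT_FILE = "mixed_train.jsonl"

# Illustrative ratio: roughly 1 general sample for every 2 domain samples,
# so the model keeps seeing general-domain text during LoRA fine-tuning.
GENERAL_RATIO = 0.5


def load_jsonl(path):
    """Read one JSON object per non-empty line."""
    with open(path, encoding="utf-8") as f:
        return [json.loads(line) for line in f if line.strip()]


domain = load_jsonl(DOMAIN_FILE)
general = load_jsonl(GENERAL_FILE)

# Subsample the general corpus relative to the domain data, then shuffle.
random.seed(42)
n_general = min(len(general), int(len(domain) * GENERAL_RATIO))
mixed = domain + random.sample(general, n_general)
random.shuffle(mixed)

with open(OUTPUT_FILE, "w", encoding="utf-8") as f:
    for sample in mixed:
        f.write(json.dumps(sample, ensure_ascii=False) + "\n")

# Conservative settings in the spirit of the advice above (illustrative values,
# not taken from the LoRA training script): a low LoRA learning rate (~1e-4),
# only 1-3 epochs, and a held-out general-domain validation set to detect
# forgetting early.
```

The mixed file would then be passed to whatever fine-tuning script you use, with the learning rate and epoch count kept small as suggested above.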

reckzhou commented 1 year ago

Thanks.