CVI-SZU / Linly

Chinese-LLaMA 1&2 and Chinese-Falcon base models; the ChatFlow Chinese dialogue model; a Chinese OpenLLaMA model; NLP pre-training and instruction fine-tuning datasets

Pre-training corpus length of the 33B model #96

Open minlik opened 1 year ago

minlik commented 1 year ago

Hi, I noticed that the pre-training corpus length for the 33B model is 512. If I continue with instruction fine-tuning on top of it, and the fine-tuning samples are longer than 512 tokens, will that cause any problems?

parkLGW commented 1 year ago

Have you found an answer to this question yet?

ydli-ai commented 1 year ago

It has no impact.
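
For anyone landing here with the same question, below is a minimal sketch (not code from this repo; the checkpoint path, the example text, and the max_length value are hypothetical placeholders) of how one might tokenize instruction data at a length longer than the 512-token pre-training cutoff using the Hugging Face transformers LLaMA classes. The point it illustrates is that LLaMA uses rotary position embeddings computed on the fly rather than learned absolute position vectors, so no weights need to be resized to fine-tune on longer sequences.

```python
# Minimal sketch, not from the Linly codebase: the checkpoint path,
# example text, and max_length below are hypothetical placeholders.
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer

model_path = "path/to/linly-llama-33b"  # hypothetical local checkpoint
tokenizer = LlamaTokenizer.from_pretrained(model_path)
model = LlamaForCausalLM.from_pretrained(model_path, torch_dtype=torch.float16)

# A placeholder instruction/response pair that tokenizes to more than 512 tokens.
sample = "Instruction: ...\nResponse: ..."

# Tokenize at a longer max_length than the 512 used during pre-training.
# LLaMA's rotary position embeddings are computed on the fly, so nothing
# in the checkpoint has to be resized for the longer sequence.
inputs = tokenizer(sample, truncation=True, max_length=1024, return_tensors="pt")

# Standard causal-LM fine-tuning step: the labels are the input ids themselves.
outputs = model(**inputs, labels=inputs["input_ids"])
outputs.loss.backward()
```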

