ymcui / Chinese-LLaMA-Alpaca-2

Chinese LLaMA-2 & Alpaca-2 large-model phase-2 project + 64K ultra-long-context models (Chinese LLaMA-2 & Alpaca-2 LLMs with 64K long context models)
Apache License 2.0

Should the block_size for secondary pre-training match the block_size for instruction fine-tuning? #466

Closed. AIFFFENG closed this issue 6 months ago.

AIFFFENG commented 7 months ago

The following items must be checked before submitting

Issue type

Model training and fine-tuning

Base model

None

Operating system

Linux

Detailed description of the problem

# Paste the code you ran here (inside this code block)

Dependencies (required for code-related issues)

# Paste your dependency information here (inside this code block)

Run logs or screenshots

# Paste your run logs here (inside this code block)
iMountTai commented 7 months ago

There is no strict requirement that the two match; the only constraint is that neither exceeds the model's natively supported context length.
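
To make the maintainer's answer concrete, here is a minimal Python sketch of the constraint. The specific values (512, 1024, 4096) are assumptions for illustration only, not settings taken from the repo's training scripts; Llama-2's native context window is 4K.

```python
# Minimal sketch (hypothetical values, not from the repo's scripts) of the
# rule above: the pre-training block size and the fine-tuning sequence
# length may differ, as long as neither exceeds the model's native context.

PT_BLOCK_SIZE = 512        # hypothetical block_size for secondary pre-training
SFT_MAX_SEQ_LENGTH = 1024  # hypothetical max length for instruction fine-tuning
NATIVE_CONTEXT = 4096      # Llama-2's native 4K context window

for name, length in [("pre-training block size", PT_BLOCK_SIZE),
                     ("fine-tuning max length", SFT_MAX_SEQ_LENGTH)]:
    assert length <= NATIVE_CONTEXT, f"{name} ({length}) exceeds native context"
print("Both lengths fit within the native context; they need not be equal.")
```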

github-actions[bot] commented 6 months ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your consideration.