ymcui / Chinese-LLaMA-Alpaca-2

Chinese LLaMA-2 & Alpaca-2 large-model phase-2 project, with 64K long-context models (Chinese LLaMA-2 & Alpaca-2 LLMs with 64K long context models)
Apache License 2.0

Help wanted: how to train the model to support a longer context #384

Closed zhengzhanpeng closed 9 months ago

zhengzhanpeng commented 10 months ago

Check before submitting issues

Type of Issue

Model quantization and deployment

Base Model

Others

Operating System

macOS

Describe your issue in detail

# Please copy-and-paste your command here.

no command

Dependencies (must be provided for code-related issues)

# Please copy-and-paste your dependencies here.

no dependencies

Execution logs or screenshots

# Please copy-and-paste your logs here.

I have a model fine-tuned from Llama-2 and I would like to extend its context window, ideally to 16k. Could you provide relevant documentation or a tutorial to help me with this? Many thanks!

iMountTai commented 10 months ago

Refer to release 3.0.

zhengzhanpeng commented 10 months ago

Refer to release 3.0.

Thanks. The release notes say this was implemented with the NTK method, but they don't explain how to apply it concretely. Is there a tutorial or document I could follow?
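For readers landing on this thread: the NTK-aware approach extends the context window by raising the RoPE frequency base rather than retraining positional embeddings. A minimal sketch of the rescaling formula is below; the scaling factor of 4 and head dimension of 128 are assumptions chosen to illustrate going from LLaMA-2's 4096-token window toward ~16k, not values taken from this repo's release notes.

```python
def ntk_scaled_base(base: float, factor: float, dim: int) -> float:
    """NTK-aware RoPE scaling: raise the rotary frequency base so that
    low-frequency components are interpolated (covering more positions)
    while high-frequency components, which carry local positional
    detail, are largely preserved."""
    return base * factor ** (dim / (dim - 2))

# Assumed values: LLaMA-2 uses base=10000 and head_dim=128; a 4x
# factor targets roughly 4096 * 4 = 16k tokens of context.
new_base = ntk_scaled_base(10000.0, 4.0, 128)
print(f"scaled RoPE base: {new_base:.1f}")
```

In practice you typically do not apply this by hand: recent versions of Hugging Face `transformers` expose a `rope_scaling` field on LLaMA configs (e.g. `rope_scaling={"type": "dynamic", "factor": 4.0}` when calling `from_pretrained`), which applies an NTK-style dynamic scaling at inference time. Whether that matches this project's exact implementation is an assumption; check the repo's inference scripts for the authoritative method.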

github-actions[bot] commented 9 months ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your consideration.

github-actions[bot] commented 9 months ago

Closing the issue, since no updates observed. Feel free to re-open if you need any further assistance.