ymcui / Chinese-LLaMA-Alpaca-2

中文LLaMA-2 & Alpaca-2大模型二期项目 + 64K超长上下文模型 (Chinese LLaMA-2 & Alpaca-2 LLMs with 64K long context models)
Apache License 2.0

Update the code to support the latest dependency #393

Closed · iMountTai closed this pull request 9 months ago

iMountTai commented 10 months ago

Description

The PR has the following changes:

  1. Support the latest dependencies. Currently supported scripts: training, inference, LongBench, LangChain, etc.
  2. Use Transformers' native FlashAttention-2 support during training by setting --use_flash_attention_2 True.
  3. Set --full_finetuning True to enable full-parameter training.
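
A minimal sketch of how the two new script-level flags might be parsed (flag names come from the PR description; the surrounding training code, and the `str2bool` helper, are assumptions for illustration — the real scripts define many more arguments):

```python
import argparse

def str2bool(v: str) -> bool:
    # Accept the "--flag True/False" style shown in the PR description.
    return v.lower() in ("true", "1", "yes")

def build_parser() -> argparse.ArgumentParser:
    # Only the two flags from this PR are sketched here.
    parser = argparse.ArgumentParser(description="training script flags (sketch)")
    parser.add_argument("--use_flash_attention_2", type=str2bool, default=False,
                        help="Use Transformers' native FlashAttention-2 during training.")
    parser.add_argument("--full_finetuning", type=str2bool, default=False,
                        help="Train all parameters instead of a LoRA adapter.")
    return parser

if __name__ == "__main__":
    args = build_parser().parse_args(
        ["--use_flash_attention_2", "True", "--full_finetuning", "True"]
    )
    print(args.use_flash_attention_2, args.full_finetuning)  # True True
```

Downstream, a value parsed this way would typically be forwarded to the model-loading call (e.g. `from_pretrained`) and to the trainer setup; the exact wiring depends on the script.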

Related Issues

#353 #390 #385
