ymcui / Chinese-LLaMA-Alpaca-2

中文LLaMA-2 & Alpaca-2大模型二期项目 + 64K超长上下文模型 (Chinese LLaMA-2 & Alpaca-2 LLMs with 64K long context models)
Apache License 2.0
7.01k stars · 571 forks

Update requirements #467

Closed: iMountTai closed this PR 7 months ago

iMountTai commented 7 months ago

Description

This PR updates the requirements to accommodate the new default weight-saving format in the latest version of peft.
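For context (a hedged sketch, not code from this PR): recent peft releases save LoRA adapters as `adapter_model.safetensors` by default (`safe_serialization=True`), whereas older releases wrote `adapter_model.bin`. A loading script that hard-codes the old filename can check for both; the helper name below is hypothetical:

```python
import os
import tempfile

def find_adapter_weights(adapter_dir):
    """Return the path to the adapter weight file, whichever format was used.

    Newer peft versions write `adapter_model.safetensors` by default;
    older versions wrote a PyTorch `adapter_model.bin`. Accept either.
    """
    for name in ("adapter_model.safetensors", "adapter_model.bin"):
        path = os.path.join(adapter_dir, name)
        if os.path.exists(path):
            return path
    raise FileNotFoundError(f"no adapter weights found in {adapter_dir}")

# Demo: simulate a directory saved by a recent peft version.
demo_dir = tempfile.mkdtemp()
open(os.path.join(demo_dir, "adapter_model.safetensors"), "w").close()
weights = find_adapter_weights(demo_dir)
```

Alternatively, passing `safe_serialization=False` to peft's `save_pretrained` restores the old `.bin` output, but pinning the requirement (as this PR does) keeps downstream scripts unchanged.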

Related Issue

#464

Explanation of Changes