Closed — 828Tina closed this issue 5 months ago
Hi @828Tina, sorry about that! This is a known issue with xtuner v0.1.16, and we are working on it.
Until the bug is resolved, you can downgrade xtuner to 0.1.15 or install xtuner from source.
# Approach 1: downgrade
pip install xtuner==0.1.15
# Approach 2: install from source
git clone https://github.com/InternLM/xtuner.git
cd xtuner
pip install -e .
Version 0.1.15 also has this problem.
@ILG2021 Are you using a config generated with 0.1.16? If so, you need to re-copy the config under version 0.1.15.
@ILG2021 @828Tina Installing xtuner>=0.1.17 solves this issue.
pip install 'xtuner>=0.1.17'
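Since the fix landed in 0.1.17, a quick sanity check is to compare the installed version against that threshold. A minimal sketch (the function names are illustrative; the tuple comparison only handles plain x.y.z versions, not pre-releases or local suffixes):

```python
from importlib.metadata import version, PackageNotFoundError

def parse(v):
    # Turn "0.1.17" into (0, 1, 17) so versions compare numerically;
    # simple x.y.z strings only.
    return tuple(int(part) for part in v.split("."))

def xtuner_is_fixed(min_version="0.1.17"):
    # True if an xtuner at or above the fixed version is installed.
    try:
        installed = version("xtuner")
    except PackageNotFoundError:
        return False
    return parse(installed) >= parse(min_version)
```

For anything beyond simple versions, `packaging.version.parse` is the robust choice.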
OK, thanks. I also want to ask whether xtuner can use QLoRA on Windows. Since bitsandbytes does not support Windows, a plain pip install bitsandbytes installs a version without GPU support; I spent a whole afternoon on this without solving it.
@ILG2021 Yes, the official bitsandbytes does not support Windows. You can follow this issue: https://github.com/TimDettmers/bitsandbytes/issues/30
Someone there solved the problem with this repo, though we have not verified it: https://github.com/jllllll/bitsandbytes-windows-webui
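Before trying an unofficial wheel, it helps to know what the current environment actually has. A small diagnostic sketch (the function name is made up for illustration; it only probes importability and CUDA visibility, it does not prove the bitsandbytes GPU kernels themselves load):

```python
import importlib.util

def check_bnb():
    # Report whether bitsandbytes is importable and whether a CUDA
    # device is visible to torch. On Windows the import can succeed
    # while GPU kernels are still missing, so treat this as a first
    # screen, not a full verification.
    if importlib.util.find_spec("bitsandbytes") is None:
        return "bitsandbytes not installed"
    try:
        import torch
        if not torch.cuda.is_available():
            return "installed, but no CUDA device visible to torch"
    except ImportError:
        return "installed; torch missing, cannot probe CUDA"
    return "installed with a visible CUDA device"
```

Newer bitsandbytes releases also ship their own diagnostic (`python -m bitsandbytes`), which prints a fuller report when available.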
I installed the environment following the steps:
conda create --name xtuner_lxy python=3.10 -y
pip install -U xtuner
The xtuner command then works normally for checking the dataset:
xtuner check-custom-dataset /home/xtuner/internlm2_7b_qlora_alpaca_e3_copy.py
But fine-tuning fails:
xtuner train internlm2_7b_qlora_alpaca_e3_copy.py --deepspeed deepspeed_zero2
[2024-04-02 11:50:21,749] [INFO] [real_accelerator.py:191:get_accelerator] Setting ds_accelerator to cuda (auto detect)
04/02 11:50:24 - mmengine - WARNING - WARNING: command error: 'No module named 'xtuner.parallel''!
04/02 11:50:24 - mmengine - WARNING - Arguments received: ['xtuner', 'train', 'internlm2_7b_qlora_alpaca_e3_copy.py', '--deepspeed', 'deepspeed_zero2']. xtuner commands use the following syntax:
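The "No module named 'xtuner.parallel'" warning suggests the installed xtuner is older than the code path the command expects (assuming, as in the earlier replies here, a version mismatch rather than a broken install). One way to confirm is to check which expected submodules the installed package is actually missing; a sketch with a hypothetical helper name:

```python
import importlib.util

def missing_submodules(pkg="xtuner", subs=("parallel",)):
    # Return the submodules of `pkg` that the current install lacks.
    # An empty list means the install provides everything listed;
    # a non-empty list points at a stale or mismatched version.
    if importlib.util.find_spec(pkg) is None:
        return list(subs)  # the package itself is absent
    return [s for s in subs if importlib.util.find_spec(f"{pkg}.{s}") is None]
```

If `missing_submodules()` reports `parallel` missing, upgrading with pip install 'xtuner>=0.1.17' (as suggested above) should resolve it.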