InternLM / xtuner

An efficient, flexible and full-featured toolkit for fine-tuning LLM (InternLM2, Llama3, Phi3, Qwen, Mistral, ...)
https://xtuner.readthedocs.io/zh-cn/latest/
Apache License 2.0

No module named 'xtuner.parallel' #541

Closed. 828Tina closed this issue 5 months ago.

828Tina commented 6 months ago

I installed the environment following the steps:

conda create --name xtuner_lxy python=3.10 -y
pip install -U xtuner

The xtuner command works normally for checking the dataset:

xtuner check-custom-dataset /home/xtuner/internlm2_7b_qlora_alpaca_e3_copy.py


But fine-tuning fails with an error: xtuner train internlm2_7b_qlora_alpaca_e3_copy.py --deepspeed deepspeed_zero2

[2024-04-02 11:50:21,749] [INFO] [real_accelerator.py:191:get_accelerator] Setting ds_accelerator to cuda (auto detect)
04/02 11:50:24 - mmengine - WARNING - WARNING: command error: 'No module named 'xtuner.parallel''!
04/02 11:50:24 - mmengine - WARNING - Arguments received: ['xtuner', 'train', 'internlm2_7b_qlora_alpaca_e3_copy.py', '--deepspeed', 'deepspeed_zero2']. xtuner commands use the following syntax:

      xtuner MODE MODE_ARGS ARGS

      Where   MODE (required) is one of ('list-cfg', 'copy-cfg', 'log-dataset', 'check-custom-dataset', 'train', 'test', 'chat', 'convert', 'preprocess', 'mmbench', 'eval_refcoco')
              MODE_ARG (optional) is the argument for specific mode
              ARGS (optional) are the arguments for specific command

  Some usages for xtuner commands: (See more by using -h for specific command!)
LZHgrla commented 6 months ago

Hi @828Tina! So sorry about that. This is a known issue with xtuner v0.1.16, and we are working on it!

Until the bug is resolved, you can downgrade xtuner to 0.1.15, or install xtuner from source.

# Approach 1: downgrade
pip install xtuner==0.1.15
# Approach 2: install from source
git clone https://github.com/InternLM/xtuner.git
cd xtuner
pip install -e .
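Either way, you can confirm the missing module is importable before rerunning training. A minimal sketch (the helper name is mine, not part of xtuner):

```python
import importlib.util

def module_available(name: str) -> bool:
    """True if `name` (including its parent packages) can be imported."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # Raised when a parent package (e.g. `xtuner` itself) is absent
        return False

# After reinstalling, this should report True for "xtuner.parallel"
print(module_available("xtuner.parallel"))
```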
ILG2021 commented 6 months ago

0.1.15 has this problem too.

LZHgrla commented 6 months ago

@ILG2021 Are you using a config from 0.1.16? If so, you need to re-copy the config under 0.1.15.

LZHgrla commented 6 months ago

@ILG2021 @828Tina Installing xtuner>=0.1.17 solves this issue.

pip install 'xtuner>=0.1.17'
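If you want to guard against the broken release in your own launch scripts, the ">=0.1.17" gate is just a tuple comparison. A sketch (function names are hypothetical, and it assumes plain x.y.z version strings):

```python
def parse_version(v: str) -> tuple[int, ...]:
    """Split 'x.y.z' into an integer tuple for lexicographic comparison."""
    return tuple(int(part) for part in v.split("."))

def has_parallel_fix(installed: str, fixed_in: str = "0.1.17") -> bool:
    """True if `installed` is at or above the release that restores xtuner.parallel."""
    return parse_version(installed) >= parse_version(fixed_in)
```

For example, `has_parallel_fix("0.1.16")` is False while `has_parallel_fix("0.2.0")` is True.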
ILG2021 commented 6 months ago

OK, thanks. I'd also like to ask whether xtuner can use QLoRA on Windows. bitsandbytes doesn't support Windows, and a plain pip install bitsandbytes installs a build without GPU support; I spent a whole afternoon on this without solving it.

LZHgrla commented 6 months ago

@ILG2021 Yes, the official bitsandbytes does not support Windows. You can follow this issue: https://github.com/TimDettmers/bitsandbytes/issues/30

Someone there solved the problem with this repo, though we have not verified it ourselves: https://github.com/jllllll/bitsandbytes-windows-webui
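As a quick way to tell whether whatever bitsandbytes build you end up with can actually see the GPU, a defensive check like the following may help. This is only a sketch: it tests importability plus torch's view of CUDA, not bitsandbytes' own CUDA binaries.

```python
def bnb_gpu_ready() -> bool:
    """Best-effort check: bitsandbytes imports and torch reports a CUDA device."""
    try:
        import bitsandbytes  # noqa: F401 -- fails on unsupported builds
        import torch
        return torch.cuda.is_available()
    except Exception:
        # Covers missing packages and broken Windows builds that fail on import
        return False

print(bnb_gpu_ready())
```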