THUDM / ChatGLM2-6B

ChatGLM2-6B: An Open Bilingual Chat LLM

[BUG/Help] Windows 10 fine-tuning fails: ValueError: None is not in list #646

Open wangyingdong opened 11 months ago

wangyingdong commented 11 months ago

Is there an existing issue for this?

  • [x] I have searched the existing issues

Current Behavior

[Screenshots of the error: 微信截图_20231218141909, 微信截图_20231218141846]

Running train.bat fails with the error "ValueError: None is not in list".
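For context, Python raises this exact message whenever `list.index` is asked to find `None`. A minimal sketch of the failure mode (illustrative names only, not lines quoted from this repo's main.py): if the preprocessing code looks up a special-token id that the tokenizer does not define, that id is `None` and the search fails.

input_ids = [64790, 64792, 30910]  # example token ids only, not real output
bos_token_id = None                # a tokenizer with no BOS token reports None

# Searching a list for None reproduces the reported error verbatim:
context_length = input_ids.index(bos_token_id)
# ValueError: None is not in list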

Expected Behavior

No response

Steps To Reproduce

Run train.bat

Environment

- OS: Windows 10
- Python: 3.10.9
- Transformers: 4.27.1
- PyTorch: 1.13.1+cu116
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`) :

Anything else?

The train.bat script is as follows:

set PRE_SEQ_LEN=128
set LR=2e-2
set NUM_GPUS=1

D:\Miniconda\envs\chat_base_py310s\python.exe main.py ^
    --do_train ^
    --train_file AdvertiseGen\train.json ^
    --validation_file AdvertiseGen\dev.json ^
    --preprocessing_num_workers 10 ^
    --prompt_column content ^
    --response_column summary ^
    --overwrite_cache ^
    --model_name_or_path D:\Dev\Huggingface\THUDM\chatglm2-6b ^
    --output_dir output/adgen-chatglm2-6b-pt-%PRE_SEQ_LEN%-%LR% ^
    --overwrite_output_dir ^
    --max_source_length 64 ^
    --max_target_length 128 ^
    --per_device_train_batch_size 1 ^
    --per_device_eval_batch_size 1 ^
    --gradient_accumulation_steps 16 ^
    --predict_with_generate ^
    --max_steps 128 ^
    --logging_steps 4 ^
    --save_steps 128 ^
    --learning_rate %LR% ^
    --pre_seq_len %PRE_SEQ_LEN%
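
A quick diagnostic worth trying (a sketch under the assumption that a missing special-token id is the trigger, not a confirmed fix): load the tokenizer from the local checkpoint path used above and print its special-token ids. If any of them comes back as None, a preprocessing step that searches input_ids for that id would fail with exactly this ValueError; in that case, check that the files in D:\Dev\Huggingface\THUDM\chatglm2-6b are up to date and match the ptuning code from the ChatGLM2-6B repo.

from transformers import AutoTokenizer

# Checkpoint path taken from the train.bat above; trust_remote_code is
# needed to load ChatGLM's custom tokenizer class.
tokenizer = AutoTokenizer.from_pretrained(
    r"D:\Dev\Huggingface\THUDM\chatglm2-6b", trust_remote_code=True
)
print("bos_token_id:", tokenizer.bos_token_id)
print("eos_token_id:", tokenizer.eos_token_id)
print("pad_token_id:", tokenizer.pad_token_id)
# If any id prints as None, any code that calls input_ids.index(<that id>)
# fails with "ValueError: None is not in list".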
fyf2016 commented 10 months ago

Hey, did you ever solve this? What turned out to be the problem?

Siqi-c commented 10 months ago

Hello, has your problem been solved? I've run into the same issue.