THUDM / ChatGLM-6B

ChatGLM-6B: An Open Bilingual Dialogue Language Model | 开源双语对话语言模型
Apache License 2.0

[Help] No new content learned after fine-tuning #1412

Open SSQiana opened 11 months ago

SSQiana commented 11 months ago

Is there an existing issue for this?

Current Behavior

I fine-tuned on a dataset I created containing 1,300 examples. The resulting model can chat normally, but it has learned almost nothing new and can only answer the original general-knowledge questions. To avoid catastrophic forgetting, I changed the learning rate to 1e-4. Parameter settings:

```bash
PRE_SEQ_LEN=128
LR=1e-4

CUDA_VISIBLE_DEVICES=0 python main.py \
    --do_train \
    --train_file AdvertiseGen/train.json \
    --validation_file AdvertiseGen/dev.json \
    --prompt_column content \
    --response_column summary \
    --overwrite_cache \
    --model_name_or_path THUDM/chatglm-6b \
    --output_dir output/adgen-chatglm-6b-pt-$PRE_SEQ_LEN-$LR \
    --overwrite_output_dir \
    --max_source_length 700 \
    --max_target_length 64 \
    --per_device_train_batch_size 1 \
    --per_device_eval_batch_size 1 \
    --gradient_accumulation_steps 16 \
    --predict_with_generate \
    --max_steps 3000 \
    --logging_steps 10 \
    --save_steps 500 \
    --learning_rate $LR \
    --pre_seq_len $PRE_SEQ_LEN
```
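For reference, with a per-device batch size of 1 and gradient accumulation of 16, the effective batch size is 16, so 3000 steps amount to roughly 3000 * 16 / 1300, about 37 passes over a 1,300-example dataset; a P-Tuning v2 run of that length would normally overfit the new data rather than ignore it. A frequent cause of this symptom is instead running inference with the base model alone, without loading the trained prefix encoder. Below is a minimal sketch following the loading pattern in this repository's ptuning README; `CHECKPOINT_PATH` is an assumed local path to one of the checkpoints saved under the `--output_dir` above.

```python
import os
import torch
from transformers import AutoConfig, AutoModel, AutoTokenizer

# Assumed path; adjust to whichever checkpoint you want to evaluate.
CHECKPOINT_PATH = "output/adgen-chatglm-6b-pt-128-1e-4/checkpoint-3000"

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)

# pre_seq_len must match the PRE_SEQ_LEN used during training (128 here).
config = AutoConfig.from_pretrained(
    "THUDM/chatglm-6b", trust_remote_code=True, pre_seq_len=128
)
model = AutoModel.from_pretrained(
    "THUDM/chatglm-6b", config=config, trust_remote_code=True
)

# The P-Tuning checkpoint holds the prefix-encoder weights; extract and load them.
prefix_state_dict = torch.load(os.path.join(CHECKPOINT_PATH, "pytorch_model.bin"))
new_prefix_state_dict = {}
for k, v in prefix_state_dict.items():
    if k.startswith("transformer.prefix_encoder."):
        new_prefix_state_dict[k[len("transformer.prefix_encoder."):]] = v
model.transformer.prefix_encoder.load_state_dict(new_prefix_state_dict)

# Half-precision inference, keeping the prefix encoder in fp32 as the README does.
model = model.half().cuda()
model.transformer.prefix_encoder.float()
model = model.eval()

response, history = model.chat(tokenizer, "你好", history=[])
print(response)
```

The key detail is that the P-Tuning checkpoint contains only the prefix-encoder weights; the base ChatGLM-6B weights still come from `THUDM/chatglm-6b`, so skipping this load step silently evaluates the unmodified base model.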

Expected Behavior

Steps To Reproduce

Environment

- OS:
- Python:
- Transformers:

Anything else?

CaoYongshengcys commented 10 months ago

Same here: after fine-tuning, the model didn't seem to learn much either.