yuanzhoulvpi2017 / zero_nlp

Chinese NLP solutions (large models, data, models, training, inference)
MIT License

Help!! How do I set the epoch count for ChatGlm-v2-6b_Lora?? #160

Open fengzehui0422 opened 9 months ago

fengzehui0422 commented 9 months ago

```bash
export CUDA_VISIBLE_DEVICES=0
python main.py \
    --do_train \
    --train_file D:/LLM/yuanzhoulvpi/AdvertiseGen/train.json \
    --validation_file D:/LLM/yuanzhoulvpi/AdvertiseGen/dev.json \
    --preprocessing_num_workers 10 \
    --prompt_column content \
    --response_column summary \
    --overwrite_cache \
    --model_name_or_path chatglm2-6b_model \
    --output_dir output/adgen-chatglm2-6b-lora_version \
    --overwrite_output_dir \
    --max_source_length 64 \
    --max_target_length 128 \
    --per_device_train_batch_size 1 \
    --per_device_eval_batch_size 1 \
    --gradient_accumulation_steps 16 \
    --predict_with_generate \
    --max_steps 3000 \
    --logging_steps 10 \
    --save_steps 100 \
    --learning_rate 2e-5 \
    --lora_r 32 \
    --model_parallel_mode True
```

I don't see an epoch count anywhere in this command. Does that mean LoRA can only be trained for a single pass?
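Note on the epoch question: the flags here (`--max_steps`, `--save_steps`, `--learning_rate`, `--gradient_accumulation_steps`) suggest that `main.py` forwards its arguments to HuggingFace `Seq2SeqTrainingArguments`. In the HuggingFace `Trainer`, a positive `--max_steps` takes precedence over `--num_train_epochs`, so the command above trains for exactly 3000 optimizer steps, however many epochs that happens to cover. Assuming `main.py` behaves this way (unverified against this repo's script), you could train by epochs by replacing `--max_steps` with `--num_train_epochs`; a minimal sketch:

```bash
# Sketch: train by epoch count instead of a fixed step count.
# Assumption: main.py parses these flags into HuggingFace
# Seq2SeqTrainingArguments, where a positive --max_steps would
# override --num_train_epochs; dropping --max_steps lets the
# epoch setting take effect. The epoch value 3 is illustrative.
export CUDA_VISIBLE_DEVICES=0
python main.py \
    --do_train \
    --train_file D:/LLM/yuanzhoulvpi/AdvertiseGen/train.json \
    --validation_file D:/LLM/yuanzhoulvpi/AdvertiseGen/dev.json \
    --model_name_or_path chatglm2-6b_model \
    --output_dir output/adgen-chatglm2-6b-lora_version \
    --per_device_train_batch_size 1 \
    --gradient_accumulation_steps 16 \
    --num_train_epochs 3 \
    --logging_steps 10 \
    --save_steps 100 \
    --learning_rate 2e-5 \
    --lora_r 32
# (other flags from the original command unchanged; the key change is
#  replacing --max_steps 3000 with --num_train_epochs 3)
```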