ymcui / Chinese-LLaMA-Alpaca-2

Chinese LLaMA-2 & Alpaca-2 large model project, phase 2, plus 64K ultra-long-context models (Chinese LLaMA-2 & Alpaca-2 LLMs with 64K long context models)
Apache License 2.0

Question: how do I convert the .pth checkpoint to .bin after SFT? Any suggestions? #495

Closed smallBy closed 5 months ago

smallBy commented 6 months ago

Items that must be checked before submitting

Issue type

None

Base model

None

Operating system

None

Describe the problem in detail

# Paste the code you ran here (inside this code block)

Dependencies (required for code-related issues)

# Paste your dependency list here (inside this code block)

Run logs or screenshots

# Paste your run logs here (inside this code block)
smallBy commented 6 months ago

The training command is as follows (note: the original command passed `--save_safetensors False` twice; the duplicate is dropped here):

torchrun --nnodes 1 --nproc_per_node 8 run_clm_sft_with_peft.py \
    --deepspeed ds_zero2_no_offload.json \
    --model_name_or_path chinese-alpaca-2-13b/ \
    --tokenizer_name_or_path scripts/tokenizer/ \
    --dataset_dir data/ \
    --per_device_train_batch_size 1 \
    --per_device_eval_batch_size 1 \
    --do_train \
    --seed $RANDOM \
    --fp16 \
    --num_train_epochs 2 \
    --lr_scheduler_type cosine \
    --learning_rate 1e-4 \
    --warmup_ratio 0.03 \
    --weight_decay 0 \
    --logging_strategy steps \
    --logging_steps 10 \
    --save_strategy steps \
    --save_total_limit 3 \
    --evaluation_strategy steps \
    --eval_steps 250 \
    --save_steps 500 \
    --gradient_accumulation_steps 1 \
    --preprocessing_num_workers 8 \
    --max_seq_length 512 \
    --output_dir output/ \
    --overwrite_output_dir \
    --ddp_timeout 30000 \
    --logging_first_step True \
    --lora_rank 64 \
    --lora_alpha 128 \
    --trainable "q_proj,v_proj,k_proj,o_proj,gate_proj,down_proj,up_proj" \
    --modules_to_save "embed_tokens,lm_head" \
    --lora_dropout 0.05 \
    --torch_dtype float16 \
    --save_safetensors False \
    --validation_file data/validate.json \
    --load_in_kbits 16 \
    --gradient_checkpointing \
    --ddp_find_unused_parameters False

smallBy commented 6 months ago

System: Ubuntu 20.04; GPUs: P100

ymcui commented 6 months ago

https://github.com/huggingface/transformers/blob/main/src/transformers/models/llama/convert_llama_weights_to_hf.py
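The linked `convert_llama_weights_to_hf.py` script converts Meta-style consolidated `.pth` LLaMA checkpoints into the Hugging Face format (`pytorch_model*.bin` plus config files). A minimal invocation sketch; the paths below are hypothetical placeholders, not from this thread:

```shell
# Convert consolidated .pth LLaMA weights to Hugging Face format.
# Paths are placeholders; point --input_dir at the directory that
# contains the .pth shards and params.json.
python convert_llama_weights_to_hf.py \
    --input_dir /path/to/pth_checkpoints \
    --model_size 13B \
    --output_dir /path/to/hf_model
```

Note that this script targets original Meta checkpoints. Since `run_clm_sft_with_peft.py` produces a LoRA adapter rather than full `.pth` weights, the usual route is instead to merge the adapter into the base Hugging Face model (e.g. with this repo's merge script under `scripts/`), which also writes standard `.bin` weight files.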

github-actions[bot] commented 6 months ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your consideration.

github-actions[bot] commented 5 months ago

Closing the issue, since no updates have been observed. Feel free to re-open if you need any further assistance.