ymcui / Chinese-LLaMA-Alpaca

Chinese LLaMA & Alpaca large language models + local CPU/GPU training and deployment (Chinese LLaMA & Alpaca LLMs)
https://github.com/ymcui/Chinese-LLaMA-Alpaca/wiki
Apache License 2.0

Merging the LoRA model with the original LLaMA model: some files, such as the config file, are missing from the merged output folder #871

Closed · Luka0770 closed this issue 6 months ago

Luka0770 commented 7 months ago

Checklist before submitting

Issue type

Model conversion and merging

Base model

LLaMA-7B

Operating system

Linux

Detailed description of the problem

# Please paste the code you ran here (delete this block if not applicable)

Dependencies (required for code-related issues)

# Please paste your dependency information here

Run logs or screenshots

Base model: /model/LLaMA_7B_merge_hf
LoRA model(s) ['/model/chinese_llama_plus_lora_7b', '/model/chinese_alpaca_plus_lora_7b']:
Loading checkpoint shards: 100%|██████████| 2/2 [00:08<00:00, 4.16s/it]
Peft version: 0.2.0
Loading LoRA for 7B model
Loading LoRA /model/chinese_llama_plus_lora_7b...
base_model vocab size: 32000
tokenizer vocab size: 49953
Extended vocabulary size to 49953
Loading LoRA weights
merging base_model.model.model.embed_tokens.weight
merging base_model.model.lm_head.weight
merging base_model.model.model.layers.0.self_attn.q_proj.lora_A.weight
merging base_model.model.model.layers.0.self_attn.k_proj.lora_A.weight
merging base_model.model.model.layers.0.self_attn.v_proj.lora_A.weight
merging base_model.model.model.layers.0.self_attn.o_proj.lora_A.weight
merging base_model.model.model.layers.0.mlp.gate_proj.lora_A.weight
merging base_model.model.model.layers.0.mlp.down_proj.lora_A.weight
merging base_model.model.model.layers.0.mlp.up_proj.lora_A.weight
[... the same seven "merging" lines repeat for layers 1 through 31 ...]
Loading LoRA /model/chinese_alpaca_plus_lora_7b...
base_model vocab size: 49953
tokenizer vocab size: 49954
Extended vocabulary size to 49954
Loading LoRA weights
merging base_model.model.model.embed_tokens.weight
merging base_model.model.lm_head.weight
[... the same seven "merging" lines repeat for layers 0 through 31 ...]
Saving to pth format...
Saving shard 1 of 1 into /model/7B_full_model/consolidated.00.pth
c77bd5c4a523c86ad5dcf5f4517cb93
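For reference, the log above is consistent with running the project's merge script with the pth output type. Below is a minimal sketch of such an invocation; the script name and flag names are assumed from the project's scripts/wiki and may differ between releases, and the paths are simply the ones shown in the log.

```bash
# Sketch only: script name and flags are assumed from the project's
# merge_llama_with_chinese_lora.py; verify against the version in your checkout.
python scripts/merge_llama_with_chinese_lora.py \
    --base_model /model/LLaMA_7B_merge_hf \
    --lora_model /model/chinese_llama_plus_lora_7b,/model/chinese_alpaca_plus_lora_7b \
    --output_type pth \
    --output_dir /model/7B_full_model

# With --output_type pth the result is a llama.cpp-style checkpoint folder, typically:
#   consolidated.00.pth   merged weights (as seen at the end of the log)
#   params.json           the configuration file for this format
#   tokenizer.model       the extended Chinese tokenizer
# No config.json is expected here; that file belongs to the HuggingFace format.
```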

ymcui commented 7 months ago

I don't quite follow. Which file are you saying is missing? config.json? params.json is the configuration file. What you converted isn't in the HuggingFace format anyway.
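In other words, params.json plays the role of the config file for the pth output, and a config.json only appears when the merge is saved in the HuggingFace format. If the goal is to load the merged model with transformers (which does need config.json), re-running the merge with the HuggingFace output type should produce it. A hedged sketch, under the same assumptions about script and flag names as in the example above:

```bash
# Sketch only: same assumptions about script name and flags as the earlier example.
python scripts/merge_llama_with_chinese_lora.py \
    --base_model /model/LLaMA_7B_merge_hf \
    --lora_model /model/chinese_llama_plus_lora_7b,/model/chinese_alpaca_plus_lora_7b \
    --output_type huggingface \
    --output_dir /model/7B_full_model_hf   # hypothetical output path

# The HuggingFace-format folder is the one that contains config.json,
# pytorch_model-*.bin weight shards, and the tokenizer files that transformers expects.
```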

github-actions[bot] commented 7 months ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your consideration.

github-actions[bot] commented 6 months ago

Closing the issue, since no updates observed. Feel free to re-open if you need any further assistance.