Open dongfangduoshou123 opened 3 weeks ago
Following the instructions, vLLM throws an error on startup and the server fails to come up.
You can try adding `"architectures": ["TeleChatForCausalLM"]` to config.json, and changing telechat.py line 97 to `self.total_num_kv_heads = config.num_attention_heads`.
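As a sketch of the suggested fix (field names other than `architectures` are placeholders from a typical HF-style config.json, not taken from the actual TeleChat release), the config change would look like:

```json
{
  "architectures": ["TeleChatForCausalLM"],
  "num_attention_heads": 32
}
```

and the corresponding change around telechat.py line 97 would set the KV-head count equal to the attention-head count:

```python
# Before (hypothetical original line at telechat.py L97):
# self.total_num_kv_heads = config.num_key_value_heads
# After, per the suggestion above:
self.total_num_kv_heads = config.num_attention_heads
```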