Tele-AI / TeleChat2

TeleChat2 is a large language model developed and trained by the China Telecom Artificial Intelligence Research Institute; it is the first open-sourced hundred-billion-parameter model trained entirely on domestic (Chinese) compute.

vllm 0.6.4.post1 support #15

Open To0nyZ opened 1 week ago

To0nyZ commented 1 week ago

I added telechat.py under vllm/model_executor/models/ and updated it for the now-removed is_hip helper.

(Related vLLM change: https://github.com/vllm-project/vllm/commit/4e2d95e372ad5fbef7b27c66d527c37477c0c8bb#diff-e3f867c9588601222d74d39110d8e48cc43d5f6107436150faf1749a3d091419R211)

I also registered the architecture in vllm/model_executor/models/registry.py: `"TeleChatForCausalLM": ("telechat", "TeleChatForCausalLM"),  # telechat`
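For context, the registry entry above maps a HuggingFace `architectures` name to a `(module, class)` pair that vLLM imports lazily. A minimal standalone sketch of that lookup pattern (this dict and `resolve` are illustrative stand-ins, not vLLM's actual registry object):

```python
# Sketch of how a registry entry like the one above is resolved.
# The names mirror the TeleChat entry; the machinery here is simplified.
_MODELS = {
    "TeleChatForCausalLM": ("telechat", "TeleChatForCausalLM"),
}

def resolve(architecture: str) -> tuple[str, str]:
    """Map a HF `architectures` entry to a (module, class) pair."""
    if architecture not in _MODELS:
        raise ValueError(f"unsupported architecture: {architecture}")
    module_name, class_name = _MODELS[architecture]
    # vLLM would then import vllm.model_executor.models.<module_name>
    # and load <class_name> from it.
    return module_name, class_name

print(resolve("TeleChatForCausalLM"))  # ('telechat', 'TeleChatForCausalLM')
```

If the architecture string in the model's config.json does not match a registry key exactly, vLLM treats the model as unsupported, which is one common cause of the failure reported below.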

It still fails with the following error:

```
Traceback (most recent call last):
  File "/root/miniconda3/envs/textgen/lib/python3.11/site-packages/vllm/entrypoints/openai/api_server.py", line 197, in build_async_engine_client_from_engine_args
    engine_config = engine_args.create_engine_config()
  File "/root/miniconda3/envs/textgen/lib/python3.11/site-packages/vllm/engine/arg_utils.py", line 959, in create_engine_config
    model_config = self.create_model_config()
  File "/root/miniconda3/envs/textgen/lib/python3.11/site-packages/vllm/engine/arg_utils.py", line 891, in create_model_config
    return ModelConfig(
  File "/root/miniconda3/envs/textgen/lib/python3.11/site-packages/vllm/config.py", line 264, in __init__
    supported_tasks, task = self._resolve_task(task, self.hf_config)
  File "/root/miniconda3/envs/textgen/lib/python3.11/site-packages/vllm/config.py", line 347, in _resolve_task
    selected_task = next(iter(supported_tasks_lst))
StopIteration

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/root/miniconda3/envs/textgen/lib/python3.11/site-packages/vllm/entrypoints/openai/api_server.py", line 643, in <module>
    uvloop.run(run_server(args))
  File "/root/miniconda3/envs/textgen/lib/python3.11/site-packages/uvloop/__init__.py", line 105, in run
    return runner.run(wrapper())
  File "/root/miniconda3/envs/textgen/lib/python3.11/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
  File "uvloop/loop.pyx", line 1517, in uvloop.loop.Loop.run_until_complete
  File "/root/miniconda3/envs/textgen/lib/python3.11/site-packages/uvloop/__init__.py", line 61, in wrapper
    return await main
  File "/root/miniconda3/envs/textgen/lib/python3.11/site-packages/vllm/entrypoints/openai/api_server.py", line 609, in run_server
    async with build_async_engine_client(args) as engine_client:
  File "/root/miniconda3/envs/textgen/lib/python3.11/contextlib.py", line 210, in __aenter__
    return await anext(self.gen)
  File "/root/miniconda3/envs/textgen/lib/python3.11/site-packages/vllm/entrypoints/openai/api_server.py", line 113, in build_async_engine_client
    async with build_async_engine_client_from_engine_args(
  File "/root/miniconda3/envs/textgen/lib/python3.11/contextlib.py", line 210, in __aenter__
    return await anext(self.gen)
RuntimeError: async generator raised StopIteration
ERROR 11-21 11:05:16 engine.py:366] Traceback (most recent call last):
  File "/root/miniconda3/envs/textgen/lib/python3.11/site-packages/vllm/engine/multiprocessing/engine.py", line 357, in run_mp_engine
    engine = MQLLMEngine.from_engine_args(engine_args=engine_args,
  File "/root/miniconda3/envs/textgen/lib/python3.11/site-packages/vllm/engine/multiprocessing/engine.py", line 114, in from_engine_args
    engine_config = engine_args.create_engine_config()
  File "/root/miniconda3/envs/textgen/lib/python3.11/site-packages/vllm/engine/arg_utils.py", line 959, in create_engine_config
    model_config = self.create_model_config()
  File "/root/miniconda3/envs/textgen/lib/python3.11/site-packages/vllm/engine/arg_utils.py", line 891, in create_model_config
    return ModelConfig(
  File "/root/miniconda3/envs/textgen/lib/python3.11/site-packages/vllm/config.py", line 264, in __init__
    supported_tasks, task = self._resolve_task(task, self.hf_config)
  File "/root/miniconda3/envs/textgen/lib/python3.11/site-packages/vllm/config.py", line 347, in _resolve_task
    selected_task = next(iter(supported_tasks_lst))
StopIteration
Process SpawnProcess-1:
Traceback (most recent call last):
  File "/root/miniconda3/envs/textgen/lib/python3.11/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/root/miniconda3/envs/textgen/lib/python3.11/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/root/miniconda3/envs/textgen/lib/python3.11/site-packages/vllm/engine/multiprocessing/engine.py", line 368, in run_mp_engine
    raise e
  File "/root/miniconda3/envs/textgen/lib/python3.11/site-packages/vllm/engine/multiprocessing/engine.py", line 357, in run_mp_engine
    engine = MQLLMEngine.from_engine_args(engine_args=engine_args,
  File "/root/miniconda3/envs/textgen/lib/python3.11/site-packages/vllm/engine/multiprocessing/engine.py", line 114, in from_engine_args
    engine_config = engine_args.create_engine_config()
  File "/root/miniconda3/envs/textgen/lib/python3.11/site-packages/vllm/engine/arg_utils.py", line 959, in create_engine_config
    model_config = self.create_model_config()
  File "/root/miniconda3/envs/textgen/lib/python3.11/site-packages/vllm/engine/arg_utils.py", line 891, in create_model_config
    return ModelConfig(
  File "/root/miniconda3/envs/textgen/lib/python3.11/site-packages/vllm/config.py", line 264, in __init__
    supported_tasks, task = self._resolve_task(task, self.hf_config)
  File "/root/miniconda3/envs/textgen/lib/python3.11/site-packages/vllm/config.py", line 347, in _resolve_task
    selected_task = next(iter(supported_tasks_lst))
StopIteration
```
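The root failure is `next(iter(supported_tasks_lst))` on an empty list: when vLLM cannot resolve any supported task for the model's hf_config (typically because the architecture wasn't recognized by the registry), `next()` raises `StopIteration` with no fallback. The confusing outer `RuntimeError: async generator raised StopIteration` is then PEP 479 behavior, as this small self-contained demonstration shows:

```python
# A StopIteration that escapes inside a generator frame is converted
# to RuntimeError (PEP 479) -- this is why an empty "supported tasks"
# list surfaces as "async generator raised StopIteration".
try:
    next(iter([]))            # empty "supported tasks" list
except StopIteration:
    print("StopIteration: no supported task resolved")

def broken_gen():
    next(iter([]))            # StopIteration leaks into the generator
    yield

try:
    list(broken_gen())
except RuntimeError as exc:
    print(type(exc).__name__)  # RuntimeError
```

So the traceback points at a symptom; the thing to check is why the model's architecture did not resolve to any supported task in the patched install.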

shunxing12345 commented 4 days ago

vLLM now supports TeleChat2. You can pull the latest vLLM code from the official repository, install it, and run TeleChat2 with it directly.
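A minimal sketch of that suggestion, assuming upstream support has landed in a release (the model id `Tele-AI/TeleChat2-7B` is an assumption; check the model card for the exact repository name):

```shell
# Upgrade to the latest released vLLM (or build from the main branch).
pip install -U vllm

# Serve TeleChat2 through the OpenAI-compatible server; --trust-remote-code
# is typically needed because the model ships custom modeling code.
vllm serve Tele-AI/TeleChat2-7B --trust-remote-code
```

This avoids patching files inside site-packages, so the workaround in the original report is no longer needed.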