Closed WangJianQ-cmd closed 2 months ago
Traceback (most recent call last):
File "/home/thniu/Chart2Code/src/chart_synthesis/LLM_synthesis/MM_Text_LLM/infer.py", line 8, in
I'm hitting this problem too.
Same problem here, on version 0.5.0.
Running the 8B and 2B models gives the same error; I tried lmdeploy 0.4.2 as well and it didn't work either.
+1
guys, rename your model so the name contains Mini-InternVL-Chat-2B-V1-5... then it worked for me, on lmdeploy v0.5.0.
Thanks. We will release the official code and weights for lmdeploy.
Have you solved this problem? I also get this error when deploying InternVL2-8B.
Rename your model so the name contains Mini-InternVL-Chat-2B-V1-5... then it worked for me, on lmdeploy v0.5.0.
Not solved yet.
The method mentioned above works.
guys, rename your model so the name contains Mini-InternVL-Chat-2B-V1-5... then it worked for me, on lmdeploy v0.5.0.
Is the model output normal this way? I suspect the rename causes configuration mismatches in the model, leading to abnormal output: my model produced token IDs that are not in the vocabulary.
Is there a solution for this?
The code has been updated; please pull the latest code, and make sure your environment matches the requirements exactly.
I get the same error with MiniCPM v2.5.
Upgrading lmdeploy to 0.5.1 or later makes it run normally.
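For anyone landing here, upgrading in place should be enough (assuming a pip-managed environment; the second command just confirms which version ended up installed):

```shell
pip install -U "lmdeploy>=0.5.1"
pip show lmdeploy | grep Version
```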
Got it. Thanks.
Closed #336 as completed.
The detailed error is as follows:

ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 411, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
    return await self.app(scope, receive, send)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/middleware/cors.py", line 85, in __call__
    await self.app(scope, receive, send)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/routing.py", line 297, in handle
    await self.app(scope, receive, send)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/routing.py", line 77, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/routing.py", line 72, in app
    response = await func(request)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/fastapi/routing.py", line 278, in app
    raw_response = await run_endpoint_function(
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
    return await dependant.call(**values)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/lmdeploy/serve/openai/api_server.py", line 495, in chat_completions_v1
    async for res in result_generator:
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/lmdeploy/serve/async_engine.py", line 563, in generate
    prompt_input = await self._get_prompt_input(prompt, do_preprocess,
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/lmdeploy/serve/vl_async_engine.py", line 54, in _get_prompt_input
    segs = decorated.split(IMAGE_TOKEN)
AttributeError: 'NoneType' object has no attribute 'split'
My lmdeploy version is 0.5.0; with version 0.4.2 there is no problem.