OpenGVLab / InternVL

[CVPR 2024 Oral] InternVL Family: A Pioneering Open-Source Alternative to GPT-4o. An open-source multimodal dialogue model approaching GPT-4o performance.
https://internvl.readthedocs.io/en/latest/
MIT License
5.67k stars 440 forks

Deploying InternVL2-26B with lmdeploy 0.5.0 raises "'NoneType' object has no attribute 'split'" #336

Closed WangJianQ-cmd closed 2 months ago

WangJianQ-cmd commented 3 months ago

The detailed error is as follows:

```
ERROR: Exception in ASGI application
Traceback (most recent call last):
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 411, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 69, in __call__
    return await self.app(scope, receive, send)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/middleware/cors.py", line 85, in __call__
    await self.app(scope, receive, send)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/routing.py", line 297, in handle
    await self.app(scope, receive, send)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/routing.py", line 77, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/starlette/routing.py", line 72, in app
    response = await func(request)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/fastapi/routing.py", line 278, in app
    raw_response = await run_endpoint_function(
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
    return await dependant.call(**values)
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/lmdeploy/serve/openai/api_server.py", line 495, in chat_completions_v1
    async for res in result_generator:
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/lmdeploy/serve/async_engine.py", line 563, in generate
    prompt_input = await self._get_prompt_input(prompt, do_preprocess,
  File "/root/miniforge3/envs/Yi/lib/python3.10/site-packages/lmdeploy/serve/vl_async_engine.py", line 54, in _get_prompt_input
    segs = decorated.split(IMAGE_TOKEN)
AttributeError: 'NoneType' object has no attribute 'split'
```

The lmdeploy version is 0.5.0; version 0.4.2 works without this problem.
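The failure mode can be illustrated with a minimal, hypothetical sketch (this is not lmdeploy's actual code; `TEMPLATES` and `decorate` are invented stand-ins): the chat template appears to be selected by model name, and when the name is not recognized the decorated prompt comes back as `None`, so the later `decorated.split(IMAGE_TOKEN)` raises exactly this AttributeError.

```python
# Hypothetical sketch of the failure mode, NOT lmdeploy source code.
IMAGE_TOKEN = "<IMAGE_TOKEN>"

# Stand-in for a name-keyed chat-template registry.
TEMPLATES = {"mini-internvl-chat-2b-v1-5": "<|im_start|>user\n{prompt}<|im_end|>"}

def decorate(model_name: str, prompt: str):
    """Return the prompt wrapped in its chat template, or None if the
    model name is not in the registry."""
    template = TEMPLATES.get(model_name.lower())
    return template.format(prompt=prompt) if template else None

# "internvl2-26b" is not in the registry, so decorate() returns None...
decorated = decorate("internvl2-26b", f"{IMAGE_TOKEN}\ndescribe this image")
try:
    segs = decorated.split(IMAGE_TOKEN)  # ...and .split on None blows up
except AttributeError as e:
    print(e)  # 'NoneType' object has no attribute 'split'
```

This matches why renaming the weight directory (see below in the thread) sidesteps the error: a recognized name yields a non-None template.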

nth2000 commented 3 months ago

```
Traceback (most recent call last):
  File "/home/thniu/Chart2Code/src/chart_synthesis/LLM_synthesis/MM_Text_LLM/infer.py", line 8, in <module>
    response = pipe(('describe this image', image))
  File "/home/thniu/miniconda3/envs/internvl/lib/python3.10/site-packages/lmdeploy/serve/vl_async_engine.py", line 118, in __call__
    return super().__call__(prompts, **kwargs)
  File "/home/thniu/miniconda3/envs/internvl/lib/python3.10/site-packages/lmdeploy/serve/async_engine.py", line 304, in __call__
    return self.batch_infer(prompts,
  File "/home/thniu/miniconda3/envs/internvl/lib/python3.10/site-packages/lmdeploy/serve/vl_async_engine.py", line 104, in batch_infer
    return super().batch_infer(prompts, **kwargs)
  File "/home/thniu/miniconda3/envs/internvl/lib/python3.10/site-packages/lmdeploy/serve/async_engine.py", line 428, in batch_infer
    _get_event_loop().run_until_complete(gather())
  File "/home/thniu/miniconda3/envs/internvl/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
    return future.result()
  File "/home/thniu/miniconda3/envs/internvl/lib/python3.10/site-packages/lmdeploy/serve/async_engine.py", line 425, in gather
    await asyncio.gather(
  File "/home/thniu/miniconda3/envs/internvl/lib/python3.10/site-packages/lmdeploy/serve/async_engine.py", line 410, in _inner_call
    async for out in generator:
  File "/home/thniu/miniconda3/envs/internvl/lib/python3.10/site-packages/lmdeploy/serve/async_engine.py", line 563, in generate
    prompt_input = await self._get_prompt_input(prompt, do_preprocess,
  File "/home/thniu/miniconda3/envs/internvl/lib/python3.10/site-packages/lmdeploy/serve/vl_async_engine.py", line 54, in _get_prompt_input
    segs = decorated.split(IMAGE_TOKEN)
AttributeError: 'NoneType' object has no attribute 'split'
```

I have this problem too.

teddyliu01 commented 3 months ago

Same problem on version 0.5.0.

klucius1218 commented 2 months ago

The same problem appears when running the 8B and 2B models; I also tried lmdeploy 0.4.2, and that did not help either.

shiva-vardhineedi commented 2 months ago

+1

shiva-vardhineedi commented 2 months ago

Guys, rename your model directory so that the name contains Mini-InternVL-Chat-2B-V1-5; then it worked for me, on lmdeploy v0.5.0.
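A minimal sketch of that workaround. The paths below are throwaway placeholders; point `src` at wherever your downloaded weights actually live:

```python
import os
import tempfile

# Placeholder paths: substitute the directory that really holds your weights.
root = tempfile.mkdtemp()
src = os.path.join(root, "InternVL2-26B")
dst = os.path.join(root, "Mini-InternVL-Chat-2B-V1-5")

os.makedirs(src)     # stands in for the downloaded weight folder
os.rename(src, dst)  # lmdeploy 0.5.0 appears to key the chat template off this folder name
print(os.path.basename(dst))
```

Then point lmdeploy at the renamed directory instead of the original one.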

whai362 commented 2 months ago

> Guys, rename your model directory so that the name contains Mini-InternVL-Chat-2B-V1-5; then it worked for me, on lmdeploy v0.5.0.

Thanks. We will release the official code and weights for lmdeploy.

xylovezxy commented 2 months ago

Have you solved this problem? I also got this error when deploying InternVL2-8B.

shiva-vardhineedi commented 2 months ago

> Have you solved this problem? I also got this error when deploying InternVL2-8B.

Rename your model directory so that the name contains Mini-InternVL-Chat-2B-V1-5; then it worked for me, on lmdeploy v0.5.0.

teddyliu01 commented 2 months ago

Not solved yet.


xylovezxy commented 2 months ago

The method mentioned above works.

Leoooooo123 commented 2 months ago

> Guys, rename your model directory so that the name contains Mini-InternVL-Chat-2B-V1-5; then it worked for me, on lmdeploy v0.5.0.

Is the model output normal with this workaround? I suspect renaming could cause configuration mismatches in the model, leading to abnormal output. My model produced token IDs that are not in the vocabulary.

feihuamantian commented 2 months ago

https://github.com/OpenGVLab/InternVL/issues/351

wufg2002 commented 2 months ago

Is there a solution yet?

ErfeiCui commented 2 months ago

The code has been updated; please pull the latest code, and make sure your environment is strictly aligned with the requirements.

lyc728 commented 2 months ago

> The code has been updated; please pull the latest code, and make sure your environment is strictly aligned with the requirements.

I get the same error with MiniCPM-V 2.5.

czczup commented 2 months ago

Upgrading lmdeploy to 0.5.1 or a later version makes it run normally.
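After upgrading (for example with `pip install -U "lmdeploy>=0.5.1"`), it is worth verifying the installed version before redeploying. A small hedged helper (`meets_minimum` is not part of lmdeploy; run it on `lmdeploy.__version__`):

```python
def meets_minimum(version: str, minimum=(0, 5, 1)) -> bool:
    """Return True if a dotted version string is at least `minimum`.

    Compares the first three numeric components as a tuple, so
    "0.5.0" < (0, 5, 1) <= "0.5.1" < "0.6.0".
    """
    parts = tuple(int(p) for p in version.split(".")[:3])
    return parts >= minimum

print(meets_minimum("0.5.0"))  # False: the version reported to fail
print(meets_minimum("0.5.1"))  # True: the minimum the maintainers cite
```

Note this assumes a plain numeric version string; suffixed versions like `0.5.1rc1` would need extra parsing.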

wufg2002 commented 2 months ago

Got it. Thanks.
