Vimos opened this issue 2 months ago
I noticed the launch command above was missing `--visual_inputs`.
After adding it, I got the following error:
08/09/2024 21:42:08 - INFO - llamafactory.data.template - Replace eos token: <|eot_id|>
2024-08-09 21:42:08 | ERROR | stderr | Traceback (most recent call last):
2024-08-09 21:42:08 | ERROR | stderr | File "<frozen runpy>", line 198, in _run_module_as_main
2024-08-09 21:42:08 | ERROR | stderr | File "<frozen runpy>", line 88, in _run_code
2024-08-09 21:42:08 | ERROR | stderr | File "/home/vimos/git/XinHaiLLM/backend/src/xinhai/workers/mllm.py", line 490, in <module>
2024-08-09 21:42:08 | ERROR | stderr | worker = MLLMWorker()
2024-08-09 21:42:08 | ERROR | stderr | ^^^^^^^^^^^^
2024-08-09 21:42:08 | ERROR | stderr | File "/home/vimos/git/XinHaiLLM/backend/src/xinhai/workers/mllm.py", line 89, in __init__
2024-08-09 21:42:08 | ERROR | stderr | self.engine: "BaseEngine" = VllmEngine(model_args, data_args, finetuning_args, generating_args)
2024-08-09 21:42:08 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-09 21:42:08 | ERROR | stderr | File "/home/vimos/git/XinHaiLLM/related_repos/LLaMA-Factory/src/llamafactory/chat/vllm_engine.py", line 102, in __init__
2024-08-09 21:42:08 | ERROR | stderr | self.model = AsyncLLMEngine.from_engine_args(AsyncEngineArgs(**engine_args))
2024-08-09 21:42:08 | ERROR | stderr | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-08-09 21:42:08 | ERROR | stderr | TypeError: AsyncEngineArgs.__init__() got an unexpected keyword argument 'image_input_type'
This still looks like a version-compatibility problem.
Does this mean MiniCPM-V fine-tuning is not supported?
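For context on the `TypeError`: older vLLM releases exposed vision-related engine arguments such as `image_input_type`, which newer versions removed, so passing a dict built for one version into `AsyncEngineArgs(**engine_args)` of another version raises exactly this error. A minimal, hedged sketch of one defensive workaround (the `AsyncEngineArgs` below is a hypothetical stand-in, not the real vLLM class) is to drop any keys the installed dataclass no longer declares before constructing it:

```python
import dataclasses


@dataclasses.dataclass
class AsyncEngineArgs:
    """Hypothetical stand-in for vllm.AsyncEngineArgs; the real class
    has many more fields that vary across vLLM versions."""
    model: str
    max_model_len: int = 4096


def filter_engine_args(args_cls, engine_args: dict) -> dict:
    """Keep only the kwargs that the installed dataclass actually declares,
    so construction does not fail on arguments removed between versions."""
    supported = {f.name for f in dataclasses.fields(args_cls)}
    return {k: v for k, v in engine_args.items() if k in supported}


# 'image_input_type' is not a field of this version's AsyncEngineArgs,
# so it is silently dropped instead of raising TypeError on **-expansion.
engine_args = {"model": "minicpm-v", "image_input_type": "pixel_values"}
filtered = filter_engine_args(AsyncEngineArgs, engine_args)
engine = AsyncEngineArgs(**filtered)
```

Silently dropping arguments can hide misconfiguration, so a real fix is pinning vLLM to the version LLaMA-Factory's `vllm_engine.py` was written against rather than filtering; the sketch only illustrates why the keyword mismatch occurs.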
### Reminder

### System Info

github head

### Reproduction

### Expected behavior

No response

### Others

No response