zsp0901 opened this issue 1 month ago
I'm hitting the same issue. Using an ep-xxx-xxx endpoint ID, adding the chat LLM succeeds, but adding the embedding model fails.
I have the same problem: when adding the Volcengine embedding model, an error pops up: KeyError('VolcEngine'), but the chat model was added successfully.
File "/ragflow/api/apps/llm_app.py", line 209, in add_llm mdl = EmbeddingModel[factory](
KeyError: 'VolcEngine'
'VolcEngine'
Traceback (most recent call last):
File "/ragflow/.venv/lib/python3.12/site-packages/flask/app.py", line 880, in full_dispatch_request
rv = self.dispatch_request()
^^^^^^^^^^^^^^^^^^^^^^^
File "/ragflow/.venv/lib/python3.12/site-packages/flask/app.py", line 865, in dispatch_request
return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args) # type: ignore[no-any-return]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/ragflow/.venv/lib/python3.12/site-packages/flask_login/utils.py", line 290, in decorated_view
return current_app.ensure_sync(func)(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/ragflow/api/utils/api_utils.py", line 175, in decorated_function
return func(*_args, **_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^
File "/ragflow/api/apps/llm_app.py", line 209, in add_llm
mdl = EmbeddingModel[factory](
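If it helps anyone debugging: the failing line mdl = EmbeddingModel[factory]( looks like a plain dict lookup keyed by the factory name, so the KeyError just means no embedding implementation is registered under 'VolcEngine', even though the chat-model registry evidently has one. Below is a rough, self-contained sketch of that pattern with made-up class and registry contents (not the actual RAGFlow source), just to show why the error appears:

# Illustrative sketch only -- assumptions about the pattern, not RAGFlow code.
class _DummyEmbed:
    """Stand-in for a real embedding client class."""
    def __init__(self, key, model_name, base_url):
        self.key, self.model_name, self.base_url = key, model_name, base_url

# Hypothetical registry; the real mapping lives inside the RAGFlow repo.
EmbeddingModel = {
    "OpenAI": _DummyEmbed,
    "BAAI": _DummyEmbed,
    # "VolcEngine" is apparently missing here, while the chat registry has it.
}

factory = "VolcEngine"  # factory name sent by the "add model" dialog
try:
    mdl = EmbeddingModel[factory]("sk-xxx", "ep-xxxx",
                                  "https://ark.cn-beijing.volces.com/api/v3")
except KeyError as e:
    # Reproduces the KeyError('VolcEngine') seen in the traceback above.
    print("unsupported embedding factory:", e)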
Got an error when adding a Doubao online inference model
Describe your problem
Whether I use Volcengine or other routes, e.g. OpenAI-API-compatible or OpenRouter, I just get KeyError or other errors.
I tested both the "Doubao-embedding" model name and an "ep-xxxx" endpoint ID; both failed.
base url: https://ark.cn-beijing.volces.com/api/v3
But I can add the Volcengine chat model using OpenRouter.
Can anyone help? Many thanks!
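In the meantime, one way to rule out the endpoint itself is to call the Ark API directly from outside RAGFlow. This is only a sketch, assuming Ark's OpenAI-compatible embeddings route and that the endpoint ID goes in the model field; keep your own API key and ep-xxxx placeholder:

# Sanity-check the Volcengine Ark embedding endpoint outside RAGFlow.
# Assumes the OpenAI-compatible /embeddings route; replace the placeholders.
from openai import OpenAI  # pip install openai

client = OpenAI(
    api_key="YOUR_ARK_API_KEY",
    base_url="https://ark.cn-beijing.volces.com/api/v3",
)

resp = client.embeddings.create(
    model="ep-xxxx",               # your endpoint ID, same value used in the RAGFlow dialog
    input=["hello from ragflow"],  # any test sentence
)
print(len(resp.data[0].embedding))  # should print the embedding dimension

If this call succeeds, the endpoint and key are fine and the problem is purely the missing 'VolcEngine' entry in RAGFlow's embedding factory registry.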