Closed umbraclet16 closed 2 months ago
Version 0.3.1 has been released; the configuration mechanism was improved so that changing configuration items no longer requires restarting the server. You can update and try it.
The problem was traced to the model_providers code in the 0.3.0 AutoDL image; recording it here for reference:
Langchain-Chatchat/libs/model-providers/model_providers/bootstrap_web/openai_bootstrap_web.py
The `if` condition in the `create_embeddings()` function is wrong, so execution incorrectly enters the `if` branch.
Quick workaround: comment out the if/else logic and replace it with:

```python
input = embeddings_request.input  # reconstructed assignment; the request object's name may differ
if isinstance(input, list):
    input = input[0]
```
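For context, a minimal, self-contained sketch of what the patched branch does (the helper and variable names below are illustrative, not the actual ones in openai_bootstrap_web.py). Note that taking `input[0]` keeps only the first item of a batched request:

```python
def extract_embedding_input(raw_input):
    """Mirror of the workaround above: an OpenAI-style embeddings endpoint
    may receive `input` as a single string or a list of strings; keep a
    string as-is and take the first element of a list.
    (Illustrative helper, not the real code in openai_bootstrap_web.py.)"""
    value = raw_input
    if isinstance(value, list):
        value = value[0]
    return value

print(extract_embedding_input("hello world"))  # hello world
print(extract_embedding_input(["a", "b"]))     # a
```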
Also, `list_models()` has several variable errors.
I registered an embedding model 'autodl-tmp-bge-large-zh' in xinference and configured it in /root/chatchat-data/model_providers.yaml to use custom platform loading.
Running the command `chatchat-kb -r` failed.
Relevant log:
```
model_providers.bootstrap_web.openai_bootstrap_web 2700 WARNING Warning: model not found. Using cl100k_base encoding.
```
Internet is unavailable so the default model cannot be downloaded.
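That warning pattern typically comes from a tokenizer lookup that falls back to `cl100k_base` when the model name is unknown, roughly like this sketch (a toy registry stands in for the real tokenizer library; in practice the fallback encoding must be downloaded on first use, which fails offline):

```python
# Toy registry standing in for a tokenizer library's model table (illustrative).
KNOWN_MODEL_ENCODINGS = {
    "text-embedding-ada-002": "cl100k_base",
    "gpt-3.5-turbo": "cl100k_base",
}

def encoding_name_for_model(model_name: str) -> str:
    """Look up the encoding for a model; an unregistered custom model
    (e.g. an xinference model like 'autodl-tmp-bge-large-zh') triggers
    the fallback path that produces the warning seen in the log."""
    try:
        return KNOWN_MODEL_ENCODINGS[model_name]
    except KeyError:
        print("Warning: model not found. Using cl100k_base encoding.")
        return "cl100k_base"

print(encoding_name_for_model("autodl-tmp-bge-large-zh"))  # cl100k_base
```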
I wonder whether model_providers only supports built-in embedding models, or did I miss some configuration?
Thx.