Closed: MarkHe735 closed this issue 4 months ago.
🤖
Hey there, @MarkHe735! Nice to see you diving into the depths of code again. Hope this one's a fun puzzle for us to solve together. 😉
Based on the error message you provided, the issue is related to the `chunk["choices"]` variable in the `_astream` method of the `openai.py` file. This variable is expected to be a list, but in your case it appears to be `None`, hence the `TypeError: object of type 'NoneType' has no len()` error.
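The failure mode is easy to reproduce in isolation; this minimal sketch (the `chunk` dict is a stand-in for the streamed response, not the actual payload) shows exactly the exception in your traceback:

```python
# Minimal reproduction of the reported error: len() called on None.
chunk = {"choices": None}  # stand-in for a streamed chunk with a null "choices"

try:
    len(chunk["choices"])
except TypeError as exc:
    print(exc)  # object of type 'NoneType' has no len()
```

Any code path in `_astream` that reaches `len(chunk["choices"])` while the server has sent a chunk without choices will raise this.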
There are a few similar issues in the Langchain-Chatchat repository that have been resolved:

- TypeError: object of type 'NoneType' has no len() - The suggested solution here is to ensure that your GPU has enough memory if you are running the models on a GPU. If you are running the models on a CPU and still encountering the error, it might be due to the model not being loaded properly. One user suggested replacing `baichuan2-13b` with `baichuan2-13b-chat-4bits` and installing a specific version of `bitsandbytes` for Windows.
- [BUG] After loading model_worker across multiple GPUs, the model returns clearly wrong results - This issue suggests that the problem may be due to the fact that a multi-GPU setup is only used to share the memory load, not for actual parallel computation, meaning the model is still running on a single GPU. This could explain why the results are not as expected when using multiple GPUs.
- langchain-chatchat reports an error when calling a Dify application's API: ERROR: TypeError: Caught exception: object of type 'NoneType' has no len() - The suggested solution here is to check whether `chunk["choices"]` is `None` before trying to get its length.
Based on these solutions, you could try the following:

1. Check your GPU memory usage. If it is high, try moving the embedding model to the CPU to free up some GPU memory.
2. If you are running the models on a CPU, try replacing `baichuan2-13b` with `baichuan2-13b-chat-4bits` and installing the specific version of `bitsandbytes` mentioned above.
3. Modify the `openai.py` file to check whether `chunk["choices"]` is `None` before trying to get its length.
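The guard from the last suggestion could be sketched like this. Note that `first_choice` is a hypothetical helper for illustration, not the actual `_astream` code; the real fix would apply the same check inside `_astream` in `langchain_community/chat_models/openai.py`:

```python
# Hypothetical helper showing the suggested guard: skip any streamed chunk
# whose "choices" field is missing, None, or empty, instead of calling len()
# on it and raising TypeError.
def first_choice(chunk: dict):
    choices = chunk.get("choices")
    if not choices:  # covers a missing key, None, and an empty list
        return None  # treat the chunk as a keep-alive and skip it
    return choices[0]

print(first_choice({"choices": None}))  # None: chunk is skipped, no TypeError
print(first_choice({"choices": [{"delta": {"content": "hi"}}]}))
```

Skipping rather than raising matches how some OpenAI-compatible backends behave: they occasionally emit chunks with no choices, which a robust streaming loop simply ignores.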
Please note that these solutions are based on user comments and may not work for everyone. If you are still facing issues, it might be helpful to provide more details about the error and your setup for further assistance.
As for the `_astream` method in the `openai.py` file, I wasn't able to find an exact function within the repository. However, based on the name and common usage, this method is most likely used for asynchronous streaming of model output.
The expected type and structure of `chunk["choices"]` in the `_astream` method is likely a list of dictionaries, with each dictionary representing one choice from the output of an AI model. This is based on the common structure of responses from AI language models, such as those provided by OpenAI's GPT models.
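For illustration, one streamed chunk in the public OpenAI chat-completions streaming format looks roughly like the dict below; this shape is an assumption, since the exact payload depends on the serving backend (here, FastChat's OpenAI-compatible API):

```python
# Assumed shape of one streaming chunk (OpenAI chat-completions format).
# The failing len() call expects "choices" to be a list like this one.
chunk = {
    "id": "chatcmpl-123",
    "object": "chat.completion.chunk",
    "choices": [
        {"index": 0, "delta": {"content": "你好"}, "finish_reason": None},
    ],
}

assert len(chunk["choices"]) == 1  # only works when "choices" is a list
print(chunk["choices"][0]["delta"]["content"])
```

If the backend instead serializes `"choices": null` (for example on an error or keep-alive message), the `len()` call fails exactly as in your traceback.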
I hope this helps! If you have any other questions or need further clarification, please let me know.
This issue has been marked as stale because it has had no activity for more than 30 days.
Problem Description: file chat reports an error: packages/langchain_community/chat_models/openai.py", line 495, in _astream TypeError: object of type 'NoneType' has no len()

Steps to Reproduce: drag and upload an English document, then enter the prompt "translate into Chinese". The front end shows no output, and the back end reports: TypeError: object of type 'NoneType' has no len()

Actual Result: empty (no output)

Environment Information:
OS: Linux-5.15.0-94-generic-x86_64-with-glibc2.35
Python version: 3.10.13 (main, Sep 11 2023, 13:44:35) [GCC 11.2.0]
Project version: v0.2.10
langchain version: 0.0.354
fastchat version: 0.2.35
Text splitter in use: ChineseRecursiveTextSplitter
Running LLM model: ['Qwen-14B-Chat'] @ cuda {'device': 'cuda', 'gpus': '0,1,2,3', 'host': '0.0.0.0', 'infer_turbo': False, 'max_gpu_memory': '32GiB', 'model_path': '/home/hdw/python_projs/models/Qwen-14B-Chat', 'model_path_exists': True, 'num_gpus': 1, 'port': 20002}
Embedding model: piccolo-large-zh @ cuda
Additional information (logs):

INFO:     127.0.0.1:56076 - "POST /llm_model/list_running_models HTTP/1.1" 200 OK
2024-02-28 01:58:38,174 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_running_models "HTTP/1.1 200 OK"
INFO:     127.0.0.1:56076 - "POST /llm_model/list_config_models HTTP/1.1" 200 OK
2024-02-28 01:58:38,229 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/llm_model/list_config_models "HTTP/1.1 200 OK"
INFO:     127.0.0.1:56076 - "POST /chat/file_chat HTTP/1.1" 200 OK
/home/hdw/miniconda3/envs/langchain_chatchat/lib/python3.10/site-packages/langchain_core/_api/deprecation.py:117: LangChainDeprecationWarning: The class `langchain_community.chat_models.openai.ChatOpenAI` was deprecated in langchain-community 0.0.10 and will be removed in 0.2.0. An updated version of the class exists in the langchain-openai package and should be used instead. To use it run `pip install -U langchain-openai` and import as `from langchain_openai import ChatOpenAI`.
  warn_deprecated(
2024-02-28 01:58:38,470 - _client.py[line:1027] - INFO: HTTP Request: POST http://127.0.0.1:7861/chat/file_chat "HTTP/1.1 200 OK"
2024-02-28 01:58:38,523 - base.py[line:36] - INFO: thread 219585 starting operation: tmpc36osenw.
2024-02-28 01:58:38,525 - base.py[line:40] - INFO: thread 219585 finished operation: tmpc36osenw.