System Info / 系統信息
Hello, I got an error while running the glm4v-9b model. The GPU is an RTX 4090 (24 GB). The error is returned when an image and text are sent together.
Who can help? / 谁可以帮助到您?
No response
Information / 问题信息
Reproduction / 复现过程
```
Uncaught exception: Traceback (most recent call last):
  File "D:\Big_model\ChatGLM\GLM-4-main\composite_demo\src\main.py", line 288, in main
    for response, chat_history in client.generate_stream(
  File "D:\Big_model\ChatGLM\GLM-4-main\composite_demo\src\clients\hf.py", line 57, in generate_stream
    for token_text in streamer:
  File "C:\Miniconda3\envs\VL_model\lib\site-packages\transformers\generation\streamers.py", line 223, in __next__
    value = self.text_queue.get(timeout=self.timeout)
  File "C:\Miniconda3\envs\VL_model\lib\queue.py", line 179, in get
    raise Empty
_queue.Empty
```
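For context on the failure mode: `transformers`' `TextIteratorStreamer` reads decoded tokens from an internal `queue.Queue` with a `timeout`, while a separate thread runs generation and feeds that queue. If the generation thread dies (e.g. CUDA out-of-memory on the 24 GB card) or is slower than the timeout, the consumer's `get(timeout=...)` raises `queue.Empty`, which is exactly the exception above. A minimal stdlib-only sketch of that mechanism (the `producer`/`consume` names are illustrative, not from the demo code):

```python
import queue
import threading

def producer(q: queue.Queue, fail: bool) -> None:
    """Simulates the generation thread feeding the streamer's queue."""
    if fail:
        # e.g. an exception inside model.generate(): the thread exits
        # without ever putting tokens (or an end-of-stream marker) on the queue
        return
    for tok in ["Hello", " world"]:
        q.put(tok)
    q.put(None)  # end-of-stream marker

def consume(q: queue.Queue, timeout: float) -> list:
    """Simulates iterating the streamer: blocks on the queue with a timeout."""
    out = []
    while True:
        tok = q.get(timeout=timeout)  # raises queue.Empty if nothing arrives in time
        if tok is None:
            return out
        out.append(tok)

q = queue.Queue()
threading.Thread(target=producer, args=(q, True)).start()
try:
    consume(q, timeout=0.1)
    crashed = False
except queue.Empty:
    crashed = True
print(crashed)  # True: the consumer times out because the producer died silently
```

This suggests checking the demo's console for an earlier error from the generation thread, since `_queue.Empty` in the consumer is often only the symptom.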
Hello, I got an error while running the glm4v-9b model. The GPU is an RTX 4090 (24 GB). The error is returned when an image and text are sent together.
Expected behavior / 期待表现
A normal streamed response, instead of the `_queue.Empty` exception, when an image and text are sent together.