[WARNING] No GPU detected; inference will run on the CPU
C:\Users\chen\.conda\envs\Muice\lib\site-packages\transformers\utils\generic.py:260: FutureWarning: `torch.utils._pytree._register_pytree_node` is deprecated. Please use `torch.utils._pytree.register_pytree_node` instead.
torch.utils._pytree._register_pytree_node(
C:\Users\chen\.conda\envs\Muice\lib\site-packages\transformers\modeling_utils.py:479: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
return torch.load(checkpoint_file, map_location=map_location)
[WARNING] Failed to load cpm_kernels:name 'CPUKernel' is not defined
Some weights of ChatGLMForConditionalGeneration were not initialized from the model checkpoint at ./model/chatglm2-6b-int4 and are newly initialized: ['transformer.prefix_encoder.embedding.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
F:\ai\Muice-Chatbot\llm\transformers.py:22: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
prefix_state_dict = torch.load(os.path.join(pt_model_path, "pytorch_model.bin"), map_location='cpu')
INFO: Started server process [22604]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:21050 (Press CTRL+C to quit)
INFO: ('127.0.0.1', 13513) - "WebSocket /ws/api" [accepted]
INFO: connection open
[INFO] Connected
[INFO] Received a message from QQ**********:
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\uvicorn\protocols\websockets\websockets_impl.py", line 244, in run_asgi
result = await self.app(self.scope, self.asgi_receive, self.asgi_send) # type: ignore[func-returns-value]
File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 70, in __call__
return await self.app(scope, receive, send)
File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\fastapi\applications.py", line 1054, in __call__
await super().__call__(scope, receive, send)
File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\applications.py", line 123, in __call__
await self.middleware_stack(scope, receive, send)
File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\middleware\errors.py", line 151, in __call__
await self.app(scope, receive, send)
File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\middleware\exceptions.py", line 65, in __call__
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
raise exc
File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\routing.py", line 754, in __call__
await self.middleware_stack(scope, receive, send)
File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\routing.py", line 774, in app
await route.handle(scope, receive, send)
File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\routing.py", line 371, in handle
await self.app(scope, receive, send)
File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\routing.py", line 96, in app
await wrap_app_handling_exceptions(app, session)(scope, receive, send)
File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
raise exc
File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
await app(scope, receive, sender)
File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\routing.py", line 94, in app
await func(session)
File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\fastapi\routing.py", line 367, in app
await dependant.call(**values)
File "F:\ai\Muice-Chatbot\ws.py", line 88, in websocket_endpoint
for reply_item in reply_list:
TypeError: 'NoneType' object is not iterable
INFO: connection closed
INFO: ('127.0.0.1', 13521) - "WebSocket /ws/api" [accepted]
INFO: connection open
[INFO] Connected
Describe the bug
When a message containing a QQ emoji is received, the bot throws an error and restarts automatically.
Console output
(See the log pasted above.)
Steps to reproduce
Send a message containing a QQ emoji during a chat.
Additional information
Model used: ChatGLM2-6B-Int4 + Muice-2.4-chatglm2-6b-int4
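The traceback points at `for reply_item in reply_list:` in `ws.py` line 88, which fails because `reply_list` is `None`. A minimal sketch of a defensive guard, using a hypothetical `build_replies` stand-in for the model call (QQ emoji typically arrive as OneBot CQ codes such as `[CQ:face,id=178]`, which is assumed here to be the input the model returns `None` for):

```python
from typing import List, Optional


def build_replies(message: str) -> Optional[List[str]]:
    """Hypothetical stand-in for the model call; returns None for
    input it cannot handle, mimicking the reported failure."""
    if "[CQ:face" in message:  # QQ emoji as a CQ code, e.g. [CQ:face,id=178]
        return None
    return [f"echo: {message}"]


def handle_message(message: str) -> List[str]:
    """Guarded version of the loop that crashed at ws.py line 88."""
    reply_list = build_replies(message)
    if not reply_list:  # None (or empty) -> skip iteration instead of crashing
        return []
    return list(reply_list)


print(handle_message("hello"))              # ['echo: hello']
print(handle_message("[CQ:face,id=178]"))   # []
```

With a guard like this the WebSocket handler would drop (or answer with a fallback for) emoji-only messages instead of raising `TypeError: 'NoneType' object is not iterable` and tearing down the connection.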
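Separately, the `FutureWarning` lines in the log come from calling `torch.load` without `weights_only=True`. A small self-contained demo of the migration the warning asks for, under the assumption that the prefix-encoder checkpoint contains only tensors (no custom pickled Python objects):

```python
import torch

# Save a tensor-only state dict, then reload it with the safer
# weights_only=True mode recommended by the FutureWarning.
state = {"transformer.prefix_encoder.embedding.weight": torch.zeros(2, 3)}
torch.save(state, "prefix_demo.bin")
loaded = torch.load("prefix_demo.bin", map_location="cpu", weights_only=True)
print(list(loaded.keys())[0])
```

If the real `pytorch_model.bin` holds only tensors, switching the call in `llm/transformers.py` to `weights_only=True` should silence the warning; if it pickles custom objects, they would need to be allowlisted via `torch.serialization.add_safe_globals` first.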