Moemu / Muice-Chatbot

Muice: an AI girl who will proactively start chatting with you
MIT License

[Bug] Error when receiving a message containing a QQ emoji #52

Closed zhiming2008 closed 2 months ago

zhiming2008 commented 2 months ago

Describe the bug

When a message containing a QQ emoji is received, an error is raised and the bot restarts automatically.

Console output

[WARNING] 未检测到GPU,将使用CPU进行推理
C:\Users\chen\.conda\envs\Muice\lib\site-packages\transformers\utils\generic.py:260: FutureWarning: `torch.utils._pytree._register_pytree_node` is deprecated. Please use `torch.utils._pytree.register_pytree_node` instead.
  torch.utils._pytree._register_pytree_node(
C:\Users\chen\.conda\envs\Muice\lib\site-packages\transformers\modeling_utils.py:479: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
  return torch.load(checkpoint_file, map_location=map_location)
[WARNING] Failed to load cpm_kernels:name 'CPUKernel' is not defined
Some weights of ChatGLMForConditionalGeneration were not initialized from the model checkpoint at ./model/chatglm2-6b-int4 and are newly initialized: ['transformer.prefix_encoder.embedding.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
F:\ai\Muice-Chatbot\llm\transformers.py:22: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
  prefix_state_dict = torch.load(os.path.join(pt_model_path, "pytorch_model.bin"), map_location='cpu')
INFO:     Started server process [22604]
INFO:     Waiting for application startup.
INFO:     Application startup complete.
INFO:     Uvicorn running on http://127.0.0.1:21050 (Press CTRL+C to quit)
INFO:     ('127.0.0.1', 13513) - "WebSocket /ws/api" [accepted]
INFO:     connection open
[INFO] 已链接
[INFO] 收到QQ**********的消息:
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\uvicorn\protocols\websockets\websockets_impl.py", line 244, in run_asgi
    result = await self.app(self.scope, self.asgi_receive, self.asgi_send)  # type: ignore[func-returns-value]
  File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 70, in __call__
    return await self.app(scope, receive, send)
  File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\fastapi\applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\middleware\errors.py", line 151, in __call__
    await self.app(scope, receive, send)
  File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\middleware\exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\routing.py", line 754, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\routing.py", line 774, in app
    await route.handle(scope, receive, send)
  File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\routing.py", line 371, in handle
    await self.app(scope, receive, send)
  File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\routing.py", line 96, in app
    await wrap_app_handling_exceptions(app, session)(scope, receive, send)
  File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\starlette\routing.py", line 94, in app
    await func(session)
  File "C:\Users\chen\.conda\envs\Muice\lib\site-packages\fastapi\routing.py", line 367, in app
    await dependant.call(**values)
  File "F:\ai\Muice-Chatbot\ws.py", line 88, in websocket_endpoint
    for reply_item in reply_list:
TypeError: 'NoneType' object is not iterable
INFO:     connection closed
INFO:     ('127.0.0.1', 13521) - "WebSocket /ws/api" [accepted]
INFO:     connection open
[INFO] 已链接

Steps to reproduce

Send a message containing a QQ emoji during a chat.

Additional information

Model used: ChatGLM2-6B-Int4 + Muice-2.4-chatglm2-6b-int4

Moemu commented 2 months ago

By "containing a QQ emoji", do you mean the message contains only a QQ emoji? If possible, please give me an example.

zhiming2008 commented 2 months ago

Yes, my wording was imprecise. I just tested it again: the error only occurs when the message is a single standalone emoji; emojis inside normal text are automatically ignored.

Moemu commented 2 months ago

We have confirmed that this problem only occurs with QQ's built-in emojis and have made a change. In the meantime, I suggest using your phone's Unicode emoji instead of QQ's built-in ones.
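For context: from OneBot-style clients, QQ's built-in faces usually arrive as CQ codes such as [CQ:face,id=178], whereas phone emoji are plain Unicode text. A hypothetical illustration of why an emoji-only message can end up with nothing to send to the model (the regex and names below are assumptions for illustration, not the project's actual parsing code):

import re

def strip_cq_codes(raw: str) -> str:
    # Drop OneBot CQ segments such as [CQ:face,id=178] and keep only plain text.
    return re.sub(r"\[CQ:[^\]]*\]", "", raw).strip()

print(strip_cq_codes("[CQ:face,id=178]"))        # '' -> empty prompt, nothing to generate from
print(strip_cq_codes("你好 [CQ:face,id=178]"))    # '你好' -> the text survives, the face is ignored
print(strip_cq_codes("😀"))                       # '😀' -> a Unicode emoji passes through unchanged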

Moemu commented 2 months ago

Please pull the latest code to pick up the fix, and confirm whether the problem still occurs.

MoeSnowyFox commented 2 months ago

Now, when reply_list is None, the bot simply does not reply and no longer raises an error.
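A minimal sketch of the kind of guard described above, assuming the handler builds reply_list from the model layer and then iterates over it (as in the traceback); build_reply and handle_message are hypothetical names, not the project's actual API in ws.py:

from typing import List, Optional

def build_reply(prompt: str) -> Optional[List[str]]:
    # Hypothetical stand-in for the model call; returns None when the
    # prompt is empty, e.g. an emoji-only message stripped of CQ codes.
    return None if not prompt.strip() else [f"echo: {prompt}"]

def handle_message(prompt: str) -> List[str]:
    reply_list = build_reply(prompt)

    # Guard: with reply_list == None, return nothing to send instead of
    # iterating over None (the original TypeError).
    if not reply_list:
        return []

    return [reply_item for reply_item in reply_list]

print(handle_message(""))      # [] -> no reply, no error
print(handle_message("你好"))   # ['echo: 你好']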

zhiming2008 commented 2 months ago

Thanks, the problem has been resolved.