Search before asking
Operating system information
MacOS(M1, M2...)
Python version information
3.10
DB-GPT version
main
Related scenes
Installation Information
[X] Installation From Source
[ ] Docker Installation
[ ] Docker Compose Installation
[ ] Cluster Installation
[ ] AutoDL Image
[ ] Other
Device information
MacBook Pro M1, 64 GB RAM
Models information
LLM: glm-4-9b-chat; embedding model: text2vec-large-chinese
What happened
2024-08-17 21:08:57 guojie dbgpt.model.llm_out.hf_chat_llm[4218] INFO Predict with parameters: {'max_length': 128000, 'temperature': 0.8, 'streamer': <transformers.generation.streamers.TextIteratorStreamer object at 0x17642d540>, 'top_p': 1.0, 'do_sample': True}
custom_stop_words: []
Exception in thread Thread-7 (generate):
Traceback (most recent call last):
  File "/Users/guojie/miniconda3/envs/dbgpt0510/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
    self.run()
  File "/Users/guojie/miniconda3/envs/dbgpt0510/lib/python3.10/threading.py", line 953, in run
    self._target(*self._args, **self._kwargs)
  File "/Users/guojie/miniconda3/envs/dbgpt0510/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/Users/guojie/miniconda3/envs/dbgpt0510/lib/python3.10/site-packages/transformers/generation/utils.py", line 1713, in generate
    self._prepare_special_tokens(generation_config, kwargs_has_attention_mask, device=device)
  File "/Users/guojie/miniconda3/envs/dbgpt0510/lib/python3.10/site-packages/transformers/generation/utils.py", line 1562, in _prepare_special_tokens
    and torch.isin(elements=eos_token_tensor, test_elements=pad_token_tensor).any()
NotImplementedError: The operator 'aten::isin.Tensor_Tensor_out' is not currently implemented for the MPS device. If you want this op to be added in priority during the prototype phase of this feature, please comment on https://github.com/pytorch/pytorch/issues/77764. As a temporary fix, you can set the environment variable PYTORCH_ENABLE_MPS_FALLBACK=1 to use the CPU as a fallback for this op. WARNING: this will be slower than running natively on MPS.
What you expected to happen
What should I do to fix this problem if I want to use Chat Excel? Thanks very much.
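For reference, the traceback itself already names a temporary workaround: setting PYTORCH_ENABLE_MPS_FALLBACK=1 so that ops not yet implemented for MPS fall back to the CPU. A minimal, untested sketch of applying it in Python before torch is imported (the variable can equally be exported in the shell before starting DB-GPT; the exact entry point depends on your setup):

# Hypothetical workaround sketch based on the hint in the error message:
# enable the CPU fallback for MPS ops that are not yet implemented.
# The variable must be in the environment before torch touches the MPS backend,
# and, as the warning says, the fallback is slower than running natively on MPS.
import os
os.environ["PYTORCH_ENABLE_MPS_FALLBACK"] = "1"

import torch  # imported only after the variable is set, so the fallback applies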
How to reproduce
Just install DB-GPT on a MacBook Pro M1, then use Chat Excel.
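If it helps to isolate the problem, the failing call can probably be reproduced outside DB-GPT with PyTorch alone; a minimal sketch, assuming an Apple Silicon Mac and the same PyTorch build (tensor values are made up for illustration):

# Hypothetical minimal repro of the failing op from the traceback.
import torch

eos_token_tensor = torch.tensor([2], device="mps")  # example token id, illustrative only
pad_token_tensor = torch.tensor([0], device="mps")  # example token id, illustrative only

# On the affected PyTorch build this raises:
# NotImplementedError: The operator 'aten::isin.Tensor_Tensor_out' is not
# currently implemented for the MPS device.
print(torch.isin(elements=eos_token_tensor, test_elements=pad_token_tensor).any())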
Additional context
No response
Are you willing to submit PR?