ParisNeo / lollms-webui

Lord of Large Language Models Web User Interface
https://lollms.com
Apache License 2.0

Error when trying to use Petals. #464

Open d13g4 opened 9 months ago

d13g4 commented 9 months ago

Expected Behavior

Text generation

Current Behavior

No generation

Steps to Reproduce

Using the latest version (as of this writing) of lollms-webui from the git repo.

  1. Binding: Petals (New)
  2. Model: petals-team/StableBeluga2
  3. Say "Hi" in the Chat

Context

```
Text generation requested by client: uApY51I2lZWiP5AAAD
Received message : hi!
Started generation task
Exception in thread Thread-11 (start_message_generation):
Traceback (most recent call last):
  File "/usr/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.10/threading.py", line 953, in run
    self._target(*self._args, **self._kwargs)
  File "/home/diemo/lollms-webui/api/__init__.py", line 2040, in start_message_generation
    self.discussion_messages, self.current_message, tokens = self.prepare_query(client_id, message_id, is_continue, n_tokens=self.config.min_n_predict, generation_type=generation_type)
  File "/home/diemo/lollms-webui/api/__init__.py", line 1628, in prepare_query
    discussion_messages += self.model.detokenize(message_tokens)
  File "/home/diemo/lollms-webui/zoos/bindings_zoo/bs_petals/__init__.py", line 230, in detokenize
    t = torch.IntTensor([tokens_list])
NameError: name 'torch' is not defined
```
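For context, the failure boils down to the binding's `detokenize` referencing `torch` in a module where it was never imported (typically because torch did not get installed along with the binding). A minimal reproduction of the same error class (illustrative only, not the actual `bs_petals` binding code):

```python
# Note: 'import torch' is deliberately missing here, mirroring the
# situation in the binding module when torch is absent or not imported.

def detokenize(tokens_list):
    # This line mirrors the failing call in the traceback:
    # torch is referenced but was never imported, so Python raises
    # NameError: name 'torch' is not defined
    t = torch.IntTensor([tokens_list])
    return t

try:
    detokenize([1, 2, 3])
except NameError as e:
    print(e)  # name 'torch' is not defined
```

Reinstalling the binding (which pulls in its dependencies, including torch) is the usual fix, since the import only succeeds once torch is actually present in the environment running lollms-webui.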

ParisNeo commented 9 months ago

Hi. If you are on Windows, you need to use WSL, and then it should work. If you are on Linux, this looks like the binding was not installed correctly; try installing it again and tell me if you get an error.