anarchy-ai / LLM-VM


BaseOnsiteLLM.generate() got an unexpected keyword argument 'temperature' #328

Closed: daspartho closed this issue 10 months ago

daspartho commented 11 months ago

I'm running into the following error when running the local endpoint quickstart example:

```
Using model: bloom
Running with an empty context
Exception in thread Thread-2 (new_thread):
Traceback (most recent call last):
  File "/home/codespace/.python/current/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
    self.run()
  File "/home/codespace/.python/current/lib/python3.10/threading.py", line 953, in run
    self._target(*self._args, **self._kwargs)
  File "/home/codespace/.python/current/lib/python3.10/site-packages/llm_vm/completion/optimize.py", line 43, in new_thread
    t[0] = foo()
  File "/home/codespace/.python/current/lib/python3.10/site-packages/llm_vm/completion/optimize.py", line 254, in promiseCompletion
    best_completion = self.call_big(prompt, **kwargs)
  File "/home/codespace/.python/current/lib/python3.10/site-packages/llm_vm/client.py", line 96, in CALL_BIG
    return self.teacher.generate(prompt, max_len,**kwargs)
TypeError: BaseOnsiteLLM.generate() got an unexpected keyword argument 'temperature'
{'status': 0, 'resp': 'cannot unpack non-iterable NoneType object'}
```
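
For reference, the snippet I ran is roughly the local quickstart (paraphrased from the README, so the exact model name and argument spelling may differ):

```python
# Local endpoint quickstart, approximately as in the README.
from llm_vm.client import Client

# Pick a locally hosted model; 'bloom' matches the "Using model: bloom" output above.
client = Client(big_model='bloom')

# client.complete forwards a default temperature kwarg, which triggers the error.
response = client.complete(prompt='What is Anarchy?', context='')
print(response)
```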
daspartho commented 11 months ago

client.complete assigns temperature as a keyword argument by default, while only the OpenAI models support it:

https://github.com/anarchy-ai/LLM-VM/blob/7d2a52a63027bf32712409405a8df3697e5a9ec2/src/llm_vm/client.py#L129
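
A minimal illustration of the failure mode, using the signature implied by the traceback (the real method body is elided):

```python
# BaseOnsiteLLM.generate, as implied by the traceback, takes no temperature
# parameter, so any forwarded temperature kwarg raises a TypeError.
class BaseOnsiteLLM:
    def generate(self, prompt, max_len):  # signature assumed from the traceback
        ...

BaseOnsiteLLM().generate("hello", 100, temperature=0.0)
# TypeError: BaseOnsiteLLM.generate() got an unexpected keyword argument 'temperature'
```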

One way to resolve the above error would be to add a conditional check and only pass temperature as a keyword argument for OpenAI models, as in the sketch below.
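
Something along these lines inside client.complete could work (a sketch only: the is_openai_model attribute is hypothetical, and the actual kwargs plumbing in client.py may differ):

```python
# Hypothetical sketch of the conditional check in client.complete;
# is_openai_model is an assumed attribute, not the real client.py code.
def complete(self, prompt, context='', temperature=0.0, **kwargs):
    if self.is_openai_model:
        # Only OpenAI endpoints accept temperature, so forward it only for them.
        kwargs['temperature'] = temperature
    # Hand off to the existing completion path.
    return self.call_big(prompt, **kwargs)
```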

A better way would be to add temperature argument support for local models too!
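
For instance, BaseOnsiteLLM.generate could accept temperature and forward it to Hugging Face's model.generate (a sketch assuming the class wraps a transformers model and tokenizer; the attribute names are guesses, not the actual class internals):

```python
# Sketch only: assumes BaseOnsiteLLM wraps a transformers model/tokenizer.
def generate(self, prompt, max_len=256, temperature=None, **kwargs):
    inputs = self.tokenizer(prompt, return_tensors="pt")
    gen_kwargs = {"max_new_tokens": max_len}
    if temperature is not None:
        # transformers applies temperature only when sampling is enabled.
        gen_kwargs.update(do_sample=True, temperature=temperature)
    output_ids = self.model.generate(**inputs, **gen_kwargs)
    return self.tokenizer.decode(output_ids[0], skip_special_tokens=True)
```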