anarchy-ai / LLM-VM

irresponsible innovation. Try now at https://chat.dev/
https://anarchy.ai/
MIT License

Kwargs don't work for "big_model" that isn't OpenAI based #353

Closed abhigya-sodani closed 11 months ago

abhigya-sodani commented 11 months ago

When a non-OpenAI model such as Bloom is set as the big model:

response = client.complete(prompt="Answer question Q. ",
                           context="Q: What is the currency in Myanmar",
                           openai_key=settings.openai_api_key,
                           temperature=0.0,
                           data_synthesis=True,
                           finetune=True)

This call raises a TypeError because the temperature keyword argument cannot be processed by the onsite model's generate() method:

  File "/opt/conda/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
    self.run()
  File "/opt/conda/lib/python3.10/threading.py", line 953, in run
    self._target(*self._args, **self._kwargs)
  File "/opt/conda/lib/python3.10/site-packages/llm_vm/completion/optimize.py", line 43, in new_thread
    t[0] = foo()
  File "/opt/conda/lib/python3.10/site-packages/llm_vm/completion/optimize.py", line 254, in promiseCompletion
    best_completion = self.call_big(prompt, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/llm_vm/client.py", line 100, in CALL_BIG
    return self.teacher.generate(prompt, max_len,**kwargs)
TypeError: BaseOnsiteLLM.generate() got an unexpected keyword argument 'temperature'
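The traceback shows that CALL_BIG forwards all user kwargs straight into BaseOnsiteLLM.generate(), whose signature does not accept temperature. A minimal sketch of one possible fix, filtering kwargs against the target method's signature before forwarding (the class body and helper name here are hypothetical stand-ins, not the actual LLM-VM code):

```python
import inspect

class BaseOnsiteLLM:
    # Hypothetical stand-in: like the real class, generate() takes only
    # (prompt, max_len), so an extra temperature= kwarg raises a TypeError.
    def generate(self, prompt, max_len=256):
        return "completion for: " + prompt[:20]

def call_with_supported_kwargs(fn, *args, **kwargs):
    # Drop any keyword arguments the callee's signature does not accept,
    # unless it declares **kwargs (a VAR_KEYWORD parameter).
    params = inspect.signature(fn).parameters
    accepts_var_kw = any(p.kind is inspect.Parameter.VAR_KEYWORD
                         for p in params.values())
    if not accepts_var_kw:
        kwargs = {k: v for k, v in kwargs.items() if k in params}
    return fn(*args, **kwargs)

llm = BaseOnsiteLLM()
# Forwarding temperature directly would raise the TypeError above;
# filtering first lets the call succeed for non-OpenAI models.
result = call_with_supported_kwargs(llm.generate, "Answer question Q. ",
                                    temperature=0.0)
print(result)
```

An alternative would be to add **kwargs to BaseOnsiteLLM.generate() and pass through only the parameters each backend supports.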
daspartho commented 11 months ago

Duplicate of #328.