Closed: Clivern closed this issue 1 year ago.
I will release a patch promptly.
Can you try again with 0.3.8?
The server starts, but an error is still raised during the API call. I guess it's because of GenerationInput in the following lines:
https://github.com/bentoml/OpenLLM/blob/main/openllm-python/src/openllm/_service.py#L41
https://github.com/bentoml/OpenLLM/blob/main/openllm-python/src/openllm/_service.py#L54
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/bentoml/_internal/server/http_app.py", line 341, in api_func
    output = await api.func(*args)
  File "/usr/local/lib/python3.10/dist-packages/openllm/_service.py", line 54, in generate_stream_v1
    qa_inputs = openllm.GenerationInput.from_llm_config(llm_config)(**input_dict)
  File "/usr/local/lib/python3.10/dist-packages/openllm_core/utils/lazy.py", line 155, in __getattr__
    raise AttributeError(f'module {self.__name__} has no attribute {name}')
AttributeError: module openllm has no attribute GenerationInput
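For context, the AttributeError above comes out of openllm_core/utils/lazy.py, which resolves top-level attributes like openllm.GenerationInput from an export map at access time. The snippet below is a minimal sketch of how such a lazy loader typically behaves, not the actual implementation; the class name and map contents are illustrative only.

```python
# Minimal sketch of a lazy module loader (illustrative only; not the actual
# openllm_core.utils.lazy implementation).
import importlib
import types


class LazyModule(types.ModuleType):
    """Defers submodule imports until an exported attribute is first accessed."""

    def __init__(self, name: str, import_structure: dict[str, list[str]]) -> None:
        super().__init__(name)
        # Map each exported attribute name to the submodule that defines it,
        # e.g. {'_generation': ['GenerationInput']} -> {'GenerationInput': '_generation'}.
        self._attr_to_module = {
            attr: module for module, attrs in import_structure.items() for attr in attrs
        }

    def __getattr__(self, name: str):
        module_name = self._attr_to_module.get(name)
        if module_name is None:
            # This is the branch the traceback above hit: 'GenerationInput' was
            # missing from the export map, so the lookup failed even though the
            # class existed in a submodule.
            raise AttributeError(f'module {self.__name__} has no attribute {name}')
        module = importlib.import_module(f'{self.__name__}.{module_name}')
        return getattr(module, name)
```

Under this reading, the 0.3.9 fix would presumably just re-add GenerationInput to the exported names.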
Ah, this is an oversight on my end. 0.3.9 will come out promptly.
Please try again with 0.3.9.
Yes, it works fine 🔥. Thanks @aarnphm!
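For readers who want to verify the fix themselves, below is a hedged sketch of the kind of request that previously triggered the traceback above. It assumes a server started locally on the default port 3000 and a /v1/generate_stream endpoint matching the generate_stream_v1 handler in the traceback; the exact endpoint path and payload fields are assumptions and may differ between OpenLLM versions.

```python
# Hypothetical client call (endpoint path and payload shape are assumptions,
# inferred from the generate_stream_v1 handler in the traceback above).
import requests

response = requests.post(
    'http://localhost:3000/v1/generate_stream',
    json={'prompt': 'What is the capital of France?'},
    stream=True,
)
response.raise_for_status()  # on affected versions this surfaced the AttributeError as a 500
for chunk in response.iter_content(chunk_size=None):
    print(chunk.decode('utf-8'), end='')
```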
Describe the bug
When I tried to run openllm, it crashed with the following exception. I followed the steps explained in the README. Any idea what I might be missing?
To reproduce
No response
Logs
No response
Environment
Python 3.10
System information (Optional)
Python 3.10
OS: Ubuntu 22.04
CPU: 8 cores
RAM: 30 GB
Accelerators: Quadro M4000 (8 GB GPU memory)