lm-sys / FastChat

An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena.

Worker error under Python 3.8: AttributeError: module 'asyncio' has no attribute 'to_thread' #2811

Open YulunCai opened 9 months ago

YulunCai commented 9 months ago
/workspace# python3 -m fastchat.serve.model_worker
...
"POST /worker_generate HTTP/1.1" 500 Internal Server Error
2023-12-13 04:01:05 | ERROR | stderr | ERROR:    Exception in ASGI application
....
2023-12-13 04:01:07 | ERROR | stderr |     return await dependant.call(**values)
2023-12-13 04:01:07 | ERROR | stderr |   File "/opt/conda/lib/python3.8/site-packages/fastchat/serve/base_model_worker.py", line 206, in api_generate
2023-12-13 04:01:07 | ERROR | stderr |     output = await asyncio.to_thread(worker.generate_gate, params)
2023-12-13 04:01:07 | ERROR | stderr | AttributeError: module 'asyncio' has no attribute 'to_thread'

It looks like asyncio.to_thread is only available in Python 3.9+: https://github.com/lvxuan263/FastChat/blob/6ac7d76885cae2d06d76bfe7fd8ec5aac6602e6f/fastchat/serve/base_model_worker.py#L206
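For reference, a minimal sketch of what a Python 3.8-compatible version of that call could look like, using loop.run_in_executor (the pre-3.9 way to run a blocking call in a thread). The function name api_generate_py38 is hypothetical; worker.generate_gate and params are taken from the traceback above:

```python
import asyncio

async def api_generate_py38(worker, params):
    # Pre-3.9 equivalent of asyncio.to_thread: run the blocking generate_gate
    # call in the default thread pool executor. Extra positional arguments
    # after the callable are forwarded to it.
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, worker.generate_gate, params)
```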

infwinston commented 9 months ago

Thanks for reporting the issue. @merrymercy, do you think we should pin Python >= 3.9? https://github.com/lm-sys/FastChat/blob/ec9a07ed22110e9686b51fd6ee9bf635b7ce54f8/pyproject.toml#L10
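If so, it would be a one-line constraint in the [project] table (a sketch only; the exact layout depends on the project's existing pyproject.toml), and pip would then refuse to install on 3.8 instead of failing at runtime:

```toml
[project]
requires-python = ">=3.9"
```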

JanMackensen55 commented 8 months ago

If you want to keep FastChat compatible with Python 3.8, you can use the solution suggested here: https://stackoverflow.com/questions/68523752/python-module-asyncio-has-no-attribute-to-thread
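For anyone who needs it, here is a sketch of that workaround: a drop-in to_thread backport that runs the call in the default executor while preserving context variables, which is what asyncio.to_thread does on 3.9+.

```python
import asyncio
import contextvars
import functools

async def to_thread(func, /, *args, **kwargs):
    """Backport of asyncio.to_thread for Python 3.8 (per the linked answer)."""
    loop = asyncio.get_running_loop()
    # Copy the current context so context variables behave the same way they
    # would with asyncio.to_thread, then run the blocking call in the default
    # thread pool executor.
    ctx = contextvars.copy_context()
    func_call = functools.partial(ctx.run, func, *args, **kwargs)
    return await loop.run_in_executor(None, func_call)
```

Dropping this helper into the worker module (or assigning it to asyncio.to_thread at startup) avoids touching the call site, though bumping the minimum Python version is probably the cleaner long-term fix.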

oluwandabira commented 5 months ago

Just ran into this issue today. I think the Python 3.9+ requirement should be mentioned in the installation section of the README, or at least declared in pyproject.toml.