twmht opened 8 months ago
I'm running into the same problem.
Hi, sorry. We've changed the serving/hosting logic away from the original FastChat setup (it involved so many files that we wanted to make it easier and quicker to host a new demo).
We now host a backend endpoint: https://github.com/Luodian/Otter/blob/main/pipeline/serve/deploy/otterhd_endpoint.py
The frontend code: https://huggingface.co/spaces/Otter-AI/OtterHD-Demo/blob/main/app.py
But if you want to serve models the old way, you can check out the model_serving branch: https://github.com/Luodian/Otter/blob/model_serving
If you run into Gradio errors, you may need to pin the Gradio version; see the updated doc here:
https://github.com/Luodian/Otter/blob/model_serving/docs/server_host.md
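For the old-style setup, the steps above look roughly like this. This is only a sketch: the exact Gradio version to pin is given in the linked doc, and the version number below is an illustrative placeholder.

```shell
# Clone the Otter repo (or reuse an existing checkout)
git clone https://github.com/Luodian/Otter.git
cd Otter

# Switch to the legacy serving branch
git checkout model_serving

# Pin Gradio; replace the version below with the one listed in
# docs/server_host.md (3.50.2 here is only an illustrative placeholder)
pip install "gradio==3.50.2"
```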
I'm following the instructions in https://github.com/Luodian/Otter/blob/main/docs/server_host.md
Any idea?