exo-explore / exo

Run your own AI cluster at home with everyday devices 📱💻 🖥️⌚
GNU General Public License v3.0

Problems running in OpenWebUI / AnythingLLM and other frontends #175

Closed · robanderson closed this 2 weeks ago

robanderson commented 2 weeks ago

Hi there, I've been trying out different front ends and running into problems with exo's limited implementation of the OpenAI API.

I found a bug in streaming: the wrong Content-Type was being set in chatgpt_api.py at line 338.

This causes OpenWebUI to not display responses from exo. The header is currently:

        "Content-Type": "application/json",

It should be "text/event-stream" for a streamed response:

        "Content-Type": "text/event-stream",

Also, some front ends require the /v1/models endpoint; I've implemented that as well.

Add the following to the __init__ of the ChatGPTAPI class:

cors.add(self.app.router.add_get("/v1/models", self.handle_get_models), {"*": cors_options})

and add the following function.

    async def handle_get_models(self, request):
        models = []
        seen_models = set()
        for model_name, shards in shard_mappings.items():
            if model_name not in seen_models:
                models.append({
                    "id": model_name,
                    "object": "model",
                    "owned_by": "openai",
                    "ready": True,
                })
                seen_models.add(model_name)
        return web.json_response(models)
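One thing to check against stricter clients: OpenAI's own /v1/models wraps the array in a list object ({"object": "list", "data": [...]}), so some front ends may expect that shape. A sketch of the wrapped variant, using the same shard_mappings as above:

    async def handle_get_models(self, request):
        # shard_mappings keys are already unique, so no seen_models set is needed
        models = [
            {"id": model_name, "object": "model", "owned_by": "openai", "ready": True}
            for model_name in shard_mappings
        ]
        # wrap in the list object that the official API returns
        return web.json_response({"object": "list", "data": models})

Either variant can be verified with curl http://localhost:8000/v1/models (adjust host/port to wherever exo's API is listening).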

I've never contributed code to a public repo before, but I could try to work out how to submit these changes if that would help.

AlexCheema commented 2 weeks ago

Thanks for this. Made the fixes you suggested.