arcee-ai / fastmlx

FastMLX is a high-performance, production-ready API for hosting MLX models.

OpenWebUI doesn't connect to FastMLX #35

Open SwagMuffinMcYoloPants opened 1 month ago

SwagMuffinMcYoloPants commented 1 month ago

I was messing with LMStudio's local server and connecting it to OpenWebUI, and it was really convenient to link the two. I don't really need the LMStudio application and would love to just use FastMLX instead. However, when I try to add FastMLX as an OpenAI connection, it doesn't return the models. Would it be possible to get FastMLX working with OpenWebUI?
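
For context, OpenWebUI's OpenAI connector discovers models by calling `GET <base_url>/v1/models` and expects the standard OpenAI list shape (`{"object": "list", "data": [...]}`). Below is a minimal sketch of a FastAPI route that returns that shape; the `get_available_models()` helper, the example model name, and the `owned_by` value are placeholders for illustration, not FastMLX internals.

```python
from fastapi import FastAPI

app = FastAPI()


def get_available_models() -> list[str]:
    # Placeholder: in practice this would be whatever models the server can serve.
    return ["mlx-community/Meta-Llama-3-8B-Instruct-4bit"]


@app.get("/v1/models")
async def list_models():
    # OpenAI-compatible clients (including OpenWebUI) expect
    # {"object": "list", "data": [...]}, where each entry has at least
    # "id" and "object": "model".
    return {
        "object": "list",
        "data": [
            {"id": name, "object": "model", "created": 0, "owned_by": "fastmlx"}
            for name in get_available_models()
        ],
    }
```

If the server responds with any other shape (for example a bare `{"models": [...]}` object), OpenWebUI will list no models even though the connection itself succeeds.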

viljark commented 1 month ago

I haven't had a chance to submit proper PRs back to this repo, but in the meantime you can check my fork, where I have implemented OpenWebUI support and some other stuff that I needed.

Blaizzy commented 1 month ago

@SwagMuffinMcYoloPants thanks for bringing this up.

I’m working on a major release of MLX-VLM, and this weekend I will be updating FastMLX with lots of goodies. I can add OpenWebUI support.

Blaizzy commented 1 month ago

@viljark feel free to propose the changes you want and open a PR with the OpenWebUI support from your fork.

bhupesh-sf commented 2 weeks ago

@Blaizzy hey, any update on the goodies 😊

Blaizzy commented 1 week ago

Not yet.

I started scoping it, but porting Florence-2 to MLX-VLM took higher priority.

If you can help me with an initial PR, I would appreciate it and take it from there.