Open SwagMuffinMcYoloPants opened 1 month ago
I haven't had a chance to submit proper PRs back to this repo, but in the meantime you can check my fork, where I have implemented OpenWebUI support and some other stuff that I needed.
@SwagMuffinMcYoloPants thanks for bringing this up.
I’m working on a major release of MLX-VLM, and this weekend I will be updating FastMLX with lots of goodies. I can add OpenWebUI.
@viljark feel free to propose the changes you want and open a PR with the OpenWebUI support for your fork.
@Blaizzy hey, any update on the goodies 😊
Not yet.
I started scoping it but porting Florence-2 to MLX-VLM had higher priority.
If you can help me with an initial PR, I would appreciate it and take it from there.
I was messing with LMStudio's local server and connecting it with OpenWebUI, and it was really convenient to connect the two. I don't really need the LMStudio application and would love to just use FastMLX instead. When I try to connect FastMLX as an OpenAI connection, it doesn't return the models. Would it be possible to get FastMLX working with OpenWebUI?