Closed: 0wwafa closed 4 months ago
I'm not really sure how to process that. Are you talking about ExUI?
Yes, sorry, I realize now I posted it in the wrong repository.
It's fine.
But ExUI is really closely integrated with ExLlamaV2. I do plan to change it over to using tabbyAPI as a backend, which would mean it could also connect to other OAI-compatible servers, including ones running llama.cpp, or even ChatGPT. But I don't think I'll have time for that for at least a couple of months.
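For reference, connecting to any OpenAI-compatible backend boils down to a single HTTP call against the standard `/v1/chat/completions` endpoint. A minimal sketch, assuming a local server (e.g. llama.cpp's `llama-server`, tabbyAPI, or the OpenAI API itself) is listening at the given base URL; the base URL, port, and model name here are placeholders, not defaults of any particular server:

```python
import json
import urllib.request

def chat_completion(base_url, messages, model="local-model", timeout=60):
    """Send a chat request to an OpenAI-compatible server and return
    the assistant's reply text."""
    payload = {"model": model, "messages": messages}
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        body = json.load(resp)
    # Standard OpenAI response shape: choices[0].message.content
    return body["choices"][0]["message"]["content"]

# Example usage (assumes a server is already running on port 8080):
# reply = chat_completion("http://localhost:8080",
#                         [{"role": "user", "content": "Hello"}])
```

Because the request/response shape is the same everywhere, a frontend written against this endpoint works unchanged whether the backend is tabbyAPI, llama.cpp, or ChatGPT.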
It would be great to have this interface available locally for running models on CPU with llama.cpp.