hemangjoshi37a opened 5 months ago
It does -- just change the API endpoint in utils.py and the model names in bot.py.
I hope this can be made configurable to use a custom API endpoint rather than Together's API endpoint.
I iterated on sammcj's fork. It should work with any OpenAI formatted endpoint, including local models. I have tested LM Studio and Groq. Just create your own .env file based on the .env.template file, and it should work with Ollama. https://github.com/erik-sv/MoA
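For anyone wanting the gist of what these forks do: it mostly comes down to building the request URL from an environment variable instead of a hardcoded Together URL. A minimal sketch below -- the variable names (`OPENAI_BASE_URL`) and the helper itself are assumptions following common OpenAI-compatible conventions, not the forks' exact code:

```python
import os

# Hypothetical helper: read the API base URL from the environment (e.g. set
# via a .env file) and fall back to Together's endpoint if nothing is set.
# OPENAI_BASE_URL is an assumed variable name, not necessarily what the
# forks above use.
def build_chat_url(default: str = "https://api.together.xyz/v1") -> str:
    """Return the chat-completions URL for whichever backend is configured."""
    base = os.environ.get("OPENAI_BASE_URL", default).rstrip("/")
    return f"{base}/chat/completions"

# Point at a local Ollama server instead of Together's hosted API:
os.environ["OPENAI_BASE_URL"] = "http://localhost:11434/v1"
print(build_chat_url())  # http://localhost:11434/v1/chat/completions
```

Any server that speaks the OpenAI chat-completions format (LM Studio, Ollama, Groq, etc.) should then work without further code changes, as long as the model names are set to match.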
I was wondering this also and had a look; the code has quite a few hardcoded API base URLs and the like. I did a little hack and got it working with local LLMs here, just as an experiment - https://github.com/sammcj/MoA