win4r opened 5 months ago
@win4r, I made a fork of a fork that should work with any OpenAI-compatible endpoint, including local models. I have tested it with LM Studio and Groq. Just create your own .env file based on the .env.template file, and it should work with Ollama as well. https://github.com/erik-sv/MoA
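For anyone wiring this up to Ollama, a minimal .env sketch might look like the following. This is a hypothetical example: the actual variable names (`OPENAI_BASE_URL`, `OPENAI_API_KEY`, `MODEL`) depend on what .env.template in that fork defines, so adjust them to match the template.

```env
# Hypothetical variable names -- check .env.template in the fork for the real ones.
OPENAI_BASE_URL=http://localhost:11434/v1   # Ollama's OpenAI-compatible endpoint
OPENAI_API_KEY=ollama                       # Ollama ignores the key, but OpenAI clients usually require one
MODEL=llama3                                # any model you have pulled locally, e.g. via `ollama pull llama3`
```

The same pattern should apply to LM Studio (default endpoint http://localhost:1234/v1) or Groq (https://api.groq.com/openai/v1 with a real API key), since all three expose the OpenAI chat-completions format.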
How are the results? How have you pushed the envelope?
Ollama support is available at this link: https://github.com/win4r/MoA