togethercomputer / MoA

Together Mixture-Of-Agents (MoA) – 65.1% on AlpacaEval with OSS models
Apache License 2.0

ollama support #20

Open win4r opened 5 months ago

win4r commented 5 months ago

Ollama support is available in this fork: https://github.com/win4r/MoA

erik-sv commented 5 months ago

@win4r, I made a fork of a fork that should work fully with any OpenAI-formatted endpoint, including local models. I have tested it with LM Studio and Groq. Just create your own .env file based on the .env.template file, and it should work with Ollama as well. https://github.com/erik-sv/MoA
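For context, "OpenAI-formatted endpoint" means any server exposing the OpenAI-style `/v1/chat/completions` API, which Ollama does by default. Below is a minimal stdlib-only sketch of building such a request against a local Ollama endpoint; the base URL, model name, and helper function are illustrative assumptions, not part of the fork's actual code (which reads these values from the .env file instead):

```python
import json

# Assumption: Ollama's default OpenAI-compatible base URL.
# In the fork, this would come from your .env file instead.
OLLAMA_BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model, messages):
    """Build an OpenAI-format /chat/completions request (hypothetical helper).

    Returns the target URL and the JSON body; sending it (e.g. with
    urllib.request or the openai client pointed at base_url) is the
    same for Ollama, LM Studio, Groq, or any compatible server.
    """
    return {
        "url": f"{OLLAMA_BASE_URL}/chat/completions",
        "body": json.dumps({"model": model, "messages": messages}),
    }

req = build_chat_request("llama3", [{"role": "user", "content": "Hello"}])
```

Because the request shape is identical across providers, swapping backends is just a matter of changing the base URL and model name in the .env file.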

JosephShawa commented 3 months ago

How are the results? How have you pushed the envelope?