batuozdemir opened this issue 5 months ago
maybe I'm missing something, but can't you just add the model to your custom endpoint?
It's not a model provided by Groq; it's an algorithm that sends your query through multiple layers of models and gives you a final output from an aggregator model. LibreChat would need to orchestrate all of these queries, send multiple API calls, and return the final output to the user.
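Roughly, the flow could look like the sketch below. This is only illustrative, not how Together's reference implementation is structured: it uses Groq's OpenAI-compatible API, arbitrary example model names, and a single proposer layer for brevity (the real MoA stacks several layers).

```typescript
// Minimal Mixture-of-Agents sketch against Groq's OpenAI-compatible endpoint.
// Model names are examples only; any Groq-hosted models could be substituted.
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.GROQ_API_KEY,
  baseURL: "https://api.groq.com/openai/v1",
});

const PROPOSERS = ["llama3-8b-8192", "gemma-7b-it", "mixtral-8x7b-32768"];
const AGGREGATOR = "llama3-70b-8192";

export async function mixtureOfAgents(userQuery: string): Promise<string> {
  // Layer 1: fan the same query out to several proposer models in parallel.
  const proposals = await Promise.all(
    PROPOSERS.map(async (model) => {
      const res = await client.chat.completions.create({
        model,
        messages: [{ role: "user", content: userQuery }],
      });
      return res.choices[0].message.content ?? "";
    })
  );

  // Final layer: an aggregator model synthesizes the proposals into one answer.
  const aggregatorPrompt =
    "Synthesize the following candidate responses into a single, high-quality answer.\n\n" +
    proposals.map((p, i) => `Response ${i + 1}:\n${p}`).join("\n\n");

  const final = await client.chat.completions.create({
    model: AGGREGATOR,
    messages: [
      { role: "system", content: aggregatorPrompt },
      { role: "user", content: userQuery },
    ],
  });
  return final.choices[0].message.content ?? "";
}
```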
oh ok, thanks for explaining this
probably out of scope for this application
I don't think it's out of scope; there are several endpoint/model interactions that could facilitate something like this. It would probably need to function through another package, though.
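One way that "another package" could work, purely as a sketch: run the MoA orchestration behind a small OpenAI-compatible server and point LibreChat at it as a custom endpoint. Everything here is hypothetical; `mixtureOfAgents` is the helper from the sketch above, and the port and model name are arbitrary.

```typescript
// Illustrative OpenAI-compatible proxy that runs the MoA pipeline per request.
import express from "express";
import { mixtureOfAgents } from "./moa"; // hypothetical module from the sketch above

const app = express();
app.use(express.json());

app.post("/v1/chat/completions", async (req, res) => {
  const messages = req.body.messages ?? [];
  const userQuery = messages[messages.length - 1]?.content ?? "";

  // Run the multi-model orchestration and return it as a single "model" reply.
  const answer = await mixtureOfAgents(userQuery);

  res.json({
    id: "moa-" + Date.now(),
    object: "chat.completion",
    created: Math.floor(Date.now() / 1000),
    model: "mixture-of-agents",
    choices: [
      {
        index: 0,
        message: { role: "assistant", content: answer },
        finish_reason: "stop",
      },
    ],
  });
});

app.listen(8787, () => console.log("MoA proxy listening on :8787"));
```

LibreChat could then treat this like any other OpenAI-compatible provider by adding a custom endpoint in librechat.yaml with its baseURL pointing at http://localhost:8787/v1.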
What features would you like to see added?
Mixture of Agents models are showing tremendous results; they can even surpass GPT-4o. They are slow to use, but with Groq they can be lightning fast and very cheap. I would love to use them in LibreChat.
More details
For example, Together AI's MoA does significantly better on AlpacaEval 2.0 than GPT-4o. Someone already made a Groq port.
Is it feasible to implement this in LibreChat? Is there anything I can do to help?
Which components are impacted by your request?
No response
Pictures
No response
Code of Conduct