madhudson opened 4 months ago
I would also like Groq support in Fabric!
My temporary solution: https://github.com/danielmiessler/fabric/issues/361#issuecomment-2136356010
You can try LiteLLM, which is what I am using; see https://github.com/danielmiessler/fabric/issues/361#issuecomment-2138999111
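For anyone following along, the LiteLLM route roughly looks like running a local OpenAI-compatible proxy in front of Groq and pointing Fabric at it. This is only a sketch based on the linked comment: the model string, port, and the `OPENAI_BASE_URL` variable name are assumptions, and the variable Fabric actually reads may differ.

```shell
pip install litellm

# LiteLLM needs the Groq key in the environment (placeholder value).
export GROQ_API_KEY=your-groq-api-key

# Start a local OpenAI-compatible proxy backed by Groq's Mixtral
# (model string and port are assumptions; adjust to taste).
litellm --model groq/mixtral-8x7b-32768 --port 8000

# Then point Fabric (or any OpenAI-style client) at the proxy:
export OPENAI_BASE_URL=http://localhost:8000
```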
I was able to get this working based on the example from @hobbytp. Exporting the values did not seem to work for me, though; instead I had to put them in the ~/.config/fabric/.env file, which then worked.
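Since Groq exposes an OpenAI-compatible endpoint, the .env approach amounts to something like the fragment below. This is a sketch, not Fabric's documented config: the exact variable names Fabric reads are an assumption based on the linked workaround, and the key value is a placeholder.

```shell
# ~/.config/fabric/.env -- sketch; variable names are assumptions.
OPENAI_API_KEY=your-groq-api-key
# Groq's OpenAI-compatible base URL:
OPENAI_BASE_URL=https://api.groq.com/openai/v1
# Model name as Groq exposes it (assumption):
DEFAULT_MODEL=mixtral-8x7b-32768
```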
An interesting related note: when using the Mixtral model via Groq, I get back responses that match what I would expect from the pattern used. When I run Mixtral locally through Ollama, the response is not even remotely like the pattern. It is entirely possible the local side suffers because my machine is not as powerful as Mixtral probably needs, but I suspect it has more to do with how the OpenAI piece works versus the Ollama piece.
What do you need?
I like to use Groq occasionally, as its pricing model is pretty good.