dicksondickson opened this issue 6 months ago
https://console.groq.com/docs/openai
According to this, you should be able to use the Local LLM AnyNode to connect to Groq, possibly with some modifications to the way the endpoint is set up, or quite possibly without any. I have to study it a little further because their endpoint path is a bit non-standard, but it does offer the standard OpenAI chat completions API... *goes to get a Groq API key*
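For reference, here's a minimal sketch of what hitting Groq's OpenAI-compatible chat completions endpoint looks like with the standard `openai` Python client, in case it helps when wiring up the node. The `base_url` is the one from the docs linked above; the `GROQ_API_KEY` environment variable and the `llama3-70b-8192` model name are just placeholder assumptions, not anything AnyNode uses today.

```python
# Minimal sketch: calling Groq through its OpenAI-compatible endpoint.
# base_url comes from https://console.groq.com/docs/openai; the model name
# is an example and may differ from what Groq currently serves.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["GROQ_API_KEY"],         # assumes the key is exported in the environment
    base_url="https://api.groq.com/openai/v1",  # Groq's OpenAI-compatible path
)

response = client.chat.completions.create(
    model="llama3-70b-8192",  # example Groq-hosted model; check Groq's model list
    messages=[{"role": "user", "content": "Say hello from Groq"}],
)
print(response.choices[0].message.content)
```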
Awesome, I am looking forward to it!
Awesome work! Can you make the node support Groq? They use an OpenAI-compatible API and serve Llama 3, Mixtral, and Gemma models, and it's free for the time being.
https://groq.com/
Thank you!