andreibondarev opened this issue 6 months ago
@mattlindsey There's no way they're hosting OpenAI models, because other than GPT-2 none of them are open source. Groq only hosts open-source models: https://console.groq.com/docs/models.
@andreibondarev Actually the above example seems to work great with this:
```ruby
class OpenAI < Base
  DEFAULTS = {
    n: 1,
    temperature: 0.0,
    chat_completion_model_name: "mixtral-8x7b-32768",
    embeddings_model_name: "text-embedding-3-small"
  }.freeze
  # ...
end
```
Output:
```
3.2.2 :007 > llm = Langchain::LLM::OpenAI.new(api_key: 'gsk_xxxx', llm_options: { uri_base: 'https://api.groq.com/openai/', model: 'mixtral-8x7b-32768'})
=>
#<Langchain::LLM::OpenAI:0x0000000109c1d518
...
3.2.2 :008 >
3.2.2 :009 > llm.chat(messages: [{role: "user", content: "What is the meaning of life?"}]).completion
=> "The meaning of life is a question that has puzzled philosophers, scientists, and thinkers for centuries. There is no one definitive answer, as the meaning of life is subjective and can vary greatly from person to person. However, here are some possible
```
So it should work after one fix: the initializer doesn't seem to be respecting the `model` param.
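For reference, here is a minimal sketch of the kind of fix that would be needed, assuming an initializer shaped roughly like langchainrb's (all names approximate, not the actual source):

```ruby
# Sketch only -- approximates Langchain::LLM::OpenAI#initialize.
def initialize(api_key:, llm_options: {}, default_options: {})
  @client = ::OpenAI::Client.new(access_token: api_key, **llm_options)
  # Hypothetical fix: a model passed via llm_options should also override the
  # chat default instead of being dropped once the client is constructed.
  @defaults = DEFAULTS.merge(default_options)
  @defaults[:chat_completion_model_name] = llm_options[:model] if llm_options[:model]
end
```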
@mattlindsey I don't think it's a good idea. It would be kind of odd to instantiate a Langchain::LLM::OpenAI class and then interface with the mixtral model via Groq.
@andreibondarev That seems to be what they tell you to do in their FAQ, and it works (with a fix, it seems). But maybe you'd prefer a Groq LLM class.
I discovered that passing the parameters as follows works currently, in case anyone needs it now:
```ruby
llm = Langchain::LLM::OpenAI.new(api_key: 'your_groq_key_here', llm_options: { uri_base: 'https://api.groq.com/openai/' })
llm.chat(messages: [{role: "user", content: "Tell me a story?"}], model: 'mixtral-8x7b-32768').completion
```
This does not work, however (it raises `undefined method 'encode'` in `openai_validator.rb:73:in 'token_length'`):

```ruby
llm.embed(text: "foo bar", model: 'mixtral-8x7b-32768').embedding
```
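That error is consistent with the token-length validator resolving a tokenizer by model name; a minimal sketch of the likely failure mode, assuming the lookup goes through the tiktoken_ruby gem (the exact code at openai_validator.rb:73 may differ):

```ruby
require "tiktoken_ruby"

# OpenAI model names resolve to an encoder; Groq-hosted models do not,
# so the lookup returns nil and the subsequent .encode call blows up.
Tiktoken.encoding_for_model("gpt-3.5-turbo").encode("foo bar") # => token ids
enc = Tiktoken.encoding_for_model("mixtral-8x7b-32768")        # => nil
enc.encode("foo bar") # => NoMethodError: undefined method `encode' for nil
```

(Separately, Groq doesn't host embedding models, so even with a tokenizer fix the embeddings call would need to go to a different provider.)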
For now (for chat completion) it seems Groq Cloud is matching OpenAI's API. So the Langchain Groq adapter could use the existing `ruby-openai` gem dependency for the subset of API features currently available at https://api.groq.com/openai/v1.
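Concretely, `ruby-openai` can already be pointed at a compatible base URL via its `uri_base` option; a minimal sketch of what such an adapter would wrap (model name and endpoint taken from this thread):

```ruby
require "openai"

client = OpenAI::Client.new(
  access_token: ENV["GROQ_API_KEY"],
  uri_base: "https://api.groq.com/openai/" # Groq's OpenAI-compatible endpoint
)

response = client.chat(
  parameters: {
    model: "mixtral-8x7b-32768",
    messages: [{role: "user", content: "Hello!"}]
  }
)
puts response.dig("choices", 0, "message", "content")
```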
The `test/fixtures/vcr_cassettes` folder of groq-ruby should give some examples of the API calls that are available.
I'd suggest creating a Langchain Groq class (at least to explicitly document which features are known to be available and documented) and trying `ruby-openai` or just low-level `faraday` calls.
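For the low-level `faraday` route, here is a minimal sketch against Groq's OpenAI-compatible chat endpoint (the path is assumed from Groq's docs; on Faraday 1.x the JSON response middleware lives in the separate faraday_middleware gem):

```ruby
require "faraday"
require "json"

conn = Faraday.new(url: "https://api.groq.com") do |f|
  f.request :json   # encode request bodies as JSON
  f.response :json  # parse response bodies as JSON
end

response = conn.post("/openai/v1/chat/completions") do |req|
  req.headers["Authorization"] = "Bearer #{ENV["GROQ_API_KEY"]}"
  req.body = {
    model: "mixtral-8x7b-32768",
    messages: [{role: "user", content: "What is the meaning of life?"}]
  }
end

puts response.body.dig("choices", 0, "message", "content")
```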
@andreibondarev The following should work pretty easily, and I can do a PR if you want:
```ruby
module Langchain::LLM
  # LLM interface for Groq's OpenAI-compatible service
  #
  # Usage:
  #     groq = Langchain::LLM::GroqOpenAI.new(
  #       api_key: ENV["GROQ_API_KEY"],
  #       llm_options: {},
  #       default_options: {}
  #     )
  class GroqOpenAI < OpenAI
  end
end
```
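A sketch of what the subclass body might contain, assuming it only needs to pin Groq's endpoint and a default model (the specific defaults here are assumptions, not part of the proposed PR):

```ruby
# Sketch only: one way the subclass could default Groq's endpoint and model,
# so callers don't have to pass them every time.
class GroqOpenAI < OpenAI
  def initialize(api_key:, llm_options: {}, default_options: {})
    super(
      api_key: api_key,
      llm_options: {uri_base: "https://api.groq.com/openai/"}.merge(llm_options),
      default_options: {chat_completion_model_name: "llama3-70b-8192"}.merge(default_options)
    )
  end
end
```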
P.S. The Assistant and Weather tool appear to work well using the OpenAI class now with `model: 'mixtral-8x7b-32768'`, since chat and (apparently) function calling are supported.
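For anyone wanting to reproduce that, here is a sketch using langchainrb's Assistant with a Groq-pointed OpenAI LLM; the Assistant constructor arguments have shifted between langchainrb versions (some versions also require a `thread:`), so treat this as approximate:

```ruby
llm = Langchain::LLM::OpenAI.new(
  api_key: ENV["GROQ_API_KEY"],
  llm_options: {uri_base: "https://api.groq.com/openai/"},
  default_options: {chat_completion_model_name: "mixtral-8x7b-32768"}
)

# The Weather tool is backed by OpenWeather and needs its own API key.
assistant = Langchain::Assistant.new(
  llm: llm,
  instructions: "You are a helpful weather assistant.",
  tools: [Langchain::Tool::Weather.new(api_key: ENV["OPEN_WEATHER_API_KEY"])]
)

assistant.add_message(content: "What's the weather in Boston, MA?")
assistant.run
```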
For anyone else who just wants a quick Groq solution and might not know where the different defaults/overrides go (e.g. me), this seems to work:
```ruby
@groq = Langchain::LLM::OpenAI.new(
  api_key: ENV["GROQ_API_KEY"],
  llm_options: {
    uri_base: "https://api.groq.com/openai/"
  },
  default_options: {
    chat_completion_model_name: "llama3-70b-8192"
  }
)

messages = [
  {role: "system", content: "I like to solve maths problems."},
  {role: "user", content: "What is 2+2?"}
]
@groq.chat(messages:).completion
# => "That's an easy one! The answer is... 4!"
```
I'll give it a try. This might already work using Langchain::LLM::OpenAI since the Groq technical FAQ says:
Something like this, although this doesn't seem to work yet:
Possibly the ruby-openai gem needs to support it somewhere around here: https://github.com/alexrudall/ruby-openai/blob/f8a4482f7012e27f791c9259bde6fb1cda191e82/lib/openai/http.rb#L87
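For context, the linked code is where ruby-openai assembles request URLs from `uri_base`. A simplified approximation (not the actual source) of why a trailing `openai/` base works:

```ruby
# Simplified approximation of ruby-openai's URL construction: the configured
# uri_base is joined with the API version and the endpoint path.
uri_base    = "https://api.groq.com/openai/"
api_version = "v1"
path        = "chat/completions"
File.join(uri_base, api_version, path)
# => "https://api.groq.com/openai/v1/chat/completions"
```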