tmc / langchaingo

LangChain for Go, the easiest way to write LLM-based programs in Go
https://tmc.github.io/langchaingo/
MIT License

Unable to Use LLaMA Model with Groq API in langchaingo #890

Open Ashad001 opened 2 weeks ago

Ashad001 commented 2 weeks ago

Description

I am trying to interact with the LLaMA model llama3-8b-8192 using Groq. However, I'm encountering issues with the integration, and it seems like the library might be prompting for an OpenAI API key instead of using the Groq API key.

Steps to Reproduce

  1. Initialize the Groq client with the LLaMA model and Groq API URL as specified in this example.
  2. Attempt to generate a response from the model.

Expected Behavior

The LLaMA model should generate a response based on the prompt provided using the Groq API.

Actual Behavior

The library seems to be prompting for an OpenAI API key instead of using the Groq API key (screenshot attached).

Environment

langchaingo version: v0.1.10
Go version: go1.22.3 windows/amd64

Additional Context

I have confirmed that the API key is correct and the Groq API is accessible. The issue seems to stem from the way langchaingo handles the API keys or model initialization.

Any guidance or support on resolving this issue would be greatly appreciated. Thank you!!!

tmc commented 2 weeks ago

Can you supply a code sample triggering this?

Ashad001 commented 2 weeks ago

Apologies for the confusion. I realized I had to pass the openai.WithToken option to openai.New (this example) even though I'm using Groq, not OpenAI. This was a bit confusing; perhaps it could be made clearer in the documentation? Thanks!

tmc commented 2 weeks ago

Definitely need to make this clearer; I imagine there should be a dedicated docs page for this.

Open to making this type of contribution?

Ashad001 commented 2 weeks ago

I'm new to Go, but I'm sure it'll be a worthwhile experience! I'll start working on it soon.