groq / groq-python

The official Python Library for the Groq API
Apache License 2.0

Support for smith.langchain.com tracing #28

Closed · andraz closed this 1 month ago

andraz commented 4 months ago

When connecting to Groq using the langchain_openai lib like this:

from langchain_openai import ChatOpenAI
from agent.config import groq_api_key

model_name = "mixtral-8x7b-32768"
api_base_url = "https://api.groq.com/openai/v1"

model = ChatOpenAI(
    model=model_name, openai_api_base=api_base_url, openai_api_key=groq_api_key
)

with this .env file:

LANGCHAIN_TRACING_V2=true
LANGCHAIN_ENDPOINT=https://api.smith.langchain.com
LANGCHAIN_API_KEY=ls_********************************************

GROQ_API_KEY=gsk_********************************************

I am able to trace the logs of my tool calling:

[screenshot: LangSmith trace of the tool-calling run]

How can I enable this feature when using the initialization below?

import os
import json

from groq import Groq

client = Groq(api_key=os.getenv("GROQ_API_KEY"))
MODEL = "mixtral-8x7b-32768"

# ...

source: https://console.groq.com/docs/tool-use#example

jacoblee93 commented 4 months ago

Hey @andraz!

You would want to use the langchain-groq package as shown here:

https://python.langchain.com/docs/integrations/providers/groq/

Or you can use the @traceable decorator here:

https://docs.smith.langchain.com/tracing/faq/logging_and_viewing#the-traceable-decorator
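
Roughly, a minimal sketch of both approaches (model name and env vars taken from your snippets above; double-check the linked docs for the exact parameters):

# Option 1: langchain-groq — ChatGroq picks up GROQ_API_KEY from the environment,
# and LangSmith tracing is driven by the LANGCHAIN_* env vars you already have set.
from langchain_groq import ChatGroq

llm = ChatGroq(model="mixtral-8x7b-32768")
llm.invoke("What is the capital of France?")

# Option 2: keep the raw groq client and wrap the call with @traceable.
from langsmith import traceable
from groq import Groq

client = Groq()  # reads GROQ_API_KEY from the environment

@traceable
def call_groq(prompt: str) -> str:
    response = client.chat.completions.create(
        model="mixtral-8x7b-32768",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

With LANGCHAIN_TRACING_V2=true and LANGCHAIN_API_KEY set, both should show up in LangSmith.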

hozen-groq commented 4 months ago

Hi, @andraz! 😁 Does the above answer your question?

andraz commented 4 months ago

Hi @jacoblee93 and @hozen-groq, thanks for your support. Sadly I had no success making this work. Neither using langchain-groq nor adding a @traceable decorator in front of the LLM call helped; execution still produces zero traces.

jacoblee93 commented 3 months ago

And the env vars are set? It should be the exact same flow, so I'm quite surprised traces are working elsewhere in that case...
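
One quick way to sanity-check, just a sketch using the variable names from your .env above, is to print what the process actually sees before creating the client:

import os

for var in ("LANGCHAIN_TRACING_V2", "LANGCHAIN_ENDPOINT", "LANGCHAIN_API_KEY", "GROQ_API_KEY"):
    # report presence only, so no secrets end up in the logs
    print(var, "set" if os.getenv(var) else "MISSING")

Also note that a .env file isn't loaded automatically — if you rely on one, something like python-dotenv's load_dotenv() needs to run before the client is constructed.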