langchain-ai / langsmith-sdk

LangSmith Client SDK Implementations
https://smith.langchain.com/
MIT License

Add anthropic wrapper for auto tracing #664

Open NathanHam16 opened 2 months ago

NathanHam16 commented 2 months ago

Feature request

Hi team, it would be great to add an Anthropic wrapper for auto-tracing. This issue might have to move to the /langsmith-wrappers repo, but there seems to be little activity there.

Motivation

Anthropic has some of the best LLMs at the moment, and many developers use a mixture of OpenAI and Anthropic in their applications.

hinthornw commented 2 months ago

Ya, it'll go here - langsmith-wrappers was my experiment bed from last year. Will take this as a TODO - are you using Python or JS?

NathanHam16 commented 2 months ago

Using Python :)

hackgoofer commented 2 months ago

Need this also with python

hinthornw commented 2 months ago

Sweet, will try to get it out when I can steal a moment.

hackgoofer commented 2 months ago

Are we talking about a day or a week timeline? Trying to decide if I should move to another platform or write my own.

hinthornw commented 2 months ago

If you need something today, this already works:

from anthropic import Anthropic
from langsmith import traceable

anthropic_client = Anthropic()

# Combine the streamed chunks into a single output for the LangSmith trace.
def reduce(texts: list):
    return {"output": "".join(texts)}

# run_type="llm" renders the run as an LLM call; reduce_fn aggregates the yielded chunks.
@traceable(run_type="llm", reduce_fn=reduce)
def call_anthropic(system: str, messages: list, model: str, max_tokens: int = 4000):
    # Forward the system prompt to the API alongside the messages.
    with anthropic_client.messages.stream(
        system=system, messages=messages, model=model, max_tokens=max_tokens
    ) as stream:
        for text in stream.text_stream:
            yield text

# Example usage:

for chunk in call_anthropic(
    system="You are a helpful bot",
    messages=[
        {
            "role": "user",
            "content": "Say hello and solve the Riemann hypothesis for me.",
        }
    ],
    model="claude-3-haiku-20240307",
):
    print(chunk, end="")
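
A non-streaming variant of the same @traceable pattern also works if you don't need token-by-token output. This is only a sketch; the call_anthropic_sync name and the way the reply text is joined are my own choices, not something confirmed in this thread:

from anthropic import Anthropic
from langsmith import traceable

anthropic_client = Anthropic()

# Non-streaming call: traceable records the inputs and the returned string as the run output.
@traceable(run_type="llm")
def call_anthropic_sync(system: str, messages: list, model: str, max_tokens: int = 4000):
    message = anthropic_client.messages.create(
        system=system, messages=messages, model=model, max_tokens=max_tokens
    )
    # Join the text blocks of the response into a single string.
    return "".join(block.text for block in message.content)
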
Mann1ng commented 1 month ago

+1 - Yes please! This would be great for TypeScript also - we're about to soft-launch our product, which uses Claude 3 Sonnet on the backend. :)

enginoid commented 3 weeks ago

@hinthornw Not sure if you have something in the works, but I've taken a stab at a PR for this here: https://github.com/langchain-ai/langsmith-sdk/pull/789

Would love early feedback, @hinthornw - happy to chat if there's anything gnarly in there (e.g., if you were thinking of taking a different approach).
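
Not speaking for that PR's contents, but if the wrapper ends up mirroring the existing wrap_openai helper in langsmith.wrappers, usage might look roughly like the sketch below. The wrap_anthropic name and import path are assumptions, not a confirmed API:

from anthropic import Anthropic
from langsmith.wrappers import wrap_anthropic  # assumed name/location; may differ from the final PR

# Wrapping the client once would trace every messages.create / messages.stream call
# as an LLM run in LangSmith, without per-call decorators.
client = wrap_anthropic(Anthropic())

message = client.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Say hello"}],
)
print(message.content[0].text)

The appeal over the decorator approach is that you instrument the client in one place and every call site gets traced without further changes.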