langchain-ai / langsmith-sdk

LangSmith Client SDK Implementations
https://smith.langchain.com/
MIT License

[WIP] add `wrap_anthropic` #789

Closed – enginoid closed this 1 day ago

enginoid commented 2 weeks ago

This is incomplete at the moment, but I'm posting it for feedback as a work in progress. Currently implemented is rudimentary support for sync/async and streaming/non-streaming calls, backed by tests.
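
For context, intended usage looks something like the sketch below. The `wrap_anthropic` entry point is what this PR adds; I'm assuming here that it lives next to the existing `wrap_openai` helper in `langsmith.wrappers`:

```python
from anthropic import Anthropic
from langsmith import wrappers

# Assumed entry point, mirroring langsmith.wrappers.wrap_openai.
client = wrappers.wrap_anthropic(Anthropic())

# A sync, non-streaming call; the wrapper traces it as an LLM run.
message = client.messages.create(
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello"}],
    model="claude-3-opus-20240229",
)
print(message.content[0].text)
```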

TODOs before leaving draft stage

Currently, the added tests are passing. Things left to do before undrafting:

Thoughts/caveats

hinthornw commented 2 weeks ago

Thanks for the PR and for getting the integration started!

I think the challenging bit that has prevented me from shipping an anthropic wrapper is their combo context-manager streaming setup, which makes patching the client less pretty:


```python
from anthropic import Anthropic

client = Anthropic()

with client.messages.stream(
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello"}],
    model="claude-3-opus-20240229",
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)
```

As much as possible, we'd like not to alter the behavior of this kind of statement.
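
One possible shape, for discussion: a delegating proxy around the stream manager that preserves the `with` semantics while recording what the caller consumes. This is a rough sketch with illustrative names (`TracedStreamManager`, `on_end`), not a final design, and it assumes only the standard `__enter__`/`__exit__` protocol of the stream manager:

```python
import functools


class TracedStreamManager:
    """Delegating proxy over anthropic's message stream context manager."""

    def __init__(self, manager, on_end):
        self._manager = manager  # the original context manager
        self._on_end = on_end    # illustrative callback to record the run
        self._chunks = []
        self._stream = None

    def __enter__(self):
        self._stream = self._manager.__enter__()
        return self

    @property
    def text_stream(self):
        # Tee the caller's iteration so tracing sees every chunk.
        for text in self._stream.text_stream:
            self._chunks.append(text)
            yield text

    def __getattr__(self, name):
        # Everything else behaves exactly like the underlying stream.
        return getattr(self._stream, name)

    def __exit__(self, exc_type, exc, tb):
        self._on_end("".join(self._chunks))
        return self._manager.__exit__(exc_type, exc, tb)


def traced_stream(original_stream, on_end):
    """Wraps client.messages.stream so existing `with` blocks keep working."""

    @functools.wraps(original_stream)
    def wrapped(*args, **kwargs):
        return TracedStreamManager(original_stream(*args, **kwargs), on_end)

    return wrapped
```
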
enginoid commented 2 weeks ago

Gotcha – I've avoided that one for now because it looked more involved and I'm not sure I'll be able to commit the time. In your view, is it an option to merge this without support for the stream API?

Separately, I'd love some guidance from you on how to structure the outputs, since I'm not sure what the target schema is for them to look and work right in LangSmith, e.g. showing the right output from a tool call under "Outputs". I've shared a Loom here and would love it if you could take a look – then I can change the reducer to get the outputs consistent.
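
For reference, by "reducer" I mean something that folds the streamed events into a single output object. A rough sketch of the current shape, assuming anthropic's raw event stream (`content_block_delta` events carrying `text_delta` payloads); the open question is what schema this should produce so tool calls render under "Outputs":

```python
def reduce_chunks(events: list) -> dict:
    """Fold streamed anthropic events into one message-shaped output.

    Handles plain text deltas only; how tool-use blocks should be shaped
    for LangSmith's "Outputs" pane is the open question above.
    """
    text = "".join(
        event.delta.text
        for event in events
        if getattr(event, "type", "") == "content_block_delta"
        and getattr(event.delta, "type", "") == "text_delta"
    )
    return {"role": "assistant", "content": text}
```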

enginoid commented 1 day ago

We're using this internally and it's working fine, except that we have to use "Raw output" rather than "Output" to see the outputs. I'd love to help get this merged upstream, but I have to take it off my plate for now. If someone can help with the questions I posted in the last comment, feel free to re-engage me and I'll see if I can find the capacity to drive it home.