traceloop / openllmetry

Open-source observability for your LLM application, based on OpenTelemetry
https://www.traceloop.com/openllmetry
Apache License 2.0

fix(langchain): improve callbacks #1451

Open · tibor-reiss opened 3 months ago

tibor-reiss commented 3 months ago

Addresses issues from #1426 -> DONE. Basically this PR now just extends the tests; the main PRs were #1522 and #1452.

ToDo: decide how to handle async callbacks... -> DONE
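
For reference, a minimal sketch of the async callback surface in LangChain that this ToDo is about; the handler name and the span bookkeeping comments are my own illustration, not the actual instrumentation code:

```python
# Hedged sketch: LangChain dispatches async callbacks via AsyncCallbackHandler,
# so the instrumentation needs async counterparts of the sync hooks.
# "IllustrativeAsyncHandler" is a made-up name, not the real handler.
from typing import Any
from uuid import UUID

from langchain_core.callbacks import AsyncCallbackHandler


class IllustrativeAsyncHandler(AsyncCallbackHandler):
    async def on_chain_start(
        self, serialized: dict, inputs: dict, *, run_id: UUID, **kwargs: Any
    ) -> None:
        # A real handler would start a span keyed by run_id here; the parent
        # lookup has to survive the hop across the event loop.
        print(f"chain start: {run_id}")

    async def on_chain_end(
        self, outputs: dict, *, run_id: UUID, **kwargs: Any
    ) -> None:
        # ...and end the span for the same run_id here.
        print(f"chain end: {run_id}")
```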

tibor-reiss commented 3 months ago

Hi @nirga, I have added some more checks to the tests regarding span relationships. The async tests are broken and I still need to figure out why... Could it be that the example you mentioned in #1426 was also async? I could not reproduce it in the sync case: the openai.chat (but also bedrock and cohere) spans line up nicely.
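
For concreteness, this is roughly the kind of span-relationship check I mean; the span names are illustrative, not necessarily the exact ones the instrumentation emits:

```python
# Minimal sketch of a parent/child span assertion using OpenTelemetry's
# in-memory exporter. Span names ("RunnableSequence", "openai.chat") are
# assumptions for illustration.
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor
from opentelemetry.sdk.trace.export.in_memory_span_exporter import InMemorySpanExporter

exporter = InMemorySpanExporter()
provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(exporter))

# ... run the instrumented chain against this provider ...

spans = {span.name: span for span in exporter.get_finished_spans()}
workflow = spans["RunnableSequence"]
llm = spans["openai.chat"]

# The LLM span should be a child of the workflow span, in the same trace.
assert llm.parent is not None
assert llm.parent.span_id == workflow.context.span_id
assert llm.context.trace_id == workflow.context.trace_id
```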

I found this: https://github.com/traceloop/openllmetry/blob/main/packages/opentelemetry-instrumentation-openai/opentelemetry/instrumentation/openai/shared/chat_wrappers.py#L62

On another note: I ran an example via Traceloop, and the UI shows me two separate traces. However, when I click on StrOutputParser, I end up in the second trace; this is also confirmed by the tests. Do you have an explanation?

[screenshots: Traceloop UI showing the chain split across two separate traces]
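
Reusing the in-memory exporter from the sketch above, the symptom the tests confirm looks roughly like this (span names again assumed):

```python
# Hypothetical check mirroring what the tests showed: the StrOutputParser span
# carries a different trace_id than the LLM span, i.e. the chain is split
# across two traces.
by_name = {s.name: s for s in exporter.get_finished_spans()}
assert (
    by_name["StrOutputParser"].context.trace_id
    != by_name["openai.chat"].context.trace_id
)
```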

nirga commented 3 months ago

Right! @tibor-reiss I've run this: https://github.com/traceloop/openllmetry/blob/main/packages/sample-app/sample_app/langchain_lcel.py, which is indeed async.
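
For anyone following along, the sample drives the chain through the async path, roughly like this (prompt and model are placeholders, not the sample's exact code):

```python
# Minimal async LCEL example in the spirit of sample_app/langchain_lcel.py.
import asyncio

from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI


async def main() -> None:
    chain = (
        ChatPromptTemplate.from_template("Tell me a joke about {topic}")
        | ChatOpenAI(model="gpt-3.5-turbo")
        | StrOutputParser()
    )
    # ainvoke goes through LangChain's *async* callback path, which is where
    # the broken span relationships showed up.
    print(await chain.ainvoke({"topic": "opentelemetry"}))


asyncio.run(main())
```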

tibor-reiss commented 3 months ago

@nirga Updated this branch with your work from #1452

nirga commented 2 months ago

@tibor-reiss ok, everything is now merged! I like that you added tests. Do you want to merge this with main, or just close it?

tibor-reiss commented 2 months ago

> @tibor-reiss ok, everything is now merged! I like that you added tests. Do you want to merge this with main, or just close it?

@nirga If I saw correctly, you already took some of the changes from here / fixed them yourself in the previous PR, so it's up to you whether you want to add these extras :)