Closed · Dam-Buty closed this 2 months ago
It should now be possible, with the context, to instrument OpenAI calls the same way we do in the [Python SDK](https://docs.getliteral.ai/python-client/api-reference/client#instrument-openai). This means we instrument the openai methods once, out of context; then, when we intercept a call, we determine where to put the generation:

* if we have a step in the context, we attach the generation to that step
* otherwise, we push the generation without a step
The idea is to simplify the use of the openai instrumentation, making it closer to the Python version: you instrument the lib once, and each call is automatically captured and logged, without having to instrument each result.
This is made possible by the AsyncLocalStorage-based context we added in the last version.
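As a rough sketch of the mechanism (the `Step`, `logGeneration`, and `withStep` names below are illustrative, not the SDK's actual API), the interceptor only has to peek into the AsyncLocalStorage store to decide where a generation goes:

```typescript
import { AsyncLocalStorage } from "node:async_hooks";

// Illustrative type -- not the actual SDK definition.
type Step = { name: string; generations: string[] };

const storage = new AsyncLocalStorage<Step>();

// Called by the instrumented openai method when a call is intercepted.
function logGeneration(generation: string): string {
  const step = storage.getStore();
  if (step) {
    // A step is active in the async context: attach the generation to it.
    step.generations.push(generation);
    return `attached to step "${step.name}"`;
  }
  // No ambient step: push the generation on its own.
  return "logged without a step";
}

// Run a callback with a step bound to the async context.
function withStep<T>(name: string, cb: () => T): T {
  return storage.run({ name, generations: [] }, cb);
}

console.log(withStep("my-step", () => logGeneration("hello"))); // inside a step
console.log(logGeneration("hello")); // no step in context
```

Because the store follows the async execution flow, the instrumented method sees the right step even across `await` boundaries, without the caller passing anything explicitly.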
Old syntax
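The code block appears to be missing here. Based on the description above ("having to instrument each result"), the old flow presumably looked something like this sketch, where every result is handed to the instrumentation by hand (exact method names are an assumption, not the SDK's confirmed API):

```typescript
// Hypothetical sketch of the old per-call flow -- names are assumptions.
const completion = await openai.chat.completions.create({
  model: "gpt-4",
  messages: [{ role: "user", content: "Hello" }],
});
// Each result had to be instrumented explicitly.
await client.instrumentation.openai(completion);
```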
New syntax
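This code block also seems to have been lost. Under the approach described above, instrumentation would become a one-time call at startup, roughly as follows (assuming an `instrumentation.openai()` entry point, as in the Python SDK's `instrument_openai`):

```typescript
// Hypothetical sketch of the new flow -- instrument once, then every call is captured.
const client = new LiteralClient();
client.instrumentation.openai();

// Later, anywhere in the app: the call is intercepted automatically and
// attached to the current step if one is present in the async context.
const completion = await openai.chat.completions.create({
  model: "gpt-4",
  messages: [{ role: "user", content: "Hello" }],
});
```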