This adds the ability to attach tags and metadata to the OpenAI instrumentation at the call level. The original feature (at the instrumentation level) is retained.

It is meant to be used like so:
```ts
const instrumentedOpenAi = client.instrumentation.openai();

await instrumentedOpenAi.chat.completions.create(
  {
    model: 'gpt-3.5-turbo',
    messages: [
      { role: 'system', content: 'You are a helpful assistant.' },
      { role: 'user', content: 'What is the capital of Canada?' }
    ]
  },
  {
    literalaiTags: ['tag3', 'tag4'],
    literalaiMetadata: { otherKey: 'otherValue' }
  }
);
```