traceloop / openllmetry

Open-source observability for your LLM application, based on OpenTelemetry
https://www.traceloop.com/openllmetry
Apache License 2.0

🚀 Feature: Select field to send to APM or possibility to obfuscate #2171

Open gauthiersiri opened 3 days ago

gauthiersiri commented 3 days ago

Which component is this feature for?

Anthropic Instrumentation

🔖 Feature description

Sometimes, fields reported through OpenLLMetry, such as the prompt or the result (completion), may contain confidential values that should not be reported, or should at least be obfuscated (e.g. replaced by ****).

🎤 Why is this feature needed?

Some customers may want to observe their genAI apps but don't want confidential data, which may appear in the prompt or the result, to reach the APM level.

✌️ How do you aim to achieve this?

Give the possibility to omit or obfuscate a field, maybe through env vars? A rough sketch of a possible workaround is shown below.
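For illustration, here is a minimal sketch of what we have in mind, assuming the `gen_ai.prompt.*.content` / `gen_ai.completion.*.content` attribute names and an SDK setup that lets you pass a custom exporter (none of this is an existing OpenLLMetry feature): a wrapper exporter that masks matching attributes before delegating to the real exporter.

```python
import re
from typing import Sequence

from opentelemetry.sdk.trace import ReadableSpan
from opentelemetry.sdk.trace.export import SpanExporter, SpanExportResult

# Assumed attribute naming; adjust to whatever your spans actually carry.
SENSITIVE_KEY = re.compile(r"^gen_ai\.(prompt|completion)\.\d+\.content$")


class ObfuscatingExporter(SpanExporter):
    """Replaces sensitive attribute values with **** before delegating export."""

    def __init__(self, inner: SpanExporter):
        self._inner = inner

    def export(self, spans: Sequence[ReadableSpan]) -> SpanExportResult:
        for span in spans:
            # ReadableSpan exposes attributes read-only, so this pokes at the
            # underlying mapping; fine for a sketch, fragile across SDK versions.
            attrs = span._attributes or {}
            for key in list(attrs):
                if SENSITIVE_KEY.match(key):
                    attrs[key] = "****"
        return self._inner.export(spans)

    def shutdown(self) -> None:
        self._inner.shutdown()

    def force_flush(self, timeout_millis: int = 30000) -> bool:
        return self._inner.force_flush(timeout_millis)
```

Something first-class in the SDK (e.g. an env var listing fields to drop or mask) would obviously be nicer than wrapping the exporter like this.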

🔄️ Additional Information

No response

👀 Have you spent some time to check if this feature request has been raised before?

Are you willing to submit PR?

Yes I am willing to submit a PR!

nirga commented 3 days ago

Thanks @gauthiersiri! We actually have an option to disable content tracing. Did you have something else in mind?

gauthiersiri commented 3 days ago

Hi Nirga, yes, we actually just saw that a few minutes ago :D But it seems to hide only the prompt, not the answer, or maybe we missed something?

nirga commented 3 days ago

Both
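For reference, a minimal sketch of that toggle, assuming it is the TRACELOOP_TRACE_CONTENT environment variable read at SDK init (the app name below is just a placeholder):

```python
import os

# Usually set in the deployment environment rather than in code.
os.environ["TRACELOOP_TRACE_CONTENT"] = "false"

from traceloop.sdk import Traceloop

# With content tracing disabled, neither prompt nor completion text should be
# recorded on the spans sent to your APM.
Traceloop.init(app_name="my-app")
```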