Open AltafHussain4748 opened 2 months ago
Assigning to @getsentry/support for routing ⏲️
Routing to @getsentry/product-owners-insights for triage ⏲️
@AltafHussain4748 thank you for the bug report. I just added it to our backlog to triage next week; we'll update soon.
Hey @AltafHussain4748!
Your code looks good; I think the only thing you are missing is that you need to enable tracing (set `traces_sample_rate=1.0` in your `init()` call).
You can check whether data is sent to Sentry by adding `debug=True` to your `init()` call; you should then see a message like this in your console:

```
Sending envelope [envelope with 1 items (transaction)] project:5461230 host:o447951.ingest.sentry.io
```

(The LLM data is in the transaction envelope.)
Hope this helps!
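The advice above can be sketched as a minimal `init()` call. This is a hedged sketch, not the reporter's actual setup: the DSN below is a placeholder, and the SDK option for tracing is spelled `traces_sample_rate`.

```python
import sentry_sdk

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN, replace with yours
    traces_sample_rate=1.0,  # enable tracing so transaction envelopes (which carry the LLM data) are sent
    debug=True,              # print "Sending envelope ..." messages to the console
)
```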
Hey, sorry, I didn't notice this issue before and created a new one, but it doesn't work because of this: https://github.com/getsentry/sentry-python/issues/3496
Cool @vetyy , thanks for linking!
@AltafHussain4748
This is the configuration I am using:

```python
sentry_sdk.init(
    dsn=dsn,
    release=release,
    environment=environment,
    send_default_pii=True,
    enable_tracing=True,
    integrations=[
        OpenAIIntegration(
            include_prompts=False,  # exclude prompts from being sent to Sentry, despite send_default_pii=True
            tiktoken_encoding_name="cl100k_base",
        )
    ],
)
```

Don't forget to specify `tiktoken_encoding_name`, otherwise it will calculate 0 tokens.
```python
def count_tokens(self, s):
    # type: (OpenAIIntegration, str) -> int
    if self.tiktoken_encoding is not None:
        return len(self.tiktoken_encoding.encode_ordinary(s))
    return 0
```
and also make sure you have `tiktoken` installed (`pip install tiktoken`).
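To illustrate why a missing encoding yields 0 tokens, here is a small self-contained sketch of the fallback logic quoted above. `FakeEncoding` is a hypothetical stand-in for a real tiktoken encoding, not part of the SDK:

```python
class FakeEncoding:
    """Hypothetical stand-in for a tiktoken encoding: one token per word."""
    def encode_ordinary(self, s):
        return s.split()

def count_tokens(encoding, s):
    """Mirrors the SDK's fallback: report 0 tokens when no encoding is configured."""
    if encoding is not None:
        return len(encoding.encode_ordinary(s))
    return 0

print(count_tokens(None, "hello world"))            # 0: no tiktoken_encoding_name set
print(count_tokens(FakeEncoding(), "hello world"))  # 2: encoding configured
```

So an `init()` without `tiktoken_encoding_name` silently falls into the `return 0` branch, which matches the 0-token behavior described above.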
Thanks for your reply; even with this parameter I was not able to make it work.
Problem Statement
I just experimented with LLM monitoring and could not make it work with AsyncOpenAI.
Below is my code
My sentry init is
Function I am using:
Versions
Solution Brainstorm
No response
Product Area
Insights