Xhale1 opened 6 months ago
Hey @Xhale1 thanks for reporting!
Just to rule something out: which version of `openai` are you using? It seems there were related issues in an older version. Thanks!
Same issue here, using openai `4.47.3`:
`"openai": "^4.47.3",`
Thanks for checking in! Just confirmed that my issue exists with version 4.49.1, which appears to be the latest offered by `openai`.
Let me know if a reproduction repo would help
A reproduction would be greatly appreciated, thanks :)
This is my first time making an issue reproduction repo, let me know if it works for you: https://github.com/Xhale1/sentry-openai
Hmm yeah, I can reproduce this myself, thanks for the excellent repro app!
I already tried switching the import style to namespace or named import but it doesn't matter. Looks like for some reason `API.Completions` within the OpenAI package is `undefined` rather than a class. My bet is that `import-in-the-middle` plays a role here. Also I wonder if IITM doesn't handle namespace imports within `node_modules` correctly. Not sure though.
Update: Narrowed it down to the `import * as API from "./resources/index.mjs"` statement in `/openai@4.49.1/node_modules/openai/index.mjs`. For some reason, this doesn't include `Completions`.
@timfish when you have a minute, would you mind taking a look?
It looks like `openai/resources/index.mjs` exports two different `Completions` classes, one for `/completions` and one for `/chat/completions`.

For `import-in-the-middle` we deduced that, for ESM, duplicate named exports were not exported at all, but that doesn't appear to be the case here.
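As background, the ESM rule we'd deduced can be shown in a small self-contained sketch (hypothetical file names, not the actual openai layout): when two modules star-exported from the same aggregator both provide a name, the ES module spec treats that name as ambiguous and omits it from `import * as ns` namespaces entirely.

```javascript
// Sketch: an aggregator star-exports two modules that both define
// `Completions`. Per the ES module spec, the ambiguous name is simply
// omitted from the resulting namespace object.
import { mkdtempSync, writeFileSync } from "node:fs";
import { tmpdir } from "node:os";
import { join } from "node:path";
import { pathToFileURL } from "node:url";

const dir = mkdtempSync(join(tmpdir(), "dup-exports-"));
writeFileSync(join(dir, "completions.mjs"),
  "export class Completions {}");
writeFileSync(join(dir, "chat.mjs"),
  "export class Completions {}\nexport class Chat {}");
// The aggregator provides `Completions` twice via `export *`.
writeFileSync(join(dir, "index.mjs"),
  "export * from './completions.mjs';\nexport * from './chat.mjs';");

const ns = await import(pathToFileURL(join(dir, "index.mjs")).href);
console.log("Completions" in ns); // false — ambiguous, so excluded
console.log("Chat" in ns);        // true — unique, so exported
```

Since openai re-exports the duplicate names explicitly rather than relying on `export *`, its `Completions` *does* survive into the namespace, which is why the star-export rule alone didn't explain the breakage here.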
I made a minimal reproduction here:
It shows that all you have to do to make your app crash is call `new OpenAI({ apiKey: OPENAI_API_KEY })` when both Sentry and OpenAI are installed and Sentry is preloaded with `--import @sentry/node/preload`.
I've opened a PR for `import-in-the-middle` to fix this: https://github.com/DataDog/import-in-the-middle/pull/103
This should hopefully have been fixed by the numerous PRs recently merged into `import-in-the-middle`.
While we wait for this to be released, there is a patch available that combines all the fixes. If anyone can confirm this patch fixes this issue that would be super helpful!
@timfish from my local testing, the patch seems to address the issue!
Confirming that that patch also appears to be working for us, thank you!
Confirmed as fixed in Sentry v8.13.0
There have been some remaining issues reported with `openai` due to the way its "shims" work internally. For example, someone has reported issues with this code:
```js
import OpenAI from "openai";

const openai = new OpenAI({
  apiKey: "fake-api-key",
});

async function doWork() {
  const response = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [
      {
        role: "user",
        content: "Hello, how are you?",
      },
    ],
  });
  console.log(response);
}

doWork().catch((err) => {
  console.error(err);
});
```
Just tried upgrading again to Sentry 8 (version 8.17.0) and started receiving this error with completion API calls.
The next SDK release will have a way to work around this issue.
With v8.18.0 of the SDK just released, there's a new config option which can disable `import-in-the-middle` for specific libraries:
```js
import * as Sentry from '@sentry/node';

Sentry.init({
  dsn: '__DSN__',
  registerEsmLoaderHooks: { exclude: ['openai'] },
});
```
Hi @timfish, the new method for excluding `openai` does not appear to work for us. Instead of throwing an error, `openai` calls seem to freeze any further execution (I might be using the wrong terminology here).
Example:
```js
import OpenAI from "openai";

export const openAI = new OpenAI({
  apiKey: "...",
});

const response = await openAI.chat.completions.create({
  model: "gpt-3.5-turbo",
  messages: [
    {
      role: "system",
      content: "You are a helpful assistant.",
    },
    {
      role: "user",
      content: "What's the weather like?",
    },
  ],
});

console.log("Hello there");
```
In this example, with ESM, `registerEsmLoaderHooks: { exclude: ['openai'] }`, and Sentry v8.18.0, "Hello there" is never printed and no errors are output to the console.
Over the weekend I submitted another PR for `import-in-the-middle` which expands `exclude` so that we can use regular expressions:
In the long run I think we've got a better solution which should completely remove the need for manually excluding problematic libraries like this. This PR allows us to automatically add only the libraries that Sentry is instrumenting to the `include` list:
If you update your dependencies to ensure that you're using `import-in-the-middle@1.10.0`, you can now pass regular expressions for `exclude`. The Sentry types will be updated in the next release to reflect this.

This allows all paths with `openai` in them to be excluded:

```js
registerEsmLoaderHooks: { exclude: [/openai/] },
```
any updates here? @timfish
Still getting this error with the newest SDK version when using late initialization (`--import @sentry/node/preload`).
hey @Ivan-juni we are still waiting on https://github.com/getsentry/sentry-javascript/pull/13139 to be fixed for this.
Thanks @andreiborza, that would be super helpful. Can't use Sentry just because of this error.
The workaround for this issue for now is to exclude `openai` from being instrumented via the `registerEsmLoaderHooks` init option:

```js
import * as Sentry from '@sentry/node';

Sentry.init({
  dsn: '__DSN__',
  registerEsmLoaderHooks: { exclude: [/openai/] },
});
```
If you're using `@sentry/node/preload`, you'll need to create your own preload file to pass this option:

`preload.mjs`:

```js
import * as Sentry from '@sentry/node';

Sentry.preloadOpenTelemetry({
  registerEsmLoaderHooks: { exclude: [/openai/] },
});
```

```sh
node --import ./preload.mjs ./my-app.mjs
```
I landed here via this hint:
I'm not sure if Sentry or `openai` changed, but this is the latest error message:

```
TypeError: getDefaultAgent is not a function
    at OpenAI.buildRequest (file:///path/to/node_modules/openai/core.mjs:208:66)
    at OpenAI.makeRequest (file:///path/to/node_modules/openai/core.mjs:279:44)
```

I can confirm that `registerEsmLoaderHooks: { exclude: [/openai/] }` works as a temporary fix. I'll share yet again how frustrating Sentry v8 has been. Dropping an error-reporting library into my application shouldn't cause seemingly random errors like this!
@timfish What is the status here again? IIRC we couldn't come up with a solution for the export pattern that openai uses in their package right? Or did I miss something.
Other than that, when https://github.com/getsentry/sentry-javascript/pull/13139 is merged, people should be able to set `registerEsmLoaderHooks: { onlyIncludeInstrumentedModules: true }` to prevent random other packages from being patched.
> I'll share yet again how frustrating Sentry v8 has been. Dropping an error-reporting library into my application shouldn't cause seemingly-random errors like this!
@nwalters512 Fully acknowledged! I hope you believe me when I say we are doing everything in our power to improve this situation. We took partial ownership of IITM in the Node.js org and we are regularly contributing upstream to OTEL. This is something not only improving the Sentry user experience but improves the entire ecosystem. These are not easy problems to solve (no excuse but to give some insight)! I hope you understand.
> IIRC we couldn't come up with a solution for the export pattern that openai uses in their package right? Or did I miss something.
I seem to remember it being caused by this issue, which is a fundamental problem with `import-in-the-middle`.
v8.29.0 of the Node SDK adds the new `onlyIncludeInstrumentedModules` option to `registerEsmLoaderHooks`.
```js
import * as Sentry from '@sentry/node';

Sentry.init({
  dsn: '__PUBLIC_DSN__',
  registerEsmLoaderHooks: { onlyIncludeInstrumentedModules: true },
});
```
When set to `true`, `import-in-the-middle` will only wrap ESM modules that are specifically instrumented by OpenTelemetry plugins. This is useful to avoid issues where `import-in-the-middle` is not compatible with some of your dependencies.

This feature will only work if you `Sentry.init()` the SDK before the instrumented modules are loaded. This can be achieved via the Node `--register` and `--import` CLI flags or by loading your app code via async `import()` after calling `Sentry.init()`.
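As a sketch of the second approach (assuming a hypothetical `./app.mjs` entry module that imports `openai`):

```js
// instrument.mjs — call Sentry.init() first, then load the app via a
// dynamic import so instrumented modules are patched before evaluation.
import * as Sentry from '@sentry/node';

Sentry.init({
  dsn: '__PUBLIC_DSN__',
  registerEsmLoaderHooks: { onlyIncludeInstrumentedModules: true },
});

// `./app.mjs` (and its `openai` import) loads only after init:
await import('./app.mjs');
```

Run with `node instrument.mjs`, or keep using the `--import`/`--register` flags with a similar file.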
Please let me know if this helps solve the `import-in-the-middle` incompatibility issues you are experiencing!
@timfish, it seems that the option added in v8.29.0 is `onlyIncludeInstrumentedModules`: https://github.com/getsentry/sentry-javascript/blob/bcf571d9954094be76a99edbb12c23eff7f7b5dc/packages/node/src/types.ts#L20
For others: there's also a relevant issue for the OpenAI package with some additional discussion: https://github.com/openai/openai-node/issues/903. And a PR for `import-in-the-middle` to try and add tests for this case: https://github.com/nodejs/import-in-the-middle/pull/113
Oh wow, thanks @torickjdavis, that means the release notes are wrong too. I'll correct these!
Haha, I was a bit confused by the release notes too, thanks for the quick fix! (and thanks for your work here as always!)
Off topic, so feel free to disregard: how confident do you and the team feel in Sentry + ESM stability now that this new option is released? It seems #12806 might be pretty much squared away?
🚀 Awesome! I really should look into Sentry's documentation process, but haven't yet; relevant documentation for `onlyIncludeInstrumentedModules` should probably be added as another option for skipping instrumentation across the different guides.
> how confident do you and the team feel in the Sentry + ESM stability now that this new option is released?
This new option negates all of the fundamental (and not-fixable) issues we've found with `import-in-the-middle`, which have been the major pain points. If you use this option and ensure you init the SDK (and therefore the OTel instrumentations) before your app code is loaded, `import-in-the-middle` should no longer cause problems.

There is a proposal for new loader hooks, so this will likely continue to be a moving target.
> relevant documentation for `onlyIncludeInstrumentedModules` should probably be added as another option
We plan to add this to the documentation; we're just trying it out in apps ourselves and waiting for confirmation from more users that it is in fact solving the issues it was supposed to.
Sentry wants it to be as simple and painless as possible to start using the SDKs, so we are still left with things we want to improve here. This fix is currently opt-in, which is far from ideal. It's not clear from the original error above that this new option will help here!
Hello! I'm sorry for not closing the loop here earlier.
Using the `onlyIncludeInstrumentedModules` option is working great and has solved the original issue! I'm also happy to report my team has fully migrated to v8 thanks to this option 🎉
I'll leave this issue open on the off chance @timfish wants to use it to track `openai` compatibility without this option, but feel free to close whenever you see fit :)
Thank you everyone for your dedication to this issue and to the larger open source community :)
How can I use `onlyIncludeInstrumentedModules` with the late initialization (preload)?
@Ivan-juni you currently can't but we could provide that option via an env variable. I added this to our backlog: https://github.com/getsentry/sentry-javascript/issues/14330
Is there an existing issue for this?
How do you use Sentry?
Sentry Saas (sentry.io)
Which SDK are you using?
@sentry/node
SDK Version
8.8.0
Framework Version
Node 20.14.0
Link to Sentry event
https://trainwell.sentry.io/issues/5463070600/?project=6067364&query=is%3Aunresolved+issue.priority%3A%5Bhigh%2C+medium%5D&referrer=issue-stream&statsPeriod=14d&stream_index=1
SDK Setup
Steps to Reproduce
Install the `openai` package. The addition of the following code is what triggers the issue:
Expected Result
App builds with sentry instrumentation and no errors.
Actual Result
I receive the following error:
This might be related to https://github.com/getsentry/sentry-javascript/issues/12237, however it appears to be a unique issue unrelated to initialization.