michael-hhai opened this issue 2 months ago
Thanks @michael-hhai! Do you mean that the auto-instrumentation doesn't work?
Yes. Correct.
Yes, well known. That's why we have this: https://www.traceloop.com/docs/openllmetry/tracing/js-force-instrumentations
It's indeed more of an inherent problem with Node.js, but hopefully will be fixed.
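For what it's worth, that workaround boils down to passing the imported module to the SDK yourself. A minimal sketch (using the `instrumentModules` option, the same one that appears in the reproduction later in this thread; see the linked docs for the full set of options):

```typescript
import OpenAI from "openai";
import * as traceloop from "@traceloop/node-server-sdk";

// Hand the imported module to the SDK explicitly so it can patch it,
// rather than relying on auto-instrumentation hooking module loading.
traceloop.initialize({
  instrumentModules: {
    openAI: OpenAI,
  },
});
```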
For what it's worth, I do not have a minimal reproduction immediately handy, but I don't think the above-linked workaround works for ES modules either.
@michael-hhai I'm 90% sure it works (wanted to say 99% but maybe I need to be more modest)
I have the following reproduction of the issue with ES modules that I've tried to make as minimal as I can:

1. Create a `tracer-module` package as follows:

a. Create `index.ts` as:

```typescript
import { OpenAI } from 'openai';
import * as traceloop from "@traceloop/node-server-sdk";

class Tracer {
  public init(): void {
    traceloop.initialize({
      baseUrl: "http://example-url-does-not-exist.com/opentelemetry",
      apiKey: "FAKE-API-KEY",
      disableBatch: true,
      instrumentModules: {
        openAI: OpenAI,
      },
    });
  }

  public trace(fn: () => void): void {
    traceloop.withAssociationProperties(
      {
        thing: "thing",
      },
      fn,
    );
  }
}

export const tracer = new Tracer();
```
b. Create `package.json` as:
```json
{
"name": "tracer-module",
"version": "1.0.0",
"main": "dist/index.js",
"type": "module",
"exports": {
".": "./dist/index.js"
},
"scripts": {
"build": "tsc"
},
"dependencies": {
"@anthropic-ai/sdk": "^0.26.1",
"@aws-sdk/client-bedrock-runtime": "^3.632.0",
"@azure/openai": "^2.0.0-beta.1",
"@google-cloud/aiplatform": "^3.26.0",
"@google-cloud/vertexai": "^1.4.1",
"@pinecone-database/pinecone": "^3.0.0",
"@qdrant/js-client-rest": "^1.11.0",
"@traceloop/node-server-sdk": "^0.10.0",
"chromadb": "^1.8.1",
"cohere-ai": "^7.12.0",
"langchain": "^0.2.16",
"llamaindex": "^0.5.17",
"openai": "^4.56.0"
}
}
```
c. Create `tsconfig.json` as:

```json
{
"compilerOptions": {
"target": "ES6",
"sourceMap": true,
"module": "ESNext",
"strict": true,
"esModuleInterop": true,
"moduleResolution": "node",
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true,
"outDir": "./dist",
"typeRoots": ["./node_modules/@types", "./types"]
},
"include": ["./**/*.ts", "./custom.d.ts"],
"exclude": ["node_modules"]
}
```
d. Run `npm install && npm run build`.

e. Run `npm link`.

2. Create a second package (named `js-test` or something like that) as follows:
a. Create `index.ts` like:

```typescript
import { BatchInterceptor } from '@mswjs/interceptors'
import { ClientRequestInterceptor } from '@mswjs/interceptors/ClientRequest'
import { XMLHttpRequestInterceptor } from '@mswjs/interceptors/XMLHttpRequest'

const interceptor = new BatchInterceptor({
  name: 'my-interceptor',
  interceptors: [
    new ClientRequestInterceptor(),
    new XMLHttpRequestInterceptor(),
  ],
})

interceptor.apply()

interceptor.on('request', ({ request, requestId, controller }) => {
  console.log(request.method, request.url)
})

import { tracer } from 'tracer-module';
tracer.init();

import OpenAI from 'openai';

const helloWorld = (): string => {
  return "Hello, World!";
};

const main = async () => {
  const resolvedTracer = await tracer; // Await the tracer if it is a promise

  await resolvedTracer.trace(async () => {
    // Example call to OpenAI
    const openai = new OpenAI({
      apiKey: process.env.OPENAI_API_KEY,
    });

    try {
      const response = await openai.chat.completions.create({
        model: "gpt-4o",
        messages: [{ role: 'user', content: "Say Hello, World!" }],
        max_tokens: 5,
      });
      console.log(response.choices[0]?.message?.content);
    } catch (error) {
      console.error("Error calling OpenAI API:", error);
    }

    // Original helloWorld function call
    console.log(helloWorld());
  });
};

main().catch((error) => console.error(error));
```
b. Create `tsconfig.json` like:
```json
"compilerOptions": {
"target": "es2020",
"sourceMap": true,
"module": "commonjs",
"strict": true,
"esModuleInterop": true,
"moduleResolution": "node",
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true,
"outDir": "./dist",
"typeRoots": ["./node_modules/@types", "./types"]
},
"include": ["./**/*.ts", "./custom.d.ts"],
"exclude": ["node_modules"]
}
```
c. Create `package.json` like:

```json
{
"type": "module",
"devDependencies": {
"tsx": "^4.16.5"
},
"dependencies": {
"@mswjs/interceptors": "^0.34.0",
"@opentelemetry/instrumentation": "^0.52.1",
"@traceloop/node-server-sdk": "^0.10.0",
"honeyhive": "^0.6.4",
"node-request-interceptor": "^0.6.3",
"openai": "^4.54.0",
"tracer-module": "file:../tracer-module"
}
}
```
d. Run `npx tsx index.ts`. You should see output like:

```
Traceloop exporting traces to http://example-url-does-not-exist.com/opentelemetry
POST https://api.openai.com/v1/chat/completions
Hello, World!
Hello, World!
```
Note that there is no call to `http://example-url-does-not-exist.com/opentelemetry/v1/traces`.
"type": "module"
from the package.json
so that it now looks like:
{
"devDependencies": {
"tsx": "^4.16.5"
},
"dependencies": {
"@mswjs/interceptors": "^0.34.0",
"@opentelemetry/instrumentation": "^0.52.1",
"@traceloop/node-server-sdk": "^0.10.0",
"honeyhive": "^0.6.4",
"node-request-interceptor": "^0.6.3",
"openai": "^4.54.0",
"tracer-module": "file:../tracer-module"
}
}
```
4. Run `npx tsx index.ts` again. You should see output like:

```
Traceloop exporting traces to http://example-url-does-not-exist.com/opentelemetry
POST https://api.openai.com/v1/chat/completions
Hello, World!
Hello, World!
POST http://example-url-does-not-exist.com/opentelemetry/v1/traces
```
Note that this does make a call to `POST http://example-url-does-not-exist.com/opentelemetry/v1/traces`.
An interesting thing here is that this behavior depends on `tracer-module` being an external dependency and not just another file within the same project. If the tracer module is just another file within the same project, then it traces correctly regardless of whether the project has `"type": "module"` or not.
Thanks! I think it's related to https://github.com/openai/openai-node/issues/903
A possible workaround can be:

```javascript
import { register } from "node:module";

register("import-in-the-middle/hook.mjs", import.meta.url, {
  parentURL: import.meta.url,
  data: { include: ["openai"] },
});
```

And then when running Node, you'd load that file with `node --import ./loader.js`.
I've gone down that rabbit hole and I can't really get anything like that to work. Do you know what exactly needs to happen in order for `openllmetry-js` (really `opentelemetry-js`) to be able to successfully trace the calls? Do you know the difference in code execution between ES modules and non-ES modules? I don't actually know the answers to those questions at the moment, so I feel as if I'm just randomly mashing buttons.
It is indeed a PIA to get this to work, but it's definitely possible (I've done it). A couple of notes:

- You cannot bundle the `openai` package; it must be external.
- The instrumentation/tracing setup must run before the `import "openai"`. The easiest way to do that is to dynamically import the file that does the `import "openai"` from the file that does the instrumentation/tracing setup (see the sketch below).
- Use the `node --import ./loader.js` trick @nirga mentions above.
- You may need to install the `import-in-the-middle` dependency yourself.
- Make sure `import-in-the-middle` is not bundled.
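Roughly, the dynamic-import approach looks like this (a sketch; `app.js` is a hypothetical file standing in for whatever module actually does the `import "openai"`):

```typescript
// instrument.ts — entry point. Nothing here statically imports "openai",
// so tracing is fully set up before that module is ever resolved.
import { tracer } from "tracer-module";

tracer.init();

// Dynamic import: "openai" (imported inside app.js) is only loaded now,
// after instrumentation has been initialized.
await import("./app.js");
```

Then run it with the loader trick from above, e.g. `node --import ./loader.js ./dist/instrument.js`.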
Would you mind spelling out what exactly all of that entails in terms of the minimal reproduction posted above? I'm not doing any sort of bundling there, so at least that is not an issue.
@ericallam Any help with the above? I still have no luck converting the minimal reproduction above into something that works (i.e. calls out to `POST http://example-url-does-not-exist.com/opentelemetry/v1/traces` when `package.json` contains `"type": "module"`). I don't know if I'm deciphering what you're saying correctly.
It seems like OpenLLMetry auto-instrumentation doesn't work with ES modules. For what it's worth, this is chiefly an upstream problem with opentelemetry-js (see also https://github.com/open-telemetry/opentelemetry-js/issues/4845). Just making a record of it here as well.