ghost opened this issue 1 year ago
The `BasicTracerProvider` is not intended to be used by end users; it's just the common base for web and node. `NodeTracerProvider` adds a Node.js-specific context manager (AsyncHooks- or AsyncLocalStorage-based, depending on the Node.js version in use). The `BasicTracerProvider` installs no context manager, so the `NoopContextManager` is used, which always returns `ROOT_CONTEXT`.

The pino instrumentation (and a lot of other instrumentations) gets the current span from the current context via the `ContextManager`. As `ROOT_CONTEXT` holds no span, the pino instrumentation doesn't add traceId/spanId.
So, in short: if you want to use an instrumentation, install a context manager.

If you create spans by manually calling `tracer.startSpan()`, pass the span/context around programmatically between all these `tracer.startSpan()` calls, and also enrich the logs programmatically (because you have the current span in hand anyway), then no `ContextManager` is needed.
Thank you so much for the clear explanation @Flarna. It was very helpful.

I manually create spans using `tracer.startSpan()` to avoid the load of auto-instrumentation.

- Can we still get the benefits of `ContextManager` with manual instrumentation?
- Is there base documentation that clearly explains these constructs for a person with limited experience?
I have changed to `NodeTracerProvider`, but I still get the error:

```
@opentelemetry/api: Registered a global for trace v1.6.0.
@opentelemetry/api: Registered a global for context v1.6.0.
@opentelemetry/api: Registered a global for propagation v1.6.0.
@opentelemetry/instrumentation-pino Module pino has been loaded before @opentelemetry/instrumentation-pino so it might not work, please initialize it before requiring pino
```
I only import pino once in my Fastify app, which is definitely called after registering the instrumentation.
- Can we still get the benefits of `ContextManager` with manual instrumentation?

`ContextManager` doesn't care about who creates a span or how. But the creator of the span is also responsible for setting the active span to match the application logic, using the `bind()` and/or `with()` APIs along with `setSpan()`, or by using the "all in one" helper `startActiveSpan()`.

In fact, if you want to use the pino instrumentation along with your manual spans, you have to use a `ContextManager`, as this is the interface between your code and the instrumentation.
- Is there base documentation that clearly explains these constructs for a person with limited experience?

I fear that we are far away from perfect docs, especially for persons with limited experience. Help to improve this is welcome. OTel (or, more generally, monitoring) is not a simple topic. In particular, the context propagation topic is complicated, language dependent, and hard to understand/do right. I doubt simple/easy answers will arrive soon.
- Any recommendations to debug this problem?

Review the startup code of your application:
- Consider moving the OTel initialization code into a dedicated file and preloading it using the `-r` command line option.
- Set a breakpoint at the beginning of the very first pino file (`pino.js`) and look at the call stack.
Thank you so much for all the help @Flarna. Once I understand these constructs a little better, I will do my part by updating the docs to make them friendlier for people with limited OpenTelemetry experience.
@Flarna with your guidance & help, I was finally able to configure traces & logs. Following is my reference documentation for the same. Thank you so much for all the help 🙏
Traces provide a mechanism to understand the flow of execution of a process or workflow. Tracing is configured in the module `traceConfig.cjs`. It's a 3-step process:
- Initialize the `NodeTracerProvider` with a `Resource` and a `ContextManager`:
  - `Resource` identifies the service.
  - `ContextManager` is responsible for trace context management (`trace_id`, `span_id`) for each request.
  - The provider exposes tracers, which are then used to generate the trace context for a request.

```js
const resource = Resource.default().merge(
  new Resource({
    [SemanticResourceAttributes.SERVICE_NAME]: env.serviceName(),
    [SemanticResourceAttributes.SERVICE_VERSION]: env.appVersion(),
    [SemanticResourceAttributes.DEPLOYMENT_ENVIRONMENT]: env.envName()
  })
);

const contextManager = new AsyncLocalStorageContextManager().enable();

provider = new NodeTracerProvider({
  resource: resource,
  contextManager: contextManager
});

// A. Exporter Code Follows

provider.register();

// B. Instrumentation Code Follows
```
- `OTLPTraceExporter` is used as it is vendor agnostic.
- Traces can be sent directly to a backend like New Relic, or to an OpenTelemetry Collector in Kubernetes.
- NO changes are required in code to switch between these; just set the `OTEL_EXPORTER_OTLP_ENDPOINT` environment variable to the backend or collector service URL appropriately.

```js
// A. Exporter Code Follows
const traceExporter = new OTLPTraceExporter({
  concurrencyLimit: 10,
  compression: 'gzip'
})

spanProcessor = new BatchSpanProcessor(traceExporter, {
  maxQueueSize: 1000, // The maximum queue size. After the size is reached spans are dropped.
  scheduledDelayMillis: 1000 // The interval between two consecutive exports
})

provider.addSpanProcessor(spanProcessor)
```
We could instrument traces manually or automatically.

Manual instrumentation needs the following steps done programmatically:
- Create spans using `startActiveSpan()` or `startSpan()`.
- Propagate context with the `bind()` or `with()` APIs.

Auto-instrumentation is a simpler alternative. It is done using the `registerInstrumentations` method in `traceConfig.cjs`:

```js
// B. Instrumentation Code Follows
registerInstrumentations({
  instrumentations: [
    new PinoInstrumentation(),
    new HttpInstrumentation({
      requestHook: (span, request) => {
        span.updateName(`${request.method} ${request.url}`)
        span.setAttribute('request.uuid', request.id)
        span.setAttribute(SemanticAttributes.HTTP_USER_AGENT, request.headers['user-agent'])
        span.setAttribute(SemanticAttributes.HTTP_HOST, request.hostname)
      },
      responseHook: (span, response) => {
        span.setAttribute(SemanticAttributes.HTTP_STATUS_CODE, response.statusCode)
      },
      headersToSpanAttributes: ['user-agent', 'http.url', 'http.method', 'http.host', 'http.client_ip']
    }),
    new FastifyInstrumentation()
  ]
})
```
The following instrumentations are used:

- `HttpInstrumentation`:
  - It adds a tracing record with the following information to each incoming request: `span_id`, `trace_id` & `trace_flags`.
  - Additional attributes are added to the span as well, in its `requestHook` & `responseHook`.
- `PinoInstrumentation`:
  - Pino is the default logger used in Fastify.
  - This instrumentation adds the context of a request to the corresponding log statement.
  - `PinoInstrumentation` attaches to `pino`, so it is essential to ensure that this instrumentation is loaded before `pino` is imported.

`traceConfig.cjs` is loaded as a preloaded module in npm scripts:

```json
"scripts": {
  "dev": "NODE_ENV=dev nodemon --inspect=0.0.0.0:9229 -r ./app/utils/otel/traceConfig.cjs app/index.js"
},
```

Note that the file extension is `.cjs` and not `.js`: since the app is an ES Module, preloaded modules are only supported for CommonJS modules.

Structured logs are also called Events.
- `pino`, the default logging library for Fastify, supports JSON logs by default.
- These logs need to be forwarded to a backend. This could be done in multiple ways, e.g. using Fluentd or LogStash to forward them to a backend or collector.
- A log exporter is used here, in `otel/logs.js`:

```js
const loggerExporter = new OTLPLogExporter()

const resource = Resource.default().merge(
  new Resource({
    [SemanticResourceAttributes.SERVICE_NAME]: env.serviceName(),
    [SemanticResourceAttributes.SERVICE_VERSION]: env.appVersion(),
    [SemanticResourceAttributes.DEPLOYMENT_ENVIRONMENT]: env.envName()
  })
)

// To start a logger, you first need to initialize the LoggerProvider.
loggerProvider = new LoggerProvider({
  resource: resource
})

// Add a processor to export log records
loggerProvider.addLogRecordProcessor(new BatchLogRecordProcessor(loggerExporter))

// logging
otelLogger = loggerProvider.getLogger('fastify-logger', env.appVersion())
```
`OTLPLogExporter` is vendor agnostic & can send logs to a collector or to a backend like New Relic.

Connecting this exporter with pino is a 2-step process:

1. `pino` supports a `transport` option to forward logs to a backend. To use the `OTLPLogExporter`, a custom transport is created in `app/utils/otel/pinoTransport.js` using `pino-abstract-transport`:

```js
import build from 'pino-abstract-transport'

// Otel logger configuration as shown above...
otelLogger = loggerProvider.getLogger('fastify-logger', env.appVersion())

// Creating a transport for pino
export default async function (_opts) {
  return build(async source => {
    for await (const logObject of source) {
      otelLogger.emit(mapPinoLogsToOtelLogRecord(logObject))
    }
  })
}
```
2. The transport is registered in `app/utils/logger.cjs`:

```js
const transportTargets = []

if (env.isOtelEnabled()) {
  // Add the otel transport to the transportTargets array
  transportTargets.push({
    target: './otel/logs.js',
    level: env.logLevel()
  })
}

const transport = pino.transport({
  targets: transportTargets
})
```
Pino logs are mapped to the OTLP log record format in `app/utils/otels/logs.js` using `mapPinoLogsToOtelLogRecord`, which is invoked in the log transport function created above:

```js
// Pino Log Transport
export default async function (_opts) {
  return build(async source => {
    for await (const logObject of source) {
      otelLogger.emit(mapPinoLogsToOtelLogRecord(logObject))
    }
  })
}

function mapPinoLogsToOtelLogRecord(pinoData) {
  // Transformation Code
  // OTLP Log Record Format
  const logRecord = {
    timestamp: time,
    severityNumber: severityNumber,
    severityText: severityText,
    body: message,
    attributes: {
      ...serializedAttributes,
      level // used by NewRelic to filter logs by level
    }
  }
  return logRecord
}
```
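The transformation body is elided above. A self-contained sketch of what such a mapping could look like follows: the severity numbers come from the OpenTelemetry log data model, the field names `level`, `time`, and `msg` are pino's defaults, and everything else is illustrative rather than the author's actual code.

```javascript
// Illustrative mapping from pino's numeric levels to OTLP severity.
// Pino defaults: trace=10, debug=20, info=30, warn=40, error=50, fatal=60.
const PINO_LEVEL_TO_OTEL = {
  10: { severityNumber: 1, severityText: 'trace' },
  20: { severityNumber: 5, severityText: 'debug' },
  30: { severityNumber: 9, severityText: 'info' },
  40: { severityNumber: 13, severityText: 'warn' },
  50: { severityNumber: 17, severityText: 'error' },
  60: { severityNumber: 21, severityText: 'fatal' }
}

function mapPinoLogsToOtelLogRecord(pinoData) {
  const { level, time, msg, ...rest } = pinoData
  const severity = PINO_LEVEL_TO_OTEL[level] ||
    { severityNumber: 0, severityText: 'unspecified' }
  return {
    timestamp: time, // pino emits epoch milliseconds by default
    severityNumber: severity.severityNumber,
    severityText: severity.severityText,
    body: msg,
    attributes: { ...rest, level }
  }
}

// Example:
const rec = mapPinoLogsToOtelLogRecord({
  level: 30,
  time: 1700000000000,
  msg: 'hello',
  reqId: 'abc'
})
```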
Traces & logs are connected using `trace_id` & `span_id`: the trace context.

- `HttpInstrumentation` automatically adds the trace context to each incoming request.
- `PinoInstrumentation` adds this context to each log statement.
- But New Relic does not support this out of the box.

To connect traces with logs, the following configuration variables must be kept in sync:

- `service.name` should be the same as the `service.name` in traces. This is done using the `Resource` object in both the `LoggerProvider` & the `NodeTracerProvider`:

```js
const resource = Resource.default().merge(
  new Resource({
  })
)
```
- New Relic expects `trace.id` & `span.id` in logs, while the OTLP exporter adds `trace_id` & `span_id` to each log statement. The rename is done in the `mapPinoLogsToOtelLogRecord` function:

```js
// Following configuration is required to connect traces with logs in New Relic
if (attributes.span_id && env.isOtelBackendNewRelic()) {
  serializedAttributes['span.id'] = attributes.span_id
  serializedAttributes['trace.id'] = attributes.trace_id
}
```
After the above changes, traces & logs will be connected in the New Relic UI.
@varmehta are you planning on adding this info to the documentation? I think this could help other users in the same situation.
@dyladan I surely can do that, but which documentation? The above document impacts multiple packages.
If you use the `NodeTracerProvider`, there is no need to create the `AsyncLocalStorageContextManager` and pass it as an argument, because the `NodeTracerProvider` does this internally by default. Manually creating it is only needed if you use the `BasicTracerProvider`.
@Flarna thank you. I'll remove that.
Can anyone assign this task to @varmehta, since this person already provided the detailed documentation above? It's a waste of time for other contributors to read everything in this issue and realize it's about to be solved. Thank you!
What version of OpenTelemetry are you using?

1.6.0

What version of Node are you using?

v20.5.0

What did you do?

Tried integrating `@opentelemetry/instrumentation-pino` for trace context injection in my Fastify app with code. `TracerProvider` initialization using `@opentelemetry/sdk-trace-base` precedes the above code.

What did you expect to see?

`trace_id` & `span_id` in logs.

What did you see instead?

Additional context

Does `pino-instrumentation` work only with `NodeTracerProvider`, or with any `TracerProvider`?