Open oldbettie opened 1 month ago
This tells me either no one deploys to prod with SST...
Not everyone uses OTel. OTel is considered EXPERIMENTAL by Next themselves; that's not exactly something I would consider mandatory for deploying to prod.
Here is a link to the Discord thread for context: https://discord.com/channels/983865673656705025/1286983198819090445
This is the context used by Vercel in their runtime: https://github.com/vercel/otel/blob/main/packages/otel/src/vercel-request-context/api.ts
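For reference, the lookup on their side boils down to reading a well-known symbol off globalThis. A simplified, illustrative sketch (the exact shape is an assumption based on that file and may change between versions):

// Illustrative sketch of how @vercel/otel resolves the Vercel request context.
type VercelRequestContext = {
  waitUntil?: (promise: Promise<unknown> | (() => Promise<unknown>)) => void;
  headers?: Record<string, string | undefined>;
  url?: string;
};

function getVercelRequestContext(): VercelRequestContext | undefined {
  const symbol = Symbol.for("@vercel/request-context");
  // The runtime is expected to expose { get(): context } under this symbol.
  const api = (globalThis as any)[symbol] as { get?: () => VercelRequestContext } | undefined;
  return api?.get?.();
}

// A consumer can then call getVercelRequestContext()?.waitUntil?.(flushPromise)
// to keep the invocation alive until traces have been exported.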
One other option (probably a better one, now that I think about it) would be to use a custom wrapper and provide a fake Vercel request context yourself: https://open-next.js.org/config/custom_overrides. Something like the following (not tested) should do the trick, but be aware that this is a bit risky: it relies on Vercel internals which might change in the future:
// customWrapper.ts
import awsLambdaWrapper from "open-next/wrappers/aws-lambda.js";
import type { WrapperHandler } from "open-next/types/open-next.js";

const handler: WrapperHandler = async (openNextHandler, converter) => {
  const defaultHandler = await awsLambdaWrapper.wrapper(openNextHandler, converter);

  return async (event, context) => {
    // @vercel/otel looks up the request context through this well-known symbol.
    const symbol = Symbol.for("@vercel/request-context");
    const promisesToAwait: Promise<unknown>[] = [];

    // Provide a fake Vercel request context so that anything registered via
    // waitUntil() (e.g. span export) is awaited before the Lambda returns.
    // @ts-ignore - globalThis is not typed for this symbol
    globalThis[symbol] = {
      get: () => ({
        waitUntil: (promiseOrFunc: Promise<unknown> | (() => Promise<unknown>)) => {
          const promise = "then" in promiseOrFunc ? promiseOrFunc : promiseOrFunc();
          promisesToAwait.push(promise);
        },
        headers: event.headers,
        url: event.rawPath,
      }),
    };

    const response = await defaultHandler(event, context);
    // Wait for everything collected through waitUntil before returning the response.
    await Promise.all(promisesToAwait);
    return response;
  };
};

export default {
  wrapper: handler,
  name: "aws-lambda",
  supportStreaming: false,
};
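To wire this up, the wrapper override in open-next.config.ts would point at that file, roughly like this (untested sketch based on the custom overrides docs linked above; the exact config shape may differ between open-next versions):

// open-next.config.ts (sketch, untested)
import type { OpenNextConfig } from "open-next/types/open-next.js";

const config = {
  default: {
    override: {
      // Lazy-load the custom wrapper from customWrapper.ts above.
      wrapper: () => import("./customWrapper").then((m) => m.default),
    },
  },
} satisfies OpenNextConfig;

export default config;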
This is a copy of my conversation on Discord, but I believe this is a big enough deal to create an issue.
Has anyone got OTel working based on this guide? https://nextjs.org/docs/app/building-your-application/optimizing/open-telemetry
I have wasted days on this now. I have it working locally with SST, but the moment I deploy it I get nothing. I have even created a minimal repro to test it with the latest version of SST, and it still does not work in any remote SST deployment. The same minimal repro works with Vercel, Railway, and even self-hosted on AWS ECS/ECR. I love SST, but this is a deal breaker for me. I have tried both manual and auto instrumentation. I am not running a collector; I am just using Honeycomb's traces endpoint, since it doesn't require one.
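For reference, the setup from those docs boils down to something like the following minimal sketch (assuming the @vercel/otel helper and Honeycomb's OTLP endpoint; the service name and values are illustrative):

// instrumentation.ts - minimal sketch of the Next.js OTel setup
import { registerOTel } from "@vercel/otel";

export function register() {
  // Exporter target and credentials come from standard OTel env vars, e.g.
  //   OTEL_EXPORTER_OTLP_ENDPOINT=https://api.honeycomb.io
  //   OTEL_EXPORTER_OTLP_HEADERS=x-honeycomb-team=<api-key>
  registerOTel({ serviceName: "my-next-app" });
}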
I have load tested it on Railway and it performs exactly as expected.
Is this something open-next plans to sort out, or should I just switch to Railway or Vercel? How do people in production currently deal with OTel? Surely this has been raised before, but I have searched high and low and could not find anything. This tells me either no one deploys to prod with SST...