pedrocarnevale opened 9 months ago
Assigning to @getsentry/support for routing ⏲️
Hi @pedrocarnevale, could you share your next.config.js? This is so we can try to reproduce this. What Next.js/webpack version are you using?
Can confirm. Ours has been doing that as well. Currently playing with different options to see which one will work.
```js
/** @type {import('next').NextConfig} */
const nextConfig = {
  reactStrictMode: true,
  // logging: {
  //   fetches: {
  //     fullUrl: true
  //   }
  // }
}

module.exports = nextConfig

// Injected content via Sentry wizard below

const { withSentryConfig } = require("@sentry/nextjs");

module.exports = withSentryConfig(
  module.exports,
  {
    // For all available options, see:
    // https://github.com/getsentry/sentry-webpack-plugin#options

    // Suppresses source map uploading logs during build
    silent: true,
    org: "receiptify",
    project: "javascript-nextjs",
  },
  {
    // For all available options, see:
    // https://docs.sentry.io/platforms/javascript/guides/nextjs/manual-setup/

    // Upload a larger set of source maps for prettier stack traces (increases build time)
    widenClientFileUpload: true,
    // Transpiles SDK to be compatible with IE11 (increases bundle size)
    transpileClientSDK: true,
    // Routes browser requests to Sentry through a Next.js rewrite to circumvent ad-blockers (increases server load)
    tunnelRoute: "/monitoring",
    // Hides source maps from generated client bundles
    hideSourceMaps: true,
    // Automatically tree-shake Sentry logger statements to reduce bundle size
    disableLogger: true,
  }
);
```
Seems like `disableServerWebpackPlugin: true` worked!
Thanks for reporting @receiptify-ai!
> Can confirm. Ours has been doing that as well. Currently playing with different options to see which one will work.
Can you share your Next.js version, webpack version, and sentry sdk version? We need this to try and reproduce this issue.
> Seems like `disableServerWebpackPlugin: true` worked!
How big is your server-side codebase? This stops source map generation + uploading for your server-side code, which means you potentially get much worse stacktraces.
If you set `disableServerWebpackPlugin: true` and then follow this guide to generate server-side sourcemaps, do you still get the OOM error? This is so we can see if the issue is with source map generation, or with Sentry uploading! If the issue is source map generation, then it's definitely a Next.js and Vercel bug, and we need to raise an issue with them.
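A minimal sketch of that experiment might look like the following. The option placement assumes the three-argument `withSentryConfig` signature from `@sentry/nextjs` v7, and the org/project values are placeholders, not from this thread:

```js
// next.config.js - rough sketch of the suggested experiment, not an official setup
const { withSentryConfig } = require("@sentry/nextjs");

const nextConfig = {
  reactStrictMode: true,
  webpack: (config, { isServer }) => {
    if (isServer) {
      // keep generating server-side source maps ourselves, so we can tell whether
      // generation (Next.js/webpack) or uploading (Sentry) is what runs out of memory
      config.devtool = "source-map";
    }
    return config;
  },
};

module.exports = withSentryConfig(
  nextConfig,
  { silent: true, org: "your-org", project: "your-project" }, // placeholders
  {
    // skip Sentry's server-side source map handling for this test
    disableServerWebpackPlugin: true,
  }
);
```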
"@sentry/nextjs": "^7.99.0",
"next": "^14.1.0"
Not sure about webpack but it seems like peer dependency of "@sentry/webpack-plugin": "1.21.0"
is "webpack": ">= 4.0.0"
. This is from my package-lock.json.
```js
webpack: (config, { isServer }) => {
  if (isServer) {
    config.devtool = 'source-map'
  }
  return config
},
```
This causes OOM.
> Not sure about webpack but it seems like peer dependency of `"@sentry/webpack-plugin": "1.21.0"` is `"webpack": ">= 4.0.0"`. This is from my package-lock.json.
You can run `npm why next` to get the resolved next version used! That will also tell us the webpack version, because Next.js vendors this in.
> This causes OOM.
Looks like this is a Next.js + Vercel issue then. If you can get the exact next version via `npm why`, we can try to put a minimal reproduction together and file an issue upstream.
Just to ask again, how big is your server-side codebase? Number of API routes, average lines of code? This might help explain why the OOM is happening (and helps build a better reproduction for the Vercel folks as well).
Thanks a lot for helping us debug!!
```
next@14.1.0
node_modules/next
  next@"^14.1.0" from the root project
  peer next@"^10.0.8 || ^11.0 || ^12.0 || ^13.0 || ^14.0" from @sentry/nextjs@7.99.0
  node_modules/@sentry/nextjs
    @sentry/nextjs@"^7.99.0" from the root project
  peer next@">= 13.0.0" from next-client-cookies@1.1.0
  node_modules/next-client-cookies
    next-client-cookies@"^1.1.0" from the root project
```
The majority of our code uses server actions.
- `lib` (225k) - contains most of our db and server-side logic.
- `api` (135k) - contains mostly redirects and callbacks for third-party services. These callbacks call the files inside of the `lib` directory.
- `app` (334k) - contains server components (includes `api`).
I have a similar issue: the OOM starts when I bump next from `14.0.5-canary.19` to `14.0.5-canary.20`: https://github.com/vercel/next.js/compare/v14.0.5-canary.19...v14.0.5-canary.20
But in my case, when I try to add:
```js
webpack: (config, { isServer }) => {
  if (isServer) {
    config.devtool = 'source-map'
  }
  return config
},
```
No OOM happens.
If it can help: I use Sentry version `@sentry/nextjs@7.100.1`. My Sentry config is trivial:
```js
nextConfig,
{
  // For all available options, see:
  // https://github.com/getsentry/sentry-webpack-plugin#options

  // Suppresses source map uploading logs during build
  silent: true,
  org: '..',
  project: '..',
},
{
  // disableServerWebpackPlugin: true,

  // For all available options, see:
  // https://docs.sentry.io/platforms/javascript/guides/nextjs/manual-setup/

  // Upload a larger set of source maps for prettier stack traces (increases build time)
  widenClientFileUpload: true,
  // Transpiles SDK to be compatible with IE11 (increases bundle size)
  transpileClientSDK: false,
  // Routes browser requests to Sentry through a Next.js rewrite to circumvent ad-blockers (increases server load)
  // TODO: Use tunneling when this is resolved:
  // https://github.com/getsentry/sentry-javascript/issues/8293
  // tunnelRoute: "/web/monitoring",
  // Hides source maps from generated client bundles
  hideSourceMaps: true,
  // Automatically tree-shake Sentry logger statements to reduce bundle size
  disableLogger: true,
}
```
Also, setting `disableServerWebpackPlugin` to `true` resolves the issue.
Maybe it can help: I use `standalone` output, and I can reproduce this problem only when running with Docker using the recommended `node:18-alpine` image.
@gffuma This actually helps a lot in narrowing down the problem. Thank you very much! That particular Next.js release includes a PR that sounds like it could cause excessive memory consumption when generating source maps for the server. I have contacted vercel and we are trying to figure this out.
If you could provide a reproduction example inside a repository that we can clone, that would be awesome!
We're starting to see OOM issues on Vercel as well. It mostly started when we began migrating some pages to the Next.js App Router.
This is impacting our customer and blocking a Next.js migration. Thanks for looking into this issue.
Really glad to have stumbled upon this issue - I've been pulling my hair out trying to figure out the cause of our OOM issues.
We're on `next` version 14.1.3. Just upgraded `@sentry/nextjs` to the latest 7.107.0 and cleared the cache, but we are still seeing this.
Does anyone have a set of versions that fixes this issue by chance?
Removing `withSentryConfig` from next.config.js is the only way I was able to fix it.
Thank you!
As mentioned above, the workaround is to set `disableServerWebpackPlugin: true` in your Sentry config. You can still keep using `withSentryConfig` to make sure everything else gets instrumented and client-side sourcemaps get uploaded:
```js
// next.config.(js|mjs)
const nextConfig = {
  sentry: {
    disableServerWebpackPlugin: true,
  },
};
```
This is because the memory issues come from a combination of Next.js and Vercel itself, specifically around generating server-side sourcemaps - the Sentry SDK itself has no control over this, but we require sourcemaps to be generated so we can upload them.
Do note, setting `disableServerWebpackPlugin: true` means you don't get sourcemaps for your server-side errors, which will make their stacktraces worse in Sentry - but errors are still captured, so you at least don't lose visibility into that area of your stack.
That did work, thanks.
Hopefully we can re-enable that soon.
Any update here @lforst? :) Curious if this has been resolved, given the Vercel/Next.js ticket you linked (https://github.com/vercel/next.js/pull/59569) says it's merged. Is there a more recent ticket I should look at on their end?
@mosnicholas No update from our side. I'll ping Vercel about whether this issue has been fixed. It's best for you to just try things out on the newest Next.js version. The PR I linked was the culprit, not the fix.
Checking if there is any update on this? Adding Sentry to our Next.js app is increasing the build time from 4 min to 20+ minutes! I've removed Replay, added tree shaking in the webpack config, and tried adding `disableServerWebpackPlugin: true` to the Sentry config. None of it seems to make a difference in the build time.
Here is my next.config.js:
```js
/* eslint-disable @typescript-eslint/no-var-requires */
const withBundleAnalyzer = require('@next/bundle-analyzer')({
  enabled: process.env.ANALYZE === 'true'
});
const { withSentryConfig } = require('@sentry/nextjs');
const generateRedirects = require('./lib/redirects');
const headers = require('./headers');

/**
 * @type {import('next').NextConfig}
 * */
const isProd = process.env.NODE_ENV === 'production';
const nextConfig = withBundleAnalyzer({
  async headers() {
    return headers();
  },
  transpilePackages: [
    'lodash-es',
    'yoastseo',
    '@yoast',
    'lucide-react'
  ],
  trailingSlash: true,
  experimental: {
    taint: true,
    instrumentationHook: isProd
  },
  reactStrictMode: true,
  webpack: (config, { webpack }) => {
    config.plugins.push(
      new webpack.DefinePlugin({
        __SENTRY_DEBUG__: false,
        __RRWEB_EXCLUDE_IFRAME__: true,
        __RRWEB_EXCLUDE_SHADOW_DOM__: true,
        __SENTRY_EXCLUDE_REPLAY_WORKER__: true
      })
    );
    // return the modified config
    return config;
  },
  eslint: {
    dirs: ['components', 'layouts', 'lib', 'pages', 'styles'] // adding this because it was running Next lint rules inside /studio
  }
});

module.exports = withSentryConfig(nextConfig, {
  // For all available options, see:
  // https://github.com/getsentry/sentry-webpack-plugin#options
  org: '<client-org>',
  project: 'site',
  authToken: process.env.SENTRY_AUTH_TOKEN,
  silent: true,
  widenClientFileUpload: false,
  hideSourceMaps: true,
  disableLogger: true,
  automaticVercelMonitors: true,
  disableServerWebpackPlugin: true
});
```
@ericnation do you see these build slowdowns in local development as well?
`disableServerWebpackPlugin` only affects server-side sourcemap uploading; there can still be client-side uploading. To test whether the problem is with client-side sourcemap uploading, can you remove `withSentryConfig` and enable generating client-side sourcemaps (by setting `productionBrowserSourceMaps` to `true` in your next.config.js)?
If you only see this issue in the Vercel build, and it still persists when you set `productionBrowserSourceMaps`, the issue is with Vercel + how Next.js generates sourcemaps. Unfortunately you might have to set both `disableClientWebpackPlugin` and `disableServerWebpackPlugin` to disable sourcemap generation + uploading for both client and server as a workaround.
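For reference, the isolation test described above might look roughly like this (a minimal sketch; keep the rest of your Next.js options as they are):

```js
// next.config.js - sketch of the isolation test: no withSentryConfig at all,
// only Next.js' own client-side source map generation turned on
const nextConfig = {
  reactStrictMode: true,
  productionBrowserSourceMaps: true,
};

module.exports = nextConfig;
```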
@AbhiPrasad is there any workaround you might suggest for how we can upload source maps manually?
@mosnicholas you can upload source maps manually by following this guide: https://docs.sentry.io/platforms/javascript/sourcemaps/uploading/cli/
I think the memory consumption + long build time comes from the fact that source maps are generated. Source maps generation takes up a lot of memory in webpack. It should not be the uploading part that is expensive.
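If it helps as a starting point, one way to script a manual upload from Node is the `@sentry/cli` wrapper; below is a rough sketch only, under the assumption that your maps end up in `.next` and are served from `~/_next` (both placeholders - the linked CLI guide is the authoritative reference):

```js
// upload-sourcemaps.js - hedged sketch; adjust include/urlPrefix to where your
// build actually emits and serves source maps (".next" / "~/_next" are assumptions)
const SentryCli = require("@sentry/cli");

async function uploadSourceMaps() {
  const release = process.env.SENTRY_RELEASE || "my-release"; // placeholder release name
  const cli = new SentryCli(); // picks up SENTRY_ORG / SENTRY_PROJECT / SENTRY_AUTH_TOKEN from the environment
  await cli.releases.new(release);
  await cli.releases.uploadSourceMaps(release, {
    include: [".next"],
    urlPrefix: "~/_next",
    rewrite: true,
  });
  await cli.releases.finalize(release);
}

uploadSourceMaps().catch((err) => {
  console.error(err);
  process.exit(1);
});
```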
Hey!
I’m encountering a similar issue when uploading sourcemaps for my Next.js project to Sentry during the build process on CircleCI. This issue started after migrating our first page to the App Router architecture. The build hangs specifically when uploading the client sourcemap. The logs show:
```
[@sentry/nextjs - Node.js] Info: Successfully uploaded source maps to Sentry
[@sentry/nextjs - Edge] Info: Successfully uploaded source maps to Sentry
[@sentry/nextjs - Client] Info: Sending telemetry data on issues and performance to Sentry. To disable telemetry, set options.telemetry to false.

<--- Last few GCs --->

[134:0x6d0f190] 357870 ms: Scavenge (reduce) 2013.0 (2081.9) -> 2012.8 (2082.6) MB, 4.26 / 0.00 ms (average mu = 0.168, current mu = 0.050) allocation failure;
[134:0x6d0f190] 358684 ms: Mark-Compact (reduce) 2014.4 (2083.3) -> 2014.0 (2084.3) MB, 812.00 / 0.00 ms (average mu = 0.125, current mu = 0.063) allocation failure; scavenge might not succeed
```
It’s worth noting that without the App Router, the build works without any issues. Also, disabling sourcemaps upload allows the build to complete successfully.
Here’s the configuration we’re using in withSentryConfig:
```js
withSentryConfig(nextConfig, {
  silent: true,
  disableLogger: true,
  sourcemaps: {
    disable: process.env.DISABLE_NEXT_SENTRY_SOURCEMAP_UPLOAD === 'true',
    deleteSourcemapsAfterUpload: true,
  },
})
```
Could manually uploading the sourcemaps be a viable workaround for this issue? If so, could you provide guidance or an example on how to upload all sourcemaps (edge, server, and client) for a Next.js project?
Thanks!
@giankotarola Hi, building source maps is unfortunately relatively memory intensive. We don't do anything funky in the SDK besides turning on source map generation. You might want to raise this with the Next.js team, ideally by also sharing a memory profile of the build.
@lforst Thank you for the quick response and the valuable insights! I didn't initially realize that increasing the Node.js memory limit with the `--max-old-space-size` option was necessary to handle the memory requirements for building the sourcemaps. After setting this option when running the Next.js build script to allocate more memory, the issue is resolved.
Thanks again for your help!
We tried to update from 7.99.0 to 8.25.0 and our GitHub Actions started failing (cancelled automatically after 30+ min).
Disabling sourcemaps with
```js
sourcemaps: {
  disable: true,
  deleteSourcemapsAfterUpload: true,
},
```
does fix the builds, but we had sourcemaps working just fine on 7.99.0.
@MonstraG Please try to provide a memory profile for when this happens.
I'm back with slightly more pieces of info:
a) When failing, here is the error reported by GitHub:
```
The hosted runner encountered an error while running your job. (Error Type: Disconnect).
Received request to deprovision: The request was cancelled by the remote provider.
```
b) I've added catchpoint/workflow-telemetry-action@v2, and a successful build of 8.25.0 (with sourcemaps disabled via `sourcemaps.disable: true`) looks like this chart
c) We build 7 next.js applications at the same time.
d) Here is the build config used in all seven apps:
```js
const sentryConfig = {
  widenClientFileUpload: true,
  tunnelRoute: "/monitoring",
  hideSourceMaps: true,
  disableLogger: true,
  silent: true,
  sourcemaps: {
    deleteSourcemapsAfterUpload: true
  },
  release: {
    name: process.env.NEXT_PUBLIC_CI_COMMIT_SHORT_SHA ?? "Dev"
  },
  authToken: process.env.SENTRY_TOKEN,
  org: "org",
  project: "project"
};
```
e) I've attempted to give the failing job some time and then cancel it (cancelled jobs still produce a memory chart). The original no-cancel run spanned 40 minutes, so I tried cancelling it 20 minutes after start, 10 minutes after start, and 5 minutes after start. The 20-minute and 10-minute runs died with
```
Runner GitHub Actions 29 did not respond to a cancelation request with 00:05:00.
```
The 5-minute run produced this chart
f) For cancelled/failed jobs, raw GitHub logs end with
```
Job is about to start running on the hosted runner: GitHub Actions 29 (hosted)
```
(meaning basically nothing gets logged)
g) sentry v7.99.0 with sourcemaps enabled gives this chart
h) sentry v7.99.0 with sourcemaps disabled (`disableServerWebpackPlugin: true, disableClientWebpackPlugin: true`) gives this chart
i) These charts are not very enlightening, but we don't self-host builds, so I'm not sure how to capture anything more useful
We'll continue to look into this next week as this week is Hackweek at Sentry (#13421)
We're seeing a similar thing on our builds. Disabling sourcemaps has resolved this for now.
Same issue for us in a standard `next build` (no Vercel). Disabling sourcemaps has indeed fixed the issue. Lmk if I can help in any way.
```js
// next.config.js
module.exports = withSentryConfig(module.exports, {
  sourcemaps: {
    disable: true,
  },
});
```
What made it work for us was updating package.json - i.e., the `build` command under `scripts`:
```
"build": "NODE_OPTIONS=\"--max-old-space-size=4096\" next build",
```
We did NOT disable sourcemaps.
For us, even with the max 8GB (on a non-enterprise plan) Vercel gives you for build containers, we were always OOMing at this point with sourcemaps turned on.
Hey everyone, for now the best workaround still is to increase `max-old-space-size` as pointed out in this thread. If this doesn't help, try disabling server source map generation or (as a last resort) the entire source map generation.
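For reference, that "last resort" is the same `sourcemaps.disable` option already shown earlier in this thread; a minimal sketch:

```js
// next.config.js - last resort only: disables source map generation and upload
// entirely, which degrades stack traces in Sentry (see earlier comments)
module.exports = withSentryConfig(nextConfig, {
  sourcemaps: {
    disable: true,
  },
});
```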
Other than that,

> You might want to raise this with the Next.js team, ideally by also sharing a memory profile of the build.

still holds up. Looking at the provided screenshots of memory consumption from @MonstraG, I don't see the SDK doing anything crazy in terms of memory consumption, but rather slightly increasing already very high consumption.
Here, it does not work with source maps disabled (no Vercel - we use Kaniko to build/deploy the image), and it also does not work when increasing `NODE_OPTIONS="--max-old-space-size=4096"`.
@oalbsilva It would be cool if you could provide a memory profile. Thanks!
@lforst I don't know how to provide a memory profile from Kaniko! Sorry, I don't have permissions to do that!
But I ran it locally with my Dockerfile and got these results with `next build --experimental-debug-memory-usage`. Look how much the memory usage increases!!
Without Sentry:
```
Memory usage report:
  - Total time spent in GC: 537.88ms
  - Peak heap usage: 169.46 MB
  - Peak RSS usage: 541.54 MB
```
With Sentry:
```
Memory usage report:
  - Total time spent in GC: 2168.03ms
  - Peak heap usage: 549.02 MB
  - Peak RSS usage: 1455.73 MB
```
@oalbsilva Can you try disabling Sentry but enabling high-quality (!) source map generation, to see how much Sentry contributes to those stats? Thanks!
> Can you try disabling Sentry but enabling high-quality (!) source map generation, to see how much Sentry contributes to those stats? Thanks!
Do you have a link or something that would tell me how to enable high-quality source maps? The closest I found is https://nextjs.org/docs/app/api-reference/next-config-js/productionBrowserSourceMaps
@MonstraG You can do this in your next.config.js:
```js
const nextConfig = {
  webpack: (config, { webpack }) => {
    config.devtool = 'source-map';
    return config;
  }
};
```
I've done `next build -d --experimental-debug-memory-usage` on one of our biggest apps, but with `--trace-gc` on node:
| run | description | Memory usage report at the end |
| --- | --- | --- |
| [sentry-sourcemaps-enabled.txt](https://github.com/user-attachments/files/16850129/sentry-sourcemaps-enabled.txt) | Sentry sourcemaps enabled; this would OOM on GitHub (with other apps) | GC: 7427.61 ms, peak heap: 1377.95 MB, peak RSS: 2974.53 MB |
| [sentry-sourcemaps-disabled.txt](https://github.com/user-attachments/files/16850132/sentry-sourcemaps-disabled.txt) | Sentry sourcemaps disabled | GC: 7983.11 ms, peak heap: 995.77 MB, peak RSS: 2287.89 MB |
| [sentry-sourcemaps-disabled-but-productionBrowserSourceMaps.txt](https://github.com/user-attachments/files/16850130/sentry-sourcemaps-disabled-but-productionBrowserSourceMaps.txt) | Sentry sourcemaps disabled, but `productionBrowserSourceMaps: true` that I found earlier | GC: 7645.32 ms, peak heap: 1386.50 MB, peak RSS: 2871.09 MB |
| [sentry-sourcemaps-disabled-but-devtool-source-map.txt](https://github.com/user-attachments/files/16850131/sentry-sourcemaps-disabled-but-devtool-source-map.txt) | Sentry sourcemaps disabled, but `config.devtool = "source-map";` as requested | GC: 7278.33 ms, peak heap: 1438.60 MB, peak RSS: 3038.79 MB |
@MonstraG thanks for doing the research! To me the table indicates that the increased memory consumption is indeed related to source maps being turned on, and not the SDK being added per se? Or would you interpret it differently?
I cannot speak from any experience, as this was the first time I used `--trace-gc` (or actually any of the other options), but I would have to agree. From this it does seem that sourcemaps => x1.4 memory.
I've decided to do two more runs with the same debug flags:
| run | description | Memory usage report at the end |
| --- | --- | --- |
| [sentry-v7.txt](https://github.com/user-attachments/files/16862101/sentry-v7.txt) | Sentry 7.119.0, everything enabled as normal | GC: 5960.13 ms, peak heap: 933.65 MB, peak RSS: 2403.26 MB |
| [remove-withSentryConfig.txt](https://github.com/user-attachments/files/16862102/remove-withSentryConfig.txt) | I removed `withSentryConfig` completely from next.config.js | GC: 6164.80 ms, peak heap: 830.56 MB, peak RSS: 2098.14 MB |
Damn, I might have forgotten to enable debug in sentry v7.
Btw, node version is v20.17.0 for all runs.
@MonstraG Thanks again for doing the research.
From looking at this data, and also data that other folks provided, it seems to me that the additional memory consumption is not excessive, but we are hitting an unfortunate threshold that pushes various build pipelines over the limit.
I cannot think of a fix for this right now, except for disabling sourcemaps, but that will massively degrade your experience when using Sentry for error monitoring. Alternatively, try to increase your memory limit. As far as I am aware, and from when I talked to them last, Vercel does already know that sourcemaps consume too much memory and is looking for ways to improve this. From our end we can just hope that things improve upstream, or, if there doesn't seem to be any movement, we can contribute too.
Our team is running into this same issue; we had to disable sourcemaps, which is obviously not ideal in terms of Sentry experience. We tried reducing our dependencies (removing unused/duplicates) to free up some memory, but 90% of our builds are still failing due to memory constraints.
Exploring using Turbo cache at the moment so we don't have to build on Vercel; we'll see if that helps resolve the issues 🤞 But ideally Sentry/Vercel would somehow resolve this.
@benmarg We cannot resolve this from our end. Please raise this with the Next.js team!
I was running into this issue too. Using
```js
module.exports = withSentryConfig(nextConfig, {
  ...
  sourcemaps: {
    deleteSourcemapsAfterUpload: true
  },
  ...
})
```
solved the problem and I still get to use sourcemaps within Sentry. Also, my build command is at the max: `NODE_OPTIONS=--max-old-space-size=8192 next build`.
I was also having this issue. When running `next build`, it would hang on "Linting and checking validity of types ...".
I tried:
```js
sourcemaps: {
  disable: true,
},
```
However it didn't work.
The only solution that worked was to upgrade my build server from 4GB to 8GB unfortunately.
Environment
SaaS (https://sentry.io/)
What are you trying to accomplish?
I'm trying to build my Next.js project
How are you getting stuck?
The build fails with "Out of Memory" ("OOM") event if sentry is enabled. If I remove sentry from my code, the next build works.
Where in the product are you?
Bug Report