CGarces closed this issue 3 months ago
run it with the --debug flag
Thanks @blinkdaffer, but `--debug` does not add more details.
> next build "--debug"
2024-03-03T16:43:13.966Z [INFO]: ⚠ No build cache found. Please configure build caching for faster rebuilds. Read more: https://nextjs.org/docs/messages/no-cache
2024-03-03T16:43:13.971Z [INFO]: Attention: Next.js now collects completely anonymous telemetry regarding usage.
2024-03-03T16:43:13.971Z [INFO]: This information is used to shape Next.js' roadmap and prioritize features.
You can learn more, including how to opt-out if you'd not like to participate in this anonymous program, by visiting the following URL:
https://nextjs.org/telemetry
2024-03-03T16:43:14.046Z [INFO]: ▲ Next.js 14.1.0
2024-03-03T16:43:14.046Z [INFO]:
2024-03-03T16:43:14.111Z [INFO]: Creating an optimized production build ...
2024-03-03T16:43:44.251Z [INFO]: ✓ Compiled successfully
2024-03-03T16:43:44.256Z [INFO]: Linting and checking validity of types ...
2024-03-03T16:43:52.278Z [INFO]: Collecting page data ...
2024-03-03T16:43:53.523Z [INFO]: Generating static pages (0/16) ...
2024-03-03T16:43:53.795Z [INFO]: Generating static pages (4/16)
2024-03-03T16:43:55.680Z [INFO]: Generating static pages (8/16)
2024-03-03T16:43:55.921Z [INFO]: Generating static pages (12/16)
2024-03-03T16:44:02.943Z [INFO]: ✓ Generating static pages (16/16)
2024-03-03T16:44:03.133Z [INFO]: Finalizing page optimization ...
Collecting build traces ...
2024-03-03T16:44:03.348Z [INFO]: Redirects
2024-03-03T16:44:03.349Z [INFO]: ┌ source: /:path+/
├ destination: /:path+
└ permanent: true
2024-03-03T16:44:09.528Z [INFO]:
The build is marked as "failed" without any more information.
Just to add more context.
@CGarces 👋 , thanks for reaching out. Can you share your Amplify app id?
d3qvt5uw9k572. Builds from 1 to 12 all failed without errors.
@CGarces thanks for sharing that information. While reviewing the internal logs for build 12, I observed that it failed due to running out of memory.
Since the default build container has 7GB RAM, it should be able to support about 5.5GB max old space size assuming no other build process is running in parallel. Setting the max-old-space-size to >5.5GB for large builds would result in the OS killing the node process.
Thus, you can set a max old space size of 5.5GB (5632MB) in the buildSpec as follows:
export NODE_OPTIONS=--max-old-space-size=5632
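For context, here is a minimal `amplify.yml` sketch showing where that export could go. The surrounding phases and commands (e.g. `npm ci`, `npm run build`, the `.next` artifacts directory) are illustrative assumptions for a typical Next.js app, not taken from this thread:

```yaml
version: 1
frontend:
  phases:
    preBuild:
      commands:
        - npm ci
    build:
      commands:
        # Assumption: cap V8's old space at 5632 MB (5.5 GB) before building
        - export NODE_OPTIONS=--max-old-space-size=5632
        - npm run build
  artifacts:
    baseDirectory: .next
    files:
      - '**/*'
```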
Additionally, you can verify whether NodeJS has correctly picked up the max-old-space-size config by including the following command at the preBuild step in the buildSpec:
node -e 'console.log(v8.getHeapStatistics().total_available_size/(1024*1024))'
Lastly, we are tracking the feature request to increase the compute capacity of the build containers/environments here: https://github.com/aws-amplify/amplify-hosting/issues/654
Can you download the logs for build 12 from the console and confirm if you are able to observe the OOM error within them? This will help us gauge if the appropriate error is surfaced in the logs.
These docs exist for troubleshooting memory issues: https://docs.aws.amazon.com/amplify/latest/userguide/troubleshooting-ssr-deployment.html#out-of-memory
Can you download the logs for build 12 from the console and confirm if you are able to observe the OOM error within them?
No, I can't observe the error; that's the point of this issue. I already suspected it was an OOM error (it's at the end of the first post). The problem is that the error is not shown in the logs or in the AWS Console interface.
Marking this bug for error messaging and visibility improvements.
Hi @CGarces 👋 , we have deployed a fix for this issue and with this change the builds should no longer fail without surfacing the appropriate OOM error/banner in the console. I am going to close this thread but please feel free to reopen it if you are still running into it. Thanks!
Environment information
Description
The Amplify build stage fails without any error in the logs, so the deploy stage is not executed.
Last lines of the logs
Maybe it's an out-of-memory error that is not shown in the new interface?