ChristopherPHolder opened 5 months ago
Are you encountering a Core Web Vitals regression in a production deployed application? If so, can you please provide the before and after metric values? Measurements should ideally be taken from production or production-equivalent deployments. Server and/or hosting configuration can have a significant effect on the metrics. The development server, especially, should not be used to evaluate loading performance as it is designed to optimize the edit/refresh cycle and not initial load.
Also, if possible, please consider using the latest version of Angular (currently 18.0). Earlier versions can potentially inject a large number of `modulepreload` link elements into the generated index HTML, especially in SSR cases, which can lead to performance regressions. The fix for this may be back-ported; however, preload generation can be disabled via configuration in these cases. Also, there are numerous improvements in other areas from which the application may benefit by upgrading.
Regarding HTTP/2: any browser/device that does not support HTTP/2 also cannot run an Angular application built with a supported major version of Angular (or many unsupported versions). Regardless of bundling strategy, HTTP/2 is an effective means of improving CWV metrics.
As to barrel file usage, these have long been a source of problems when attempting to optimize code, especially when interoperating correctly with potentially side-effectful modules and/or top-level await usage. Additionally, depending on what version of Angular was previously in use for the application, the AOT compiler itself may be involved as well. Older compiler versions rewrote component imports directly to the component implementation, bypassing any intermediate re-exports. This has since been removed, one of the reasons being execution correctness. However, we are planning some potential improvements in this area for the 18.x timeframe. Barrel file usage also appears to be the main underlying cause of the issue as described above, so such improvements may be an effective mitigation in this case.
Are you encountering a Core Web Vitals regression in a production deployed application?
Yes, the regression is measured on a production environment. With Crux and other RUM data.
If so, can you please provide the before and after metric values?
In the worst case we have seen an approximate ~2 second regression on LCP. But please take into account that this is a large application with a lot of moving pieces; there are other factors that are probably contributing to those numbers.
Are other large applications experiencing these regressions?
The minimal reproduction I have built illustrates the issue, but at that scale the impact will be hidden by variability.
If it's a question of proving the performance degradation, I can scale the minimal reproduction, deploy it, and run performance tests on it.
What I would do is scale the minimal reproduction by increasing the bundle size of each component with random strings and
please consider using the latest version of Angular (currently 18.0)
I agree, using the latest supported version when possible is best.
Angular v18 was released 6 days ago, on 2024-05-20.
The demo is using v17.3 and the real app is on v17.1.
Currently, v18 is not on the list of actively supported versions.
Regarding HTTP/2
We are using HTTP/3
So I just wanted to mention that we are aware of the fact that if we were using HTTP/1 this change would have a massive negative impact on LCP. But we are indeed using HTTP/3.
As to barrel file usage
I am generally not a fan of barrel files. As mentioned in the issue, by doing that I might be removing some of esbuild's ability to tree-shake my code effectively.
Barrel file usage also appears to be the main underlying cause for the issue as described above so such improvements may be an effective mitigation in this case.
I am not sure about that. The idea of strategically placing barrel files to optimize how the bundler chunks code seems off to me. It seems like it will be prone to errors and may produce harder-to-maintain code.
That all being said, using libs will produce barrel files, so I do see how they can be necessary for some usages. But I don't think this is a valid solution for large-scale applications.
Barrel file usage also appears to be the main underlying cause for the issue
I am using barrel files to opt out of the optimization here; I would say this is my hacky attempt to understand how this can be fixed.
I would say the issue is the way esbuild currently handles code splitting and more specifically this statement:
Note that the target of each dynamic import() expression is considered an additional entry point.
I have opened an issue on esbuild itself to discuss this and see if there is a potential solution there.
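For context, a minimal sketch of a standalone esbuild build script in which the quoted behavior applies (this is not the Angular CLI's internal configuration, and the entry path is an assumption):

```ts
// build.mjs: a minimal sketch of esbuild's code-splitting mode. With splitting
// enabled, the target of every dynamic import() in the source, e.g.
//   const feature = await import('./feature-1');
// is treated as an additional entry point, so it gets its own chunk plus shared
// chunks for whatever it has in common with other entry points.
import * as esbuild from 'esbuild';

await esbuild.build({
  entryPoints: ['src/main.ts'], // assumed entry; the CLI wires this up itself
  bundle: true,
  format: 'esm',   // code splitting requires ESM output
  splitting: true, // dynamic import() targets become extra entry points
  outdir: 'dist',
});
```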
Some people tried to get a minChunkSize option into esbuild, with no success yet, though. You could write a super crude post-processor that merges all <1kb chunks and rewrites the import statements to use that chunk instead. Or do a final pass with rollup (which got an `experimentalMinChunkSize` option) in the meantime.
Could you try, from inside your `<app-name>/browser` directory:

npm i -g rollup
rollup --format es -d dist/ --experimentalMinChunkSize 1000 -- main-25ISSONF.js

with `main-25ISSONF.js` being your entry chunk? And measure how the output from rollup performs for your app? For our build it removes ~60% of the chunks and is even quite fast in doing so. The rebundled app runs perfectly fine.
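For anyone who prefers a config file over CLI flags, a sketch of roughly the same post-processing pass (the entry chunk name is the one from the command above; adjust it to your build):

```ts
// rollup.config.mjs: config-file form of the post-processing pass sketched above.
export default {
  input: 'main-25ISSONF.js',
  output: {
    dir: 'dist',
    format: 'es',
    // Experimental Rollup option: chunks smaller than roughly this many bytes
    // are merged into other chunks where possible.
    experimentalMinChunkSize: 1000,
  },
};
```

With this in place, running `npx rollup -c` from the `browser` output directory should be equivalent to the one-liner.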
From digging a bit deeper, it seems like vite only uses esbuild for dev mode, and uses rollup instead for the production build. A handful of people even go so far as to develop rolldown right now, which is supposed to supersede esbuild+rollup for vite, so they have a single build tool that works for dev & prod. So if you invest time now to have a "post optimization step" via rollup, you may be able to swap it for rolldown in a year to get most of the added build time overhead back.
@sod the solution is nice and it allows me to use a custom chunking strategy. But the issue I am facing when rebundling with rollup is that I lose the sourcemaps.
If rollup produces sourcemaps, it will produce them based on the chunks as input, which makes them not useful.
Do you know if there is a way I can tell rollup to merge the original sourcemaps instead of producing completely new ones?
try https://github.com/maxdavidson/rollup-plugin-sourcemaps - can't confirm if it works, we don't use sourcemaps in production.
@sod can you please provide an example of how you made the rollup command work in combination with the sourcemaps plugin?
I tried `rollup --format es -d ./ -- main-2ZWZBTDC.js -p rollup-plugin-sourcemaps` but it didn't work for me. Obviously I'm doing something wrong.
We don't use sourcemaps in production.
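For reference, wiring the plugin in through a config file rather than the `-p` flag might look roughly like this. This is an unverified sketch (as noted, the plugin has not been confirmed to work in this setup), and the entry chunk name is just the one from the earlier command:

```ts
// rollup.config.mjs: unverified sketch. The plugin loads the existing .map files
// produced by the Angular build so Rollup can emit maps for the merged chunks
// that point back at the original sources.
import sourcemaps from 'rollup-plugin-sourcemaps';

export default {
  input: 'main-2ZWZBTDC.js',
  plugins: [sourcemaps()], // read the source maps of the input chunks
  output: {
    dir: 'dist',
    format: 'es',
    sourcemap: true, // emit new maps for the rebundled output
    experimentalMinChunkSize: 1000,
  },
};
```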
I see. Currently my team is more or less blocked and we are not able to adopt the application builder, since, as you can see in the attached screenshot, the esbuild version generates way too many JS requests, which causes a significant performance regression.
The rollup command that @sod kindly wrote in his comment removes ~60% of the chunks, which is great 🎉, but it causes errors like this in random places in our app:
Error: no ngModuleRef constructor found
at chunk-UKEZH6HP-Djqkv2-u.js:246:4618
at Generator.next (<anonymous>)
at n (chunk-UKEZH6HP-Djqkv2-u.js:1:1248)
at k.invoke (zone.js:368:26)
at Object.onInvoke (chunk-UKEZH6HP-Djqkv2-u.js:9:52548)
at k.invoke (zone.js:367:52)
at se.run (zone.js:130:43)
at zone.js:1260:36
at k.invokeTask (zone.js:403:31)
at Object.onInvokeTask (chunk-UKEZH6HP-Djqkv2-u.js:9:52359)
at k.invokeTask (zone.js:402:36)
at se.runTask (zone.js:174:47)
at E (zone.js:582:35)
...which basically prevents some of our components from being loaded and makes some of their features unusable. I thought that possibly modifying the command to utilise the sourcemaps plugin might have some positive impact, but maybe I was just too desperate. Has anyone encountered a similar issue?
@dimeloper I suppose your issue is related to circular dependencies created by rollup. This is because you use `minChunkSize`, which is still experimental.
The issue of preventing these circular dependencies is not trivial, and I have had partial success on applications with 200 or so initial bundles, but the solution was flaky.
However, achieving a single initial bundle using `manualChunks` does work fine.
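One possible shape of such a `manualChunks` function, as a rough sketch (this is not the executor from the linked repo; the entry chunk name is illustrative). It collapses everything that is statically reachable from the entry into one chunk and leaves lazily imported subtrees in their own chunks:

```ts
// rollup.config.mjs: collapse all statically reachable modules into one chunk.
const staticallyReachable = new Set();

function collectStaticImports(id, getModuleInfo) {
  if (staticallyReachable.has(id)) return;
  staticallyReachable.add(id);
  const info = getModuleInfo(id);
  if (!info) return;
  // importedIds only lists static imports; dynamic imports are tracked separately.
  for (const dep of info.importedIds) collectStaticImports(dep, getModuleInfo);
}

export default {
  input: 'main-25ISSONF.js',
  output: {
    dir: 'dist',
    format: 'es',
    manualChunks(id, { getModuleInfo, getModuleIds }) {
      // Seed the walk from the entry module(s) on the first call.
      if (staticallyReachable.size === 0) {
        for (const moduleId of getModuleIds()) {
          if (getModuleInfo(moduleId)?.isEntry) {
            collectStaticImports(moduleId, getModuleInfo);
          }
        }
      }
      if (getModuleInfo(id)?.isEntry) return undefined; // keep the entry facade
      return staticallyReachable.has(id) ? 'main' : undefined;
    },
  },
};
```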
I am still in the process of publishing it, but you can copy the executor into your project and it should work: Git Repo
If you have any issues with that solution let me know; I only have half or so of it in that repo.
Also, if anyone has a suggestion about how to balance the nodes without causing circular dependencies, I'd like to hear it. My initial attempt was to solve it like a bin-packing problem with a greedy algorithm, but now, knowing that it requires considering the links between the nodes, I am not sure what solution is appropriate.
In version 18.1, we introduced an experimental chunk optimizer. Has anyone tried it? If so, what were the results?
For more context, please refer to this https://github.com/angular/angular-cli/pull/27953
Hey @alan-agius4, I tried that optimiser in my case; feel free to check out the full thread in the Angular discord channel for more details: https://discord.com/channels/748677963142135818/1260214060830556211/1260535860131270691
Long story short, the results weren't really helpful in my case. I had to rework my app and fine-tune the rollup post-build script to achieve a sensible number of chunks (we still have the rollup step in our pipelines).
@dimeloper, you mentioned that the environment variables are being ignored. I'm not familiar with how `env-cmd` functions, but typically, environment variables need to have their values defined as strings.
@alan-agius4 OK, but I tried it like `NG_BUILD_OPTIMIZE_CHUNKS=1` and like `NG_BUILD_OPTIMIZE_CHUNKS="1"` - how should I declare it instead so as to see it working? And is there a log message or something during the build process that I can see to confirm that the optimiser was enabled?
If you set both `NG_BUILD_DEBUG_PERF` and `NG_BUILD_OPTIMIZE_CHUNKS` to `"1"` or `"true"`, the console will log the duration of the optimizations.

Note: To avoid overlapping logs, you may need to disable progress logging using `ng build --no-progress`.
If you don't see any changes in the chunk sizes, we'll need a reproduction to investigate further.
The `NG_BUILD_OPTIMIZE_CHUNKS=1` is very nice and works as advertised. I'm probably going to use it when we switch.
We are still stuck on webpack though, as I can't get the esbuild variant to perform as well. Webpack has some incredible features like:

... `window.` ..., so not a big deal, but: the `() => import(/* webpackMode: "eager" */ './my-dependency')` pattern from webpack is quite powerful, as with this you could force a module to be included in the initial chunk, but not yet executed. This reduces the burden on the browser, as it doesn't have to execute parts of the app initially, but the code will be ready asap when needed without loading extra chunks. (2) and (3) you kind of solved by partial hydration, but that is not yet supported without zone.js, so we are sticking with webpack right now.
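For readers unfamiliar with the pattern, a small sketch of that webpack magic comment (the module path and component name are placeholders): with `webpackMode: "eager"` the target is bundled into the current chunk and no extra request is made, but the module is only evaluated when the `import()` call runs.

```ts
// Placeholder module path and symbol; illustrates the webpack-only
// /* webpackMode: "eager" */ hint mentioned above.
export const loadSettingsComponent = () =>
  import(/* webpackMode: "eager" */ './settings/settings.component').then(
    (m) => m.SettingsComponent,
  );
```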
@alan-agius4 I tried it once again using your debugging hint. This is the output I am getting having the optimiser enabled:
DURATION[NG_READ_CONFIG]: 0.039898750s
DURATION[NG_READ_CONFIG]: 0.055066417s
DURATION[NG_CREATE_PROGRAM]: 4.416569292s
DURATION[NG_ANALYZE_PROGRAM]: 5.477037292s
DURATION[NG_FIND_AFFECTED]: 0.005789167s
DURATION[NG_EMIT_TS]: 12.525402208s
DURATION[NG_DIAGNOSTICS_TOTAL]: 2.308133416s
DURATION[NG_EMIT_JS*]: 822.466522754s [count: 1875; avg: 0.438648812s; min: 0.003237000s; max: 1.184354959s]
DURATION[NG_EMIT_JS*]: 822.466522754s [count: 1875; avg: 0.438648812s; min: 0.003237000s; max: 1.184354959s]
chunk-UDIDPQ7Z.js (1:49305): Use of eval in "chunk-UDIDPQ7Z.js" is strongly discouraged as it poses security risks and may cause issues with minification.
DURATION[OPTIMIZE_CHUNKS]: 7.984619542s
As seen in the last line, it looks like the optimiser is enabled.
However, the number of chunks remains unchanged (the dist folder contains the normal build output and dist-optimized contains the one using the optimiser):
Before switching to the application builder and esbuild we used to have 326 chunks in total...
@dimeloper, unfortunately without a reproduction it’s hard to tell why the chunks count remained the same.
@dimeloper I ran a quick test using the environment variable setup you provided and I see it did not work. The issue is that it's not present for the Angular builder, or at least not present at the right time.
But there is an easy fix: instead of providing it to the executor using the define options, simply create a `.env` file and set the variable there (e.g. `NG_BUILD_OPTIMIZE_CHUNKS=1`).
Command
build
Is this a regression?
The previous version in which this bug was not present was
No response
Description
Disclaimer
This is not really a bug, as even if it worked differently with `webpack`, for `esbuild` this is normal. However, since the default bundler is now `esbuild`, some would consider this a regression.

Required chunk references in dynamically imported code get split into separate chunks
When splitting code, esbuild does not seem to properly take into account what is already statically imported (and thus "required") and does not need to be split into a separate chunk.
To illustrate this, let's consider an example where we have the following 5 standalone components:

- `AppComponent` (the root component, which is not lazy loaded)
- `Feature1Component` (a feature component lazy loaded via a dynamic import in the router)
- `Feature2Component` (a feature component lazy loaded via a dynamic import in the router)
- `Ui1Component` (a UI component imported statically, like a button)
- `Ui2Component` (a UI component imported statically, like a button)

Chunks without Shared Components
If the `AppComponent` has a `RouterOutlet` with separate paths for `Feature1Component` and `Feature2Component`, it will generate a chunk for the code shared between these components (let's call it the Common Chunk). In this case the build output is:
And if `Feature1Component` imports `Ui1Component` and `Feature2Component` imports `Ui2Component`, then the code from `Ui1Component` will be bundled into `Feature1Component` and the code from `Ui2Component` will be bundled into `Feature2Component`.
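To make the example concrete, the setup might look roughly like this (the file paths and selectors are assumptions; only the component names come from the example above):

```ts
// app.routes.ts: the feature routes are lazy loaded via dynamic import(), so
// esbuild treats each target as an additional entry point for code splitting.
import { Routes } from '@angular/router';

export const routes: Routes = [
  {
    path: 'feature-1',
    loadComponent: () =>
      import('./feature-1.component').then((m) => m.Feature1Component),
  },
  {
    path: 'feature-2',
    loadComponent: () =>
      import('./feature-2.component').then((m) => m.Feature2Component),
  },
];
```

And one of the feature components, importing its UI component statically:

```ts
// feature-1.component.ts: Ui1Component is imported statically, so (as long as
// nothing else shares it) its code ends up inside the Feature1Component chunk.
import { Component } from '@angular/core';
import { Ui1Component } from './ui-1.component';

@Component({
  selector: 'app-feature-1',
  standalone: true,
  imports: [Ui1Component],
  template: `<app-ui-1 />`,
})
export class Feature1Component {}
```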
Chunks with Shared Components
However, if the `AppComponent` also imports `Ui1Component` and `Ui2Component`, this code is now shared and will be split into separate chunks. Thus the build output will now be:
The issue is that `Ui1Component` and `Ui2Component` are now statically imported in the `AppComponent`, and therefore belong in this common chunk. These chunks are statically imported in the "common" chunk and are required for the application to bootstrap, but they are downloaded separately.

As they are always downloaded, splitting them does not seem to have a benefit: when `Feature1Component` or `Feature2Component` imports them, the request is cached and it should not make a difference if it over-imports. However, splitting them does have a downside: it increases the load time. Even though in this example the consequences of a couple of extra chunks are negligible, in larger apps the consequence is very real.

If the application produces 100 or 200 initial chunks, this will have a significant impact on the initial load time and LCP. Even though a modern browser using HTTP/3 or HTTP/2 uses multiplexing, it will still have significant overhead downloading so many small chunks. We are currently calling this the chunk gap:

Additionally, this issue will have a larger impact on older and slower devices.
Additional example of performance impact:
Avoiding Code Splitting Shared Code
It seems possible to avoid additional chunking in some cases by using barrel files.
For example, if we create a barrel file that exports both `Ui1Component` and `Ui2Component`, and only import them via that barrel file, we are able to force `esbuild` to place the shared code into the "common" chunk. So using:
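A minimal sketch of what that barrel file could look like (the file layout is an assumption, not the reproduction repo's exact structure):

```ts
// ui/index.ts: the barrel file; every consumer imports the UI components
// only through this file instead of from the individual component files.
export { Ui1Component } from './ui-1.component';
export { Ui2Component } from './ui-2.component';
```

`AppComponent`, `Feature1Component`, and `Feature2Component` would then all import from `'./ui'` rather than from the component files directly.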
The build output will now be:
Note

The `esbuild` architecture documentation explains that:

This means that doing so will partially opt out of tree shaking.
For example, if we add a `Ui3Component` and a `Feature3Component`, and we add the `Ui3Component` to the barrel file:
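A sketch of the extended barrel (same assumed layout as before):

```ts
// ui/index.ts: the barrel with the new component added.
export { Ui1Component } from './ui-1.component';
export { Ui2Component } from './ui-2.component';
export { Ui3Component } from './ui-3.component';
```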
If the `Ui3Component` is never imported anywhere, it will not increase the initial bundle. However, if we reference the `Ui3Component` in `Feature3Component` (loading the `Feature3Component` in a lazy route), this will increase the initial bundle even if `Ui3Component` is not used in the `AppComponent`.

Minimal Reproduction
https://github.com/ChristopherPHolder/ng-esbuild-demo
Exception or Error
Your Environment
Anything else relevant?
No response