GoogleChromeLabs / worker-plugin

👩‍🏭 Adds native Web Worker bundling support to Webpack.
https://npm.im/worker-plugin
Apache License 2.0

splitChunks plugin does not work with worker-plugin, bundles are huge and rebuilding is slow #48

Open eranimo opened 4 years ago

eranimo commented 4 years ago

It appears that when using this plugin, vendor bundles are only extracted via splitChunks from the main-thread code, not from the worker code. Configuring splitChunks manually did not solve the issue.
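For context, a minimal setup exhibiting the report above might look like the following (entry and file names are illustrative assumptions, not taken from the reporter's project). The vendor cacheGroup applies to the main compilation; modules reached only via `new Worker(...)` are bundled by WorkerPlugin's child compilation and are not split against it:

```javascript
// Illustrative webpack config: WorkerPlugin plus a splitChunks cacheGroup
// that extracts node_modules into a shared "vendors" chunk.
const WorkerPlugin = require('worker-plugin');

module.exports = {
  entry: './src/index.js', // assumed entry point
  plugins: [new WorkerPlugin()],
  optimization: {
    splitChunks: {
      cacheGroups: {
        vendor: {
          // Split all node_modules code into a vendor chunk, but only
          // for the main compilation; worker code is bundled separately.
          test: /[\\/]node_modules[\\/]/,
          name: 'vendors',
          chunks: 'all',
        },
      },
    },
  },
};
```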

developit commented 4 years ago

Hi @eranimo. This does sound like a painful experience. Do you happen to have a way to create a minimal reproduction? My inclination would be to drop WorkerPlugin into a large preexisting Webpack config like Next.js, but I'm not sure if that will produce the issues you outlined without having a relatively large application under test.

FWIW I don't think Webpack is capable of code-splitting across Worker boundaries. Doing that would require Webpack to understand the trade-offs of grouping Worker dependencies with main-thread ones, since that risks loading Worker-only code on the main thread just to meet splitChunks' configured threshold values. It would also seem to require an additional argument to the name() config function indicating which context a dependency/module is being loaded from, since sharing dependencies between workers runs the same risks as sharing with the main thread.

This is all potentially solvable, but it makes me think the best way forward here would be to allow configuring splitChunks via WorkerPlugin, or attempting to "inherit" configuration from the main compilation (essentially extending webpack's config format for the purposes of this plugin).
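To make the idea concrete, a purely hypothetical sketch of what either direction might look like. Neither option below exists in worker-plugin today; the option names are invented for illustration only:

```javascript
// HYPOTHETICAL: neither `splitChunks` nor `inheritOptimization` is a real
// worker-plugin option; this only sketches the two directions discussed.
const WorkerPlugin = require('worker-plugin');

module.exports = {
  plugins: [
    new WorkerPlugin({
      // (a) explicit splitChunks config forwarded to the worker's
      //     child compilation, independent of the main compilation
      splitChunks: {
        cacheGroups: {
          workerVendor: {
            test: /[\\/]node_modules[\\/]/,
            name: 'worker-vendors',
            chunks: 'all',
          },
        },
      },
      // (b) ...or instead, inherit optimization settings from the
      //     main compilation's config:
      // inheritOptimization: true,
    }),
  ],
};
```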

Regarding rebuild speed, that's a little odd to me. With this plugin you're certainly paying the cost of a second compilation instance and a second parsing pass, but for modules shared between the main thread and the worker there should not be any additional loader execution. This makes me wonder whether you have loaders in your config that are not marked as cacheable, which would certainly inflate build times (potentially doubling them).
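For reference, a minimal sketch of a cacheable loader (the loader and token names are illustrative). Webpack loaders are cacheable by default as long as they are deterministic; a loader that calls `this.cacheable(false)` is re-run on every rebuild, which inflates watch-mode times, especially when both the main and worker compilations invoke it:

```javascript
// Minimal sketch of a webpack loader that stays cacheable.
function replaceFlagLoader(source) {
  // Explicitly opt in to caching (this is also the default). A loader
  // with non-deterministic output must call this.cacheable(false)
  // instead, at the cost of re-running on every rebuild.
  if (this.cacheable) this.cacheable();
  // Deterministic transform: same input always yields the same output,
  // which is the requirement for a loader to be safely cached.
  return source.replace(/__BUILD_FLAG__/g, 'true');
}

module.exports = replaceFlagLoader;
```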