TrilonIO / aspnetcore-angular-universal

ASP.NET Core & Angular Universal advanced starter - PWA w/ server-side rendering for SEO, Bootstrap, i18n internationalization, TypeScript, unit testing, WebAPI REST setup, SignalR, Swagger docs, and more! By @TrilonIO
https://www.trilon.io
MIT License

JavaScriptServices template uses Angular v4 and has vendor splitting #316

Closed consigliory closed 7 years ago

consigliory commented 7 years ago

This used to be the cutting-edge template with the latest and greatest, but I recently found that JavaScriptServices has vendor splitting and is using Angular version 4+. I know it's a silly question and I'm not following all the discussions, but I just want to ask: is there any special reason why the JavaScriptServices webpack config can't be used here to get the vendor split? Or are there issues with lazy-loaded components?
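For anyone reading later, here is a minimal sketch of the kind of vendor split being discussed, using webpack 2's CommonsChunkPlugin. It is not the actual JavaScriptServices or repo config; the entry paths and vendor list are assumptions.

```typescript
// webpack.config.ts — illustrative only; paths, entry names, and the vendor list
// are assumptions, not the real JavaScriptServices or repo config.
import * as path from 'path';
import * as webpack from 'webpack';

export default {
  entry: {
    main: './ClientApp/boot-client.ts',                                    // hypothetical app entry
    vendor: ['@angular/core', '@angular/common', '@angular/router', 'rxjs'],
  },
  output: {
    path: path.join(__dirname, 'wwwroot', 'dist'),
    filename: '[name].js',
  },
  resolve: { extensions: ['.ts', '.js'] },
  plugins: [
    // Pulls the modules referenced from the `vendor` entry out of main.js
    // into a separate vendor.js chunk (webpack 2/3 era API).
    new webpack.optimize.CommonsChunkPlugin({ name: 'vendor' }),
  ],
};
```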

MarkPieszak commented 7 years ago

@consigliory Hey Roman, apologies. I spoke with Steve last week about a few things regarding JSServices & this repo. At some point in the future they will be a little more aligned, with this repo having many more ASP.NET & Angular additions to it.

I'll add in a few things to speed up those HMR rebuilds; it was much faster previously, and I'll try to get back the speed we had before (when we were on 2.x). ngtools/webpack makes things a little bit tougher.

Sorry for the long wait!

TommyLCox commented 7 years ago

It is definitely tougher than it looks at first glance. I worked on it for a day, trying to model the webpack vendor file on JavaScriptServices, but that was before JavaScriptServices was using Angular version 4+, as @consigliory said above. The main problem I had was that once I took node_modules out of the main bundle, I couldn't seem to get the project to work again. I used DLLRef and tried to bundle all of the vendor code, much like JavaScriptServices does.

After speaking with Steve, and given that you guys already have a direction in mind, does that direction include looking at what JavaScriptServices is doing today? In other words, I'm wondering whether it would be worth my time to start again from JavaScriptServices and see if I can speed up my HMR rebuilds, or whether you have a whole different approach in mind? (The best way to webpack all of this isn't "intuitively obvious to the most casual observer".)
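For reference, a stripped-down sketch of the DllPlugin / DllReferencePlugin approach mentioned above (the two-config setup JavaScriptServices used around that time); file names and the vendor list are illustrative, not the repo's actual config:

```typescript
// webpack.config.vendor.ts — run once (or whenever packages change) to produce
// dist/vendor.js plus a manifest describing what it contains.
import * as path from 'path';
import * as webpack from 'webpack';

export default {
  entry: { vendor: ['@angular/core', '@angular/common', '@angular/router', 'rxjs'] },
  output: {
    path: path.join(__dirname, 'wwwroot', 'dist'),
    filename: '[name].js',
    library: '[name]_lib',               // must match DllPlugin's `name` below
  },
  plugins: [
    new webpack.DllPlugin({
      path: path.join(__dirname, 'wwwroot', 'dist', '[name]-manifest.json'),
      name: '[name]_lib',
    }),
  ],
};

// The app's webpack config then references that manifest instead of re-bundling
// node_modules on every rebuild:
//
//   new webpack.DllReferencePlugin({
//     context: __dirname,
//     manifest: require('./wwwroot/dist/vendor-manifest.json'),
//   })
```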

MarkPieszak commented 7 years ago

Unfortunately, yeah, there really is no easy way to do it, as there are a few limitations in place. The problem is that DllPlugin doesn't play well with AotPlugin, so it won't accept vendor chunks.

The config here isn't much different from JavaScriptServices; its main focus was just to bring AoT into the picture. I've been working on reverse-engineering it the other way, and so far the only decent result is using ATL (awesome-typescript-loader) for dev builds and AotPlugin for prod (with no vendor chunking there). This seems rather messy, though.
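A rough sketch of that dev/prod split (awesome-typescript-loader for fast dev rebuilds, @ngtools/webpack's AotPlugin for AoT production builds, no vendor chunking on the prod side); the entry module and tsconfig paths are assumptions:

```typescript
// Shared webpack config fragment, switched on NODE_ENV; illustrative only.
import * as path from 'path';
import { AotPlugin } from '@ngtools/webpack';

const isProd = process.env.NODE_ENV === 'production';

export default {
  module: {
    rules: [
      isProd
        // Prod: AoT-compile components/templates via @ngtools/webpack.
        ? { test: /\.ts$/, loader: '@ngtools/webpack' }
        // Dev: plain JIT build, much faster incremental/HMR rebuilds.
        : { test: /\.ts$/, use: ['awesome-typescript-loader', 'angular2-template-loader'] },
    ],
  },
  plugins: isProd
    ? [new AotPlugin({
        tsConfigPath: './tsconfig.json',
        entryModule: path.join(__dirname, 'ClientApp/app/app.module#AppModule'),
      })]
    : [],
};
```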

Other things I've been trying are combinations of AutoDllPlugin, ng-router-loader, and ngc-webpack.

I've spent an ungodly amount of time over the past 2 days trying some of these combos, with mixed results. Will probably get something in there soon for the time being.

TommyLCox commented 7 years ago

Thank you for the explanation, and especially for your efforts. I thought I was going to find the answer to this, but there is consolation in the fact that it isn't an easy thing to do, considering the newness of some of these technologies. It helps me to know that it wasn't just me.

The great thing about this overall project is that the two sites I have built have such outstanding performance both running locally and deployed to Azure. They are blazing fast. Looking forward to just speeding up the dev process a bit!

JohnGalt1717 commented 7 years ago

It's much worse than just AoT not working. Tree shaking is also broken even if you get AoT working. And production builds with AoT take upwards of 20+ minutes on any sufficiently complex website (there are multiple bug reports on this in the Angular repo).

If tree shaking worked, I would simply have a separate production webpack config that didn't do vendor splitting and rolled it all into one. But since it is broken, there's little point, because AoT doesn't buy you much.

All of these are known issues with the Angular and webpack teams, and nothing has been done for over 6 months to fix these basic show-stopper bugs in Angular. (This has nothing to do with .NET; it's in all versions of Angular, no matter what web host it runs on.)

This is why JavaScriptServices doesn't do AoT and tree shaking: it's broken. If these major show-stopper bugs were fixed, then JavaScriptServices would have it within a few minutes and we wouldn't need this repo at this point.

Someone on the Angular team needs to have their head rattled, stop worrying about new features, and actually fix these basic issues.

consigliory commented 7 years ago

@MarkPieszak @JohnGalt1717 Is it even worth having AoT if there are such issues with the vendor split? Before AoT, with the vendor split, my app was a tiny file; now I am forced to use lazy loading and the main entry file is still huge: 6.2 MB!!! An insane size for a file that doesn't even have the entire app in it, just the first bit.

JohnGalt1717 commented 7 years ago

Vendor split can be done in dev mode and AoT in production mode. I have it set up this way right now, but there are issues with AoT running out of memory (using more than 2 GB of memory to build!) that @MarkPieszak is looking into, and there are a ton of bug reports on the main Angular repo.

Further, by splitting out the vendor.css but not the .js, rendering looks better, which I have also done.
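For context, extracting vendor CSS into its own file typically looked something like this at the time (extract-text-webpack-plugin); the entry name and CSS paths are illustrative, not this repo's config:

```typescript
// Emits a standalone vendor.css so styles can be linked in <head> and render
// before the JS bundles arrive. Sketch only.
const ExtractTextPlugin = require('extract-text-webpack-plugin');

const extractVendorCss = new ExtractTextPlugin('vendor.css');

export default {
  entry: { 'vendor-styles': ['bootstrap/dist/css/bootstrap.css'] },   // hypothetical vendor CSS
  module: {
    rules: [
      {
        test: /\.css$/,
        use: extractVendorCss.extract({ fallback: 'style-loader', use: 'css-loader' }),
      },
    ],
  },
  plugins: [extractVendorCss],
};
```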

I also discovered the PurifyPlugin stuff and got another 25% decrease in the size of our main.browser.js, bringing it down to about 2 MB uncompressed, which is still ridiculous but does include all of the vendor stuff in production. (It still throws errors, however, so I can't use it yet.)

In a perfect world, all of this stuff would work with AoT, and the new CSS splitting with chunks in Angular would also work, so that when you create a chunk it would split not only the JS but also the .css file and load it separately. That would be huge for main.browser.js, because the CSS would all render before the page is displayed, making it look way better, and then each chunk would load 2 files in parallel, speeding things up.

But there are about 15 bugs between there and here.

LiverpoolOwen commented 7 years ago

By the sound of this conversation, it's going to be a long time until these issues are resolved. I do like Angular, but performance is the main reason I was going to use this template. It's no fault of this template, but might we be better off using React or another framework to get better performance?

JohnGalt1717 commented 7 years ago

The performance of Angular when fully set up is not a problem. It's the development and build process that is a horrible mess. React has its own mess that is no better: you lose nice binding and you gain the ugly jsx/tsx format inside what should be your code-behind. While at this point it's really the only other viable alternative, I wouldn't call it much of one.

  1. Webpack is a disaster for any project of any significant size. Because it can't do incremental builds, it's constantly recompiling everything; hence you have to do vendor splitting etc. instead of doing what compilers have done for decades and incrementally building. HMR makes this worse, not better, because it recompiles essentially everything in the webpack bundle every time you make a change.

  2. Webpack, TypeScript, and the entire linker/compiler process are broken and will run out of memory. Because there isn't a real linker, it tries to hold the entire result set of a project in memory in Node, which causes Node to crash when you add AoT and tree shaking on any significant project, because of Node's 2 GB limit (even in 64-bit versions).

  3. Webpack doesn't actually buy you much, especially in the world of HTTP/2. Its objective is to limit the number of files requested, but that isn't a significant performance bottleneck most of the time, and it can actually make things worse by increasing first-page load times, loading a ton of stuff that isn't necessary for that page just because it's bundled. It would be FAR better if we had a process that created JS bits, i.e. one that took your vendor files, tree-shook them, and generated stripped individual JS files deployed with just what was needed. Then the module for the current page could load what it needs, and so could the page.

  4. Angular needs to get over the whole "load the entire thing in the module just because one page in the module needs it" approach, and instead only load in the module what is required for the module itself to load and route to a page, which would then load what it needs if it isn't already in memory. This would completely minimize what is loaded, and every page would have its own .js, .css, and optionally .html file (a route-config sketch follows this list). With HTTP/2 there is no reason not to do this, because the overhead is minimal, and since so little would be loaded it would greatly outweigh HTTP/2's per-request overhead, which is almost zero.

  5. Angular should then opportunistically load any page that is linked to from the current page, once the current page has finished loading.

  6. A huge number of controls etc. don't work properly with server-side pre-rendering, and the whole hand-off process is flawed. (Cookies also don't work properly with SSR right now, so handling logged-in users etc. doesn't work well.) This could be fixed by getting the people who write the controls to actually test for it and stop accessing the damn DOM directly, but good luck with that.

By doing all of these things you wouldn't need modules that are lazy-loaded except in extreme cases, because everything would be lazy-loaded (like you could hack Angular 1 to do).
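The per-page loading described in point 4 maps roughly onto the router's lazy-loading syntax of the time; a hypothetical sketch, with route paths and module names made up for illustration:

```typescript
// app.routes.ts — each page/feature is only fetched when its route activates.
// With ng-router-loader (or angular-router-loader), webpack emits one chunk per
// loadChildren target.
import { Routes } from '@angular/router';

export const appRoutes: Routes = [
  { path: '', loadChildren: './home/home.module#HomeModule' },          // home page chunk
  { path: 'admin', loadChildren: './admin/admin.module#AdminModule' },  // fetched on first visit
];
```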

Basically, while Angular uses TypeScript, Google needs to learn from Microsoft and its excellent compiler technology too, and start stealing from it. Webpack is a bad idea whose time is over. If you're not using an HTTP/2-compatible server now, you soon will be, and every browser that matters supports it, so the entire concept is flawed.

Meanwhile SystemJS, which was a much-less-bad approach although still not what it should be, is being deprecated.

I'm hoping that Blazor turns into something real, because then we could use MVC pages as code-behind, like our TS components, but write C# code that builds into a module. If Sanderson gets the binding system right, this could be epic.

LiverpoolOwen commented 7 years ago

Hmmm, that was an eye-opener; I will do some more research into the technologies you mentioned. Thanks for the info @JohnGalt1717. I am just trying to find the ideal stack for the best user experience and performance on new projects, but it is so complicated with all of the newer tech coming out lately.

JohnGalt1717 commented 7 years ago

From what I can tell, there is no such thing, and I've researched all of them. The closest is Aurelia, but it's looking more and more like a dead duck now that its leader is working at MS.

And if you use older versions of things like Knockout, Durandal, or Angular 1, you're dooming yourself to obsolescence and a rewrite in about 2 years or less. (And if you go with MVC.NET or ASP.NET, you're losing a ton of features and making really great interactive websites incredibly hard to build.)

So you're stuck with a mess, which is why it's so annoying. React is a crazy rendering system with a ton of bolt-ons and all of the same webpack problems as Angular. Angular is an over-architected system based on a false premise of bundling to start with. Vue.js is Angular 1 done without polling, but most of the setups you'll find are again using webpack.

One interesting avenue is FuseBox (fuse-box), which is basically webpack but seems to be a lot less bad. (It still isn't good and still doesn't do things the way that would make sense, but it's less bad.) But of course then you're on the fringes, playing with tech that could go away at any time.

Which is why I hope Blazor takes off and MS adopts it in-house: a mainstream C#, WebAssembly-based SPA that starts out using MVC.NET's routing engine, code-behind, and page structure in MVC style, but is built using a proper modern linker and compiler (Roslyn). But that's going to take 6-8 months even if MS jumps on board fully, so in the short term you're screwed.

LiverpoolOwen commented 7 years ago

Yeah, I was thinking maybe I should just go back to KO and do things the old-school way and not have to worry about build systems, but that would make things harder to maintain, and projects might need rewriting in the near future. For now, I am going to use the JSServices template, as it's easier for dev, and just hope they sort the webpack issues out in the next few months.

MarkPieszak commented 7 years ago

@LiverpoolOwen Yes, JSServices would be fine for now. The goal of this repo now is to extend that one, as maintaining two separate repos is a pain and I unfortunately don't have the bandwidth I used to in order to work on everything separately. Ideally I want this to be an extension of JSServices, showcasing some missing and additional features, and maybe some other good improvements you could add to your app. We fixed the webpack setup there for development (something I once had here), but I never had time to go back and add it in.

As John suggested, I don't know if I'd recommend going back to older frameworks like jQuery etc., unless you are just extremely comfortable and quick developing with them, and SSR / bundle size / etc. don't matter to you. It all depends on what your goals are for the project.

SSR does make things quite difficult because isomorphic JavaScript is a relatively new frontier; a lot of the concepts of transferring and reusing data / keys / cache are new to most people, and definitely confusing. Really make sure your project needs SSR before going down that road, because you have to have a good grasp of how to properly inject libraries etc. so they only render in the browser.
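The usual pattern for "only render in the browser" is to guard DOM access behind the platform check; a small hypothetical example (the service name and the window call are made up for illustration):

```typescript
// Browser-only work under Angular Universal (platform-server): only touch
// window/document when actually running in the browser.
import { Inject, Injectable, PLATFORM_ID } from '@angular/core';
import { isPlatformBrowser } from '@angular/common';

@Injectable()
export class BrowserOnlyService {
  constructor(@Inject(PLATFORM_ID) private platformId: Object) {}

  init() {
    if (isPlatformBrowser(this.platformId)) {
      // Safe to use window/document (or a DOM-dependent library) here;
      // on the server this branch is simply skipped.
      window.scrollTo(0, 0);
    }
  }
}
```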

On a more positive note regarding SSR, we are slowly working on creating a type of proxy around window etc., so (hopefully sometime soon) Angular's platform-server will be able to handle most (but probably not all) libraries that make extensive use of window/document/etc. without you having to worry about dependency-injecting them. Libraries like jQuery will then hopefully "just work", as the server won't completely break when it sees them. This will obviously slow performance slightly, so as always it's a pros/cons decision whether you'd want to do that.

@JohnGalt1717 I still need to look at your repo, John, apologies for the delay. I haven't heard of anyone else having memory issues; there might just be a memory leak in your app (or a circular dependency).

As for webpack, it's anything but dead, as it continuously gains sponsorship from many large companies. It'll improve more in time, as will Angular. The benefit Angular has is that it's truly a compiler, so it can change and adjust with the times. Things take time; these teams are all brilliant and dedicated, but things can't happen overnight 😒

And even in the world of HTTP/2 you'll still need a bundler like webpack, because you need it to handle the injector and dependency tree, not to mention compilation and everything else.

I would agree about other frameworks and options, as each one is made for different needs. There is no winner-takes-all; everything is a give or take.


The other option is also to just use the CLI and keep your dotnet app as your REST API only. That makes things significantly easier as well. (We added experimental support for Universal in the CLI now too, with just a small Node server.)


I promise I'll spend at least 10-20 hours this week cleaning everything up; there's never enough time in life 😢

LiverpoolOwen commented 7 years ago

I know how you feel there, mate. I have been hitting the overtime as we are getting closer to a release at work, so I have not had any free time 👎. It is unfortunate, as I want to keep contributing to this project.

In regards to using older frameworks, I have a few projects in mind, but I am really just trying to get the best understanding of this tech stack before I make a choice on what I want to be using.

Some of the issues @JohnGalt1717 has mentioned are new to me, as I am not clued up on the details behind AoT and tree shaking, tbh. They seem to be major issues which are going to affect the success of this and the JSServices templates for now; as you said, it's not an overnight thing, I guess.

If there is anything I can help with let me know.

Thanks for your time @MarkPieszak and @JohnGalt1717.

JohnGalt1717 commented 7 years ago

@MarkPieszak We're 100% behind you and we feel for you. The problem is that the people working on webpack and Angular appear to have missed the past ~45 years of programming-tools development and are reinventing the wheel for zero benefit, and in the process putting us back decades because of that failure to understand problems that were solved long ago:

  1. In Angular 1 with TypeScript we had a poor man's version of incremental builds: Compile on Save. Save the file, and it recompiles just that file. The CLI will sort of do this but isn't really designed for it, and the CLI takes upwards of 4 minutes to rebuild my entire basic solution. Webpack makes this exponentially worse.

Because webpack has no notion of incremental compiles, nor any of the tooling to do them correctly, everything is a complete rebuild of whatever webpack.config.js file is in the scope of the file change. The entire notion of vendor.js exists solely to work around webpack's complete lack of understanding of basic compiler techniques that have been around for decades. If webpack were MSBuild, it would look for changes and use the obj files to compile just the changeset, and do so instantly, because none of the vendor stuff would have changed; and when the vendor stuff did change, it would compile that and all of the dependencies of that specific vendor module. Webpack is in the dark ages and can do no such thing. The CLI can't do that either. So we have a hack to fix webpack's fundamental flaw, and that flaw causes almost all of the pain.

Fix this, and you fix the vast majority of the problems that Angular has from a pure development perspective (not from a release perspective, however).

  2. CLI/webpack does not have a linker. If you're using C++, Java/Scala, .NET, or even VB6 (or 5, 4, 3), you can link files. That is, you can create discrete projects that link all of the code required for that set of code to run; and by "all of the code" I mean just the exact code required from all dependencies for your code to execute. While there are exceptions (reflection, where you have to explicitly tell the linker what you use), this just works and results in TINY single-file executables if you want them. This is what tree shaking is trying to emulate and completely fails at, because there are no obj files to do the linking quickly; as a result, webpack loads the ENTIRE SOLUTION into memory to tree-shake it, and does a horrible job of it while it's at it (which causes the heap memory overruns).

  3. The fundamental assumption of webpack is outdated and irrelevant. The concept is that fewer HTTP requests result in faster load times. This was true under HTTP/1 to some extent, but not under HTTP/2. And this is made even worse by Angular's assumptions about how module loading works.

For instance: I use Kendo's grid in many lazy-loaded modules. (This is just an example; I have yet to find JS libraries for Angular 2+ that don't have this issue.) Our main module is forced to load it, otherwise you get errors when one child module is loaded and then a second one is. It should never have to load a dependency until such time as it's actually required, and once it's loaded I should never have to load it again.

For instance: the assumption is that on first load you load ALL of the dependencies (hopefully tree-shaken). Except the vast majority of these vendor files are not needed to load a home page.

Our site's home page and all publicly facing pages require:

bootstrap.css
Angular core (including SSR)
Our sitewide app.component.css
Our sitewide app.component.js
The home page's css and js

NOTHING else.

Except no matter how you slay this dragon (and I've tried the default templates using Node.js instead of .NET etc. too), our home page takes 6.5 MB of data to load (before gzip), for a page that is about 100 KB minimized, also before gzip.

Even under HTTP/1, webpack is a MASSIVE loss. The overhead of HTTP/1 is maybe 100 KB, versus 6.3 MB of overhead courtesy of webpack and its assumptions.

HTTP/2's overhead? Almost zero, and the files can all be loaded simultaneously, because the compiler should load the page requested, and that page's .js should have compiled-in imports for everything it needs and load them. It should do so from linked, declared dependencies; that is, the .js should list all submodules of all dependencies at the top of the file so everything just works. It should also have split up monolithic libraries into small pieces that are either unique to the page or shared by multiple pages.

With the exception of polling for changes in Angular 1, in most cases Angular 1 will be VASTLY faster to load and work with pages than Angular 2, and Angular 1 still pre-loads stuff it shouldn't need to.

The assumption that modules should be loaded in bulk is just silly. The page IS THE MODULE. Yes, your site has a shared module for the chrome and global dependencies, but every page is its own world. It may pull in other components that are shared across pages, but the lazy loading should be per page, because the user loads one page at a time. If they need more, they grab more. Lazy, opportunistic loading could be used to load JS in the background for pages not yet clicked on, but even that is likely a bad idea, because people on 3 Mbit DSL with a few tabs open could have all of their bandwidth starved by pages doing this. (Yes, something like 60 million Americans still have 3 Mbit/sec DSL or less, and the rest of the world is VASTLY worse.) And right now, if you try to create a module per page, you're back to the above, because you have to pre-load most of your dependencies in your root module with .forRoot() instead of them being loaded opportunistically.
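For readers unfamiliar with the .forRoot() point, a hypothetical illustration of the pattern (module and package names are made up): library providers are registered once at the root, which tends to pull that library into the initial bundle, while lazy feature modules import the plain module or use forChild().

```typescript
import { NgModule } from '@angular/core';
import { RouterModule } from '@angular/router';

@NgModule({
  imports: [
    RouterModule.forRoot([]),
    // SomeUiLibModule.forRoot(),   // hypothetical vendor module: providers + code land in the root bundle
  ],
})
export class AppModule {}

@NgModule({
  imports: [
    RouterModule.forChild([]),
    // SomeUiLibModule,             // lazy modules re-import the module; providers stay root-level
  ],
})
export class LazyPageModule {}
```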

All dependencies should be shared so that they're available immediately to all subsequent page requests, and all components and pages should be isolated so that they can't mess with other pages.

I would LOVE to see Angular 4 done without webpack, with an entry-point .js file that loads just the basics and then loads the required page based on the routing, with every page lazy-loaded individually, and working with SSR and no webpack. I guarantee that for the vast majority of sites the first page load would be VASTLY faster, and subsequent page loads wouldn't be noticeably slower either.

The perfect solution (to me) is that at debug time node_modules is used in the browser as well as on the server, and at release a compiled and trimmed node_modules is deployed that has nothing but the release files, with everything else dumped. (Those release files should have all of the unused code stripped and the remainder minimized as part of the release process; do this in debug too if you want, because with a proper compiler with incremental builds and linking it wouldn't be an issue to detect vendor changes.)

Then the loader can load directly from there, and those modules never need to be compiled or packed again (or packed at all).

The result would be that HMR could reload just the page, or the component on the specific page, recompiled on save of any files associated with it, thus eliminating the slowness without a vendor.js file. Development would be speedy, and each page would load exactly what it needed and nothing else. If something was already in memory it would be reused; if not, it would be loaded as part of the HTTP/2 request package for the page, resulting in almost no overhead or delays.

At 3 Mbps that is 2 seconds saved loading our site's homepage, and every page after that is perhaps 50-60 KB loaded as needed instead of a monster, so the user perceives almost no difference on subsequent page loads and a MASSIVE difference on first load.

This is how you build REALLY BIG SITES, just like it's how you built really big desktop apps (C# with MEF, loading dependencies on demand). These are the lessons that apparently everyone forgot in the Angular/webpack (or even React/webpack) world.

Meanwhile, and while it's just a proof of concept the Google people should be very concerned with and LEARN FROM, Blazor is < 600 KB loaded, they haven't even enabled linking yet, and it compiles almost immediately. Add MVC.NET routing that works client-side, plus linking, and you'll be able to load a site in less than 500 KB with modularized pages and get almost all of the functionality of C# running natively in a browser as an SPA. This is because Sanderson is building on 45 years of development instead of trying to reinvent the wheel.

If you know of an example that does SSR and loads each page on demand with shared dependencies, I'd love to see it, and I expect the vast majority of people here would love to see it too, because webpack is a plague on projects. It's both horribly behind history and horribly complex, to the point where the vast majority of developers don't understand it and don't have the week-plus it takes most devs just to get good at it. (I'm the only person here allowed to work on it, because no one else is safe on it or really understands it, and I have some of the best people in the industry working for me.)

JohnGalt1717 commented 7 years ago

@MarkPieszak

Here's a short list of javascript heap issues:

https://github.com/angular/angular-cli/issues/5618
https://github.com/AngularClass/angular-starter/issues/1498
https://github.com/angular/angular-cli/issues/3163
https://github.com/angular/angular-cli/issues/3099
https://github.com/angular/angular-cli/issues/1652
https://github.com/angular/angular-cli/issues/3651
https://github.com/angular/angular/issues/12184

None of these had solutions or usable workarounds as of about 3 weeks ago. I haven't checked since then, though.

Additionally, if you use some Google-fu you'll find hundreds of Stack Overflow and other comments about it.

If this is a circular reference, then like the .NET compiler it should either automatically resolve it (99% of the time) or throw and tell you exactly where it is, not just blow up chasing an endless stack.

Also, my point about webpack is that we shouldn't need an injector/dependency loader per se. We need a glorified require.js, and the compiler should provide the hints on what needs to be loaded at the top of the page's xxx.component.js file, and that's it. All pages should be lazy, and all pages should declare exactly the information necessary for the page to execute properly right at the top (which they basically do already in the imports; we just need to get the full dependency graph in there too). The key is to differentiate between global dependencies that get loaded on demand (not until a component actually needs them!) but are then automatically shared with all subsequent page loads, and components that are isolated.

If the compiler simply injected the dependency graph for the loader and then rolled it up to the primary parent page controller based on the routes, this all gets solved.

MarkPieszak commented 7 years ago

Closing as #376 has been merged. @consigliory HMR is now all set up within the project, as well as AoT etc.