patrick-steele-idem / morphdom

Fast and lightweight DOM diffing/patching (no virtual DOM needed)
MIT License

Walking through the DOM is slow #2

Closed trueadm closed 9 years ago

trueadm commented 9 years ago

You state that walking through the DOM isn't slow, but in reality it really is. Accessing things like firstChild or childNodes is vastly slower than storing a light-weight representation of the DOM (aka a virtual DOM).
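The contrast being described can be sketched with plain objects (the `h` and `countNodes` names here are hypothetical, not any particular library's API): traversing nested JS objects stays inside the JavaScript engine, whereas `firstChild`/`childNodes` access crosses into the DOM.

```javascript
// Minimal sketch of a "light-weight representation": nested plain objects,
// so traversal never touches DOM APIs. Names are hypothetical.
function h(tag, key, children) {
  return { tag, key, children: children || [] };
}

// Walking plain objects is cheap JS property access, unlike
// el.firstChild / el.childNodes, which cross into the browser's DOM layer.
function countNodes(vnode) {
  let n = 1;
  for (const child of vnode.children) n += countNodes(child);
  return n;
}

const tree = h('ul', null, [h('li', 'a'), h('li', 'b'), h('li', 'c')]);
console.log(countNodes(tree)); // 4
```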

Furthermore, handling of keyed nodes becomes problematic when using DOM nodes. There simply isn't any performant way of doing this when using the DOM as the source-of-truth.

Have you seen this too? http://vdom-benchmark.github.io/vdom-benchmark/

Furthermore, you shouldn't really compare yourself to the virtual-dom library and present it as the de-facto virtual DOM standard; it's one of the slowest virtual DOM implementations out there in Safari (it's noticeably faster than morphdom in Chrome according to your own benchmark).

patrick-steele-idem commented 9 years ago

Hi @trueadm, I spent a lot of time benchmarking and profiling the code for morphdom and other libraries. Most micro benchmarks like the one you shared don't capture the complete story and can be misleading. Micro benchmarks tend to inflate things that really don't matter in practice. We should definitely pay attention to micro benchmarks, but what matters more is how things perform in a real application. I completely concede that morphdom will often be slower than a similar virtual DOM based solution (although morphdom was often faster as well...).

For morphdom I found that the slowest part of the algorithm was actually looping over the attributes on a DOM node (which seems like a failure of the browser). While I was initially discouraged, I went on to do further testing to see how things performed in a more real-world situation. For that, I looked at the entire rendering and updating pipeline (it matters how long it takes to render the new DOM nodes and it also matters how long it takes to update the existing DOM).
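The attribute loop being described can be sketched roughly as follows. This is a simplified illustration, not morphdom's actual implementation, and `fakeEl` is a hypothetical stand-in for a DOM element so the sketch runs anywhere:

```javascript
// Hypothetical stand-in for a DOM element, mimicking the attribute API.
function fakeEl(attrs) {
  return {
    attrs,
    getAttribute(n) { return n in this.attrs ? this.attrs[n] : null; },
    setAttribute(n, v) { this.attrs[n] = v; },
    removeAttribute(n) { delete this.attrs[n]; },
    hasAttribute(n) { return n in this.attrs; },
    get attributes() {
      return Object.entries(this.attrs).map(([name, value]) => ({ name, value }));
    },
  };
}

// Simplified sketch of attribute morphing: copy changed/new attributes
// from the target, then drop attributes the target no longer has.
function morphAttrs(fromEl, toEl) {
  for (const { name, value } of toEl.attributes) {
    if (fromEl.getAttribute(name) !== value) fromEl.setAttribute(name, value);
  }
  for (const { name } of fromEl.attributes) {
    if (!toEl.hasAttribute(name)) fromEl.removeAttribute(name);
  }
}

const from = fakeEl({ class: 'old', id: 'x' });
const to = fakeEl({ class: 'new' });
morphAttrs(from, to);
console.log(from.attrs); // { class: 'new' }
```

On a real element, each pass over `el.attributes` is a DOM call per attribute, which is the per-node cost being described.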

I chose to do a direct comparison with React due to its popularity. I paired morphdom with Marko Widgets and built a real world application of rendering a page of 100 search results. I found that for server-side rendering the Marko Widgets app was 10x faster than React (which I think is huge and very significant) and that was largely due to the fact that it is faster to stream an HTML string than to produce virtual DOM nodes that have to be serialized. In the browser I found that Marko Widgets was able to update the DOM in almost the identical time as React (varied by browser...). Please take a look at those results if you haven't already: Marko vs React: Performance Benchmark

I honestly think as browsers evolve the diffing of a real DOM will continue to get much faster. As mentioned in the docs for morphdom, I do believe there are a lot of other extra benefits that go with avoiding the virtual DOM abstraction. Hopefully browser vendors will continue to improve the performance of the DOM, but in the mean time I am still very much satisfied with the performance of morphdom.

trueadm commented 9 years ago

You're right that (well, we both hope) browsers will get better at the DOM abstraction and improve performance there. In regards to the micro-benchmark talk, I don't believe that's entirely the case – the fact of the matter (which no one can even debate) is that creating and accessing literal objects is the most efficient way to handle anything in JavaScript. React was a poor implementation of that because it introduced large prototype chains and never really considered the realities of real world applications in terms of static vs dynamic.

The problem, unfortunately, is that you're replicating the actions of a virtual DOM while using the DOM as the source-of-truth. This isn't needed and isn't ever the case in real applications. Take my experience and environment (I write applications for the financial trading market), where around 60% of the DOM is static – and this is actually very low. Or take the dbmonster benchmark: you'll see that only 35% is static; yet if you conceptualise the "application" rather than the nodes in the DOM, you'll notice that most of it breaks up into "fragments". Take this concept (as my Inferno library does) and you'll see vast performance benefits over anything out there (by a huge margin too).
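The "fragments" idea can be sketched roughly like this (a hypothetical API, not Inferno's actual internals): the static skeleton is created once, and each update only visits the recorded dynamic slots, so static content costs nothing per update.

```javascript
// Hypothetical sketch of a compiled "fragment": static parts are rendered
// once; updates only diff the dynamic slot values, never the static markup.
function createFragment(staticParts, dynamicCount) {
  const slots = new Array(dynamicCount).fill(null);
  let writes = 0; // counts actual writes, to show that unchanged slots cost nothing
  return {
    staticParts, // rendered once, never revisited on update
    update(values) {
      for (let i = 0; i < slots.length; i++) {
        if (slots[i] !== values[i]) {
          slots[i] = values[i];
          writes++;
        }
      }
    },
    get writes() {
      return writes;
    },
  };
}

const frag = createFragment(['<td>', '</td><td>', '</td>'], 2);
frag.update(['1.01', 'up']); // both slots written
frag.update(['1.02', 'up']); // only the changed price slot is written
console.log(frag.writes); // 3
```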

I feel too many developers are jumping out there with one agenda simply to attract a certain crowd. This is not only damaging our scene, but it's not even remotely productive. There's value in mixing both virtual DOM (ignore the coined phrase, there's much more to it at low level) and the actual DOM itself; let's not distance ourselves from our true objective and that is to make web applications faster.

patrick-steele-idem commented 9 years ago

In our case at eBay we are using a high performance templating engine (Marko) that works really well on both the server and in the browser. Marko, of course, renders HTML, not a virtual DOM. We toyed with the idea of updating the Marko compiler to produce compiled templates that render virtual DOM nodes (or incremental DOM), but we felt the performance would have suffered based on early metrics. Our motivation for creating morphdom was to be able to utilize DOM diffing for updating the DOM with the minimal number of changes and to have it work with a templating engine that renders HTML. We found that a DOM diffing solution based on the real DOM provided the best of all worlds for us (great server-side performance, great client-side performance and smart updates to the DOM). Maybe morphdom is not for everyone and that is fine... Would I use morphdom if a large part of the DOM was being rerendered every 10ms? Probably not. However, that is not our use case and I doubt it is the use case for 99% of applications. On a related note, if a UI component framework makes it easy to only render the parts of the DOM that may have changed, then the performance of the DOM diffing solution matters even less.

I hope that provides more context for why morphdom was created and some of my motivations. I welcome any discussion on morphdom or any other DOM diffing/patching solution and, more importantly, I am always very interested to see how they are integrated into real applications. Thank you for providing your feedback and sharing your experience. We should continue to evolve technologies in this space (how about Virtual DOM to real DOM diffing?) and I look forward to more discussions.

kesne commented 9 years ago

I think what @patrick-steele-idem said about performance is really important. It's entirely likely that morphdom is not the fastest diff/patch library out there, but that really doesn't matter for the practical use case.

I'll give the example of how I'm currently implementing it, and hopefully that will give some insight.

I've been working on a project for the last year that is built on a HTML string renderer (Dust.js). While we would love to change to a new templating engine that returns virtual-dom, it isn't really practical for the size of our application. What we currently do is parse our HTML string into a DOM node, then convert it into a virtual-dom representation.

This is where Morphdom can give us a performance boost. The virtualize step from the DOM node to VDOM already requires us to walk all of our nodes and visit all of our attributes, so putting that step into the same diff step just makes sense. In fact, we've been working on a library almost identical to this one for the last few weeks just to try to increase our performance.
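The merge being described can be sketched with plain objects standing in for DOM nodes (this is a conceptual illustration, not the actual library code): instead of first converting the live tree into a separate vdom structure and then diffing, the diff walks the live tree directly against the parsed target in a single pass.

```javascript
// Conceptual sketch: diff a live tree directly against a target tree in
// one walk, patching in place (morphdom-style). Plain objects stand in
// for DOM nodes; child insertion/removal is omitted for brevity.
function morphText(fromNode, toNode, patches = []) {
  if (fromNode.text !== toNode.text) {
    patches.push({ node: fromNode, text: toNode.text });
    fromNode.text = toNode.text; // patch in place, no intermediate vdom built
  }
  const n = Math.min(fromNode.children.length, toNode.children.length);
  for (let i = 0; i < n; i++) {
    morphText(fromNode.children[i], toNode.children[i], patches);
  }
  return patches;
}

const live = { text: 'a', children: [{ text: 'b', children: [] }] };
const target = { text: 'a', children: [{ text: 'B', children: [] }] };
const patches = morphText(live, target);
console.log(patches.length, live.children[0].text); // 1 B
```

The saving is exactly the one described above: the separate virtualize pass over every node and attribute disappears into the diff walk.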

trueadm commented 9 years ago

I can also give a use-case where React wasn't fast enough for us – and no other framework was either. I work in the world of financial trading applications, where a price being off by 250ms is a critical bug (it could cost banks millions if it's off). Thus it's massively important for us to ensure our applications (on the web) update as fast as practically possible. When you have to update 250-500 elements on a page every 250ms, it becomes somewhat of a burden for many of the implementations out there.

You might say this is an isolated problem for us, but it's not at all. Almost every single trading application on the web has to either limit updates to 1s (which clients are unhappy about) or code their entire trading price layer in vanilla JS implementations. Given that trillions of dollars are passed through trading applications every day, this is very much a real-world problem that countless banks and tech companies have tried to solve.

If the real reason for morphdom is that it's designed to support legacy applications still using jQuery (which is understandable) then that is completely different from the pitch in the readme, where it comes across as morphdom being best in class for performance compared to virtual DOM.

kesne commented 9 years ago

@trueadm That may be the case, but then I would say that morphdom isn't the library for you? I have no affiliation with the project at all so take that with a grain of salt, but if your application needs that level of performance then I would say look elsewhere.

trueadm commented 9 years ago

@kesne this topic was in regards primarily to where the readme states that "No, the DOM data structure is not slow.". I completely disagree with this statement. The DOM data structure is very slow, regardless of how and when you access it. I know this from my experience building Inferno – which, as I stated above, is a very unique twist on the whole "virtual DOM vs DOM" problem.

patrick-steele-idem commented 9 years ago

@trueadm, you definitely have a very demanding use case. I would argue that if performance is of the utmost concern then I wouldn't even use a DOM rerendering and DOM diffing strategy, and would instead use the old school way of manually updating the DOM (which you can always make the fastest if you really try).

We've also run into situations where there is a hot code path for updating part of the DOM and we needed that code to be as fast as possible. We chose to offer a clean solution for reverting to manually updating the DOM to avoid the cost of DOM rerendering and diffing in the extreme case where performance really mattered. See Stateful Widget with Update Handlers
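The "manual update on a hot code path" idea can be sketched like this (hypothetical names; `priceEl` stands for a cached element reference, and the object shape is not Marko Widgets' actual API):

```javascript
// Hedged sketch: keep a direct reference to the one node that changes
// often and update only it, skipping rerender + diff entirely.
function makeTicker(priceEl) {
  return {
    setPrice(value) {
      // One targeted write instead of rerendering the whole widget.
      priceEl.textContent = value.toFixed(2);
    },
  };
}

// Stand-in object for a cached DOM reference so the sketch runs anywhere.
const priceEl = { textContent: '' };
const ticker = makeTicker(priceEl);
ticker.setPrice(101.2345);
console.log(priceEl.textContent); // 101.23
```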

I think we can agree that every application is different and there are multiple ways to solve performance problems. I'm glad your virtual DOM solution has worked well for you and I am glad you are trying to make the web faster. Sharing ideas and solutions will ultimately be better for everyone.

trueadm commented 9 years ago

@patrick-steele-idem indeed, this is very much the reason why I'm even bothering to comment on this. It's not to say "hey your code is shit", which is the common criticism on GitHub these days. It's more to point out some flaws in a) your readme's statement regarding handling of the DOM, b) your performance in regards to the handling of keyed nodes and, most importantly, c) to talk and discuss.

We are a collective. We're intelligent people working on a common problem. Rather than simply giving the "hey this library isn't for you, maybe you should do X instead" response and avoiding the use-case problem, we should be collaborating and working on a solution that does fix the problem in 99% of all cases, rather than assuming our solutions automatically fit 99% of all use cases.

I'm not alone on this quest; many others are too. I guess people are simply fed up with being told "this tool is not right for you, you instead must integrate tool A, tool B, tool C, then write your own tool D and then you'll be okay! Bye!". Funnily enough, it seems the JavaScript community has a contagious disease where this is completely acceptable to say – you won't find that in other programming communities (Rust is a very good example).

patrick-steele-idem commented 9 years ago

@trueadm "slow" is not well defined and will depend on the application. The DOM might be considered slow in your use case, but I don't consider it slow in our use cases. You might be correct that real DOM diffing is slower than virtual DOM diffing, but there are a lot of other things at play (different browsers, size of libraries, UI frameworks, templating engines, server-side rendering vs client-side rendering, etc.).

trueadm commented 9 years ago

@patrick-steele-idem true, but typically lists of data (the most common use-case for any useful templating engine, really) depend on the data model being sorted. If that data model changes regularly and its keys (or sorted items) change often, then you have a difficult problem to solve. Like I said in my email, this is a problem that is far bigger than a GitHub issue and it involves many competent minds to fix, which is why there's a Skype group set up that I'd love you (and others!) to join.
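The keyed-reorder problem can be sketched with plain objects (`keyedReorder` is a hypothetical helper, not morphdom's keyed algorithm, though morphdom does support keying elements, by `id` by default): matching children by key lets a re-sort move existing nodes instead of rewriting their contents.

```javascript
// Sketch of keyed reconciliation: reuse the existing child for each key
// so a re-sorted list moves nodes rather than recreating them.
function keyedReorder(oldChildren, newKeys) {
  const byKey = new Map(oldChildren.map((c) => [c.key, c]));
  // Reuse the existing node for each key; create only when the key is new.
  return newKeys.map((k) => byKey.get(k) || { key: k, created: true });
}

const old = [{ key: 'a' }, { key: 'b' }, { key: 'c' }];
const next = keyedReorder(old, ['c', 'a', 'd']);
console.log(next[0] === old[2], next[2].created); // true true
```

Doing this lookup against live DOM children is where the real-DOM version gets expensive: every key read and every move is a DOM call, whereas the object version above is pure JS.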

It might not be a big use-case for eBay, but why limit our implementations to simply what our companies desire, why not make our implementation scale to what other companies might desire too?

patrick-steele-idem commented 9 years ago

Sounds good @trueadm. Let's discuss over Skype. I'm also really performance minded so your concerns/comments are definitely not falling on deaf ears. I'm going to close this issue because I don't think there is anything that we can act on for this project (except maybe update the README to address concerns?), but I'm not closing it to end the conversation.

DarkMarmot commented 9 years ago

@patrick-steele-idem I can't wait to see if I can incorporate this into my framework -- component-based like React but with a declarative, reactive system and no virtual DOM. It was built primarily to allow large teams to compose interactive data visualizations (so D3 is in heavy use).

@trueadm I don't understand your use case. I come from a background in video game programming... where updating 500 elements in 250ms is simply nothing for modern hardware. If that's the most important aspect of your application, even sticking with a browser it should be straightforward to move to canvas with monospace fonts and hit 60fps. Why even bother with the DOM if render speed is of such importance?

trueadm commented 9 years ago

@DarkMarmot Two problems exist there:

On that note though: If all the major banks were happy with having the same application styled in the same theme, my life would be vastly simpler!

DarkMarmot commented 9 years ago

@trueadm ouch. enough drop shadows and layered transparencies in a client theme could devastate any system you devise... and IE8, wow. Thankfully, even the conservative Fortune 100 I work at uses Chrome now. Props to you for devising solutions there, but I would go crazy dealing with IE :)

ioquatix commented 4 years ago

It's 5 years later. Do the assumptions in these discussions still hold true? Is the DOM faster? Slower? Did virtual dom get better? Relatively? Absolutely? I really appreciate the discussion here and so it would be interesting to see how things have changed in browsers over the past 5 years.

Bobris commented 4 years ago

Not much has changed. The real DOM's slowness is inherent in its design, so it cannot really get faster. trueadm now works on React, but React's raw speed has improved only very slightly in 5 years – it is not really a goal. The worst IE you need to support is now IE11, much better than IE8 or IE9, but still crap.

ioquatix commented 4 years ago

morphdom still seems to serve an important purpose though. Not everyone wants to maintain a virtual DOM. It's nice that it's still maintained.

snewcomer commented 4 years ago

For context, this library is at the core of Phoenix LiveView (LiveWire for Laravel). The total diff is much smaller than in equivalent SPA apps, since users server-side render their page and sprinkle small reactive parts into it. This makes this library a perfect candidate, since a v-dom only wins once I/O is swamped (due to large changes); otherwise the real DOM is on par with or better than a v-dom.

https://github.com/phoenixframework/phoenix_live_view

ioquatix commented 4 years ago

@snewcomer do you have any benchmarks or comparisons? Not that I expect to take much away, just interested to understand where we stand in terms of both absolute and relative performance. I did watch the phoenix live view 60FPS demo and was impressed.