BackburnerJS / backburner.js

A rewrite of the Ember.js run loop as a generic microlibrary
MIT License

[Proposal] Async stack traces. #340

Closed: adriancooney closed this 10 months ago

adriancooney commented 6 years ago

This PR introduces async stack traces to Backburner so it's easier to debug the runloop, by allowing the developer to step back through asynchronous stack frames.

Before / After Stack Trace for this demo code

[image: before-after-backburner-stack-trace]

[gif of stepping through the demo call stack]

After some cursory research, it doesn't look like Chrome exposes an API for custom async stack traces. One was actually requested, by what looks like an Ember developer, but that was back in 2014 and there hasn't been much activity on it since. That only left the hacky approach of emulating what Chrome considers an async action via Promises. I've created a new method on the Backburner class called _asyncExecute that takes a method and synchronously returns a new Promise's resolve method (not the promise itself!). When Backburner calls that returned resolve method (as if it were the method itself), the promise chain advances and calls the original method passed to _asyncExecute. This tricks the Dev Tools into thinking some asynchronous action happened when in fact it's just a simple callback.
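The mechanism described above can be sketched roughly like this (a hypothetical reconstruction for illustration, not the PR's actual code; `asyncExecute` here stands in for the `_asyncExecute` method):

```javascript
// Hypothetical sketch of the trick: synchronously hand back a promise's
// resolve function; invoking it later advances the promise chain, which
// Dev Tools records as an async hop and stitches into the stack trace.
function asyncExecute(method) {
  let advance;
  const gate = new Promise((resolve) => {
    // Promise executors run synchronously, so `advance` is assigned
    // before asyncExecute returns.
    advance = resolve;
  });

  // The original method runs as a microtask once the gate is resolved.
  gate.then(() => method());

  // Backburner would call this in place of the method itself.
  return advance;
}
```

Note that calling the returned function does not invoke `method` synchronously; `method` runs later on the microtask queue, which is where the ordering concerns in the review come from.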

Thankfully we only need to patch two methods, schedule and later. later doesn't strictly need to be patched, because Chrome can already represent its async operation accurately (it's a setTimeout), but the callback also gets scheduled into the default queue for execution once the timeout completes, which makes for two async hops. By patching later, we reduce this to one async hop for clarity. All of this, of course, only happens when Backburner#DEBUG is true. Finally, there are no breaking API changes.

The method is a little hacky, and the Promise.then (async) entry that appears between async stack frames in the sidebar Call Stack (see gif) is a little misleading, but I'd consider this a minor setback for the increase in debuggability and productivity. One of my biggest gripes with Ember is how frustrating it is to debug the runloop.

I expected a performance hit to Backburner in debug mode because of the promises on top of the current stack-stitching system, but to my surprise there's barely any. I suspect it's the new Error call that causes most of the existing gap between non-debug and debug modes. When Firefox and Safari (and Edge?) support async stack traces, maybe we could remove that call altogether and get DEBUG performance back to a reasonable level?

```
Benchmark                                                                            |     master      |      PR
------------------------------------------------------------------------------------------------------------------------
DEBUG - Schedule & Flush - function ................................................ | 126,951.07 op/s | 112,396.69 op/s
DEBUG - Schedule & Flush - target, function ........................................ | 126,113.43 op/s | 118,414.02 op/s
DEBUG - Schedule & Flush - target, string method name .............................. | 130,435.87 op/s | 125,710.38 op/s
DEBUG - Schedule & Flush - target, string method name, 1 argument .................. | 133,912.92 op/s | 128,377.54 op/s
DEBUG - Schedule & Flush - target, string method name, 2 arguments ................. | 132,708.16 op/s | 129,048.85 op/s
```
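To make the suspected new Error cost concrete, here's a hypothetical illustration (invented names, not backburner's actual code) of why capturing a stack at schedule time is expensive in DEBUG mode:

```javascript
// Hypothetical sketch: constructing an Error at schedule time forces the
// engine to materialize a stack trace, which is slow compared to the
// scheduling work itself. The names here are invented for illustration.
const queue = [];

function scheduleWithStack(fn, debug) {
  queue.push({
    fn,
    // Only pay for the stack capture when debugging is enabled.
    stack: debug ? new Error().stack : null,
  });
}

scheduleWithStack(() => {}, true);
console.log(typeof queue[0].stack); // "string": the captured call site
```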
`node bench` on master

```
[backburner.js] (master) $ node bench
testing
- Debounce - function
- Debounce & Cancel - function, target
- Later & Cancel - function
- Later & Cancel - function, target
- Schedule & Cancel - function
- Schedule & Cancel - target, function
- Schedule & Cancel - target, string method name
- Schedule & Cancel - target, string method name, 1 argument
- Schedule & Cancel - target, string method name, 2 arguments
- Schedule & Cancel - prescheduled, same queue - target, string method name
- Schedule & Cancel - prescheduled, separate queue - target, string method name
- Schedule & Flush - function
- Schedule & Flush - target, function
- Schedule & Flush - target, string method name
- Schedule & Flush - target, string method name, 1 argument
- Schedule & Flush - target, string method name, 2 arguments
- DEBUG - Schedule & Flush - function
- DEBUG - Schedule & Flush - target, function
- DEBUG - Schedule & Flush - target, string method name
- DEBUG - Schedule & Flush - target, string method name, 1 argument
- DEBUG - Schedule & Flush - target, string method name, 2 arguments
- Throttle - function
- Throttle & Cancel - function, target
running first test, please wait...
Debounce - function ................................................................ 324,055.10 op/s
Debounce & Cancel - function, target ............................................... 291,506.86 op/s
Later & Cancel - function .......................................................... 333,466.75 op/s
Later & Cancel - function, target .................................................. 330,913.76 op/s
Schedule & Cancel - function ..................................................... 4,526,880.55 op/s
Schedule & Cancel - target, function ............................................. 2,215,570.12 op/s
Schedule & Cancel - target, string method name ................................... 2,166,164.74 op/s
Schedule & Cancel - target, string method name, 1 argument ....................... 2,008,147.17 op/s
Schedule & Cancel - target, string method name, 2 arguments ...................... 1,941,816.72 op/s
Schedule & Cancel - prescheduled, same queue - target, string method name ........ 1,187,647.98 op/s
Schedule & Cancel - prescheduled, separate queue - target, string method name .... 2,103,827.47 op/s
Schedule & Flush - function ........................................................ 512,620.24 op/s
Schedule & Flush - target, function ................................................ 497,121.32 op/s
Schedule & Flush - target, string method name ...................................... 484,099.56 op/s
Schedule & Flush - target, string method name, 1 argument .......................... 485,970.45 op/s
Schedule & Flush - target, string method name, 2 arguments ......................... 480,963.27 op/s
DEBUG - Schedule & Flush - function ................................................ 126,951.07 op/s
DEBUG - Schedule & Flush - target, function ........................................ 126,113.43 op/s
DEBUG - Schedule & Flush - target, string method name .............................. 130,435.87 op/s
DEBUG - Schedule & Flush - target, string method name, 1 argument .................. 133,912.92 op/s
DEBUG - Schedule & Flush - target, string method name, 2 arguments ................. 132,708.16 op/s
Throttle - function .............................................................. 4,038,990.57 op/s
Throttle & Cancel - function, target ............................................. 1,778,177.93 op/s
fastest: Schedule & Cancel - function
```

`node bench` on this PR

```
[backburner.js] (adrian.cooney/async-stack-traces) $ node bench
testing
- Debounce - function
- Debounce & Cancel - function, target
- Later & Cancel - function
- Later & Cancel - function, target
- Schedule & Cancel - function
- Schedule & Cancel - target, function
- Schedule & Cancel - target, string method name
- Schedule & Cancel - target, string method name, 1 argument
- Schedule & Cancel - target, string method name, 2 arguments
- Schedule & Cancel - prescheduled, same queue - target, string method name
- Schedule & Cancel - prescheduled, separate queue - target, string method name
- Schedule & Flush - function
- Schedule & Flush - target, function
- Schedule & Flush - target, string method name
- Schedule & Flush - target, string method name, 1 argument
- Schedule & Flush - target, string method name, 2 arguments
- DEBUG - Schedule & Flush - function
- DEBUG - Schedule & Flush - target, function
- DEBUG - Schedule & Flush - target, string method name
- DEBUG - Schedule & Flush - target, string method name, 1 argument
- DEBUG - Schedule & Flush - target, string method name, 2 arguments
- Throttle - function
- Throttle & Cancel - function, target
running first test, please wait...
Debounce - function ................................................................ 337,010.46 op/s
Debounce & Cancel - function, target ............................................... 293,808.65 op/s
Later & Cancel - function .......................................................... 344,887.98 op/s
Later & Cancel - function, target .................................................. 328,991.84 op/s
Schedule & Cancel - function ..................................................... 4,501,900.81 op/s
Schedule & Cancel - target, function ............................................. 2,206,946.83 op/s
Schedule & Cancel - target, string method name ................................... 2,129,333.73 op/s
Schedule & Cancel - target, string method name, 1 argument ....................... 1,800,934.46 op/s
Schedule & Cancel - target, string method name, 2 arguments ...................... 1,745,172.19 op/s
Schedule & Cancel - prescheduled, same queue - target, string method name ........ 1,130,978.88 op/s
Schedule & Cancel - prescheduled, separate queue - target, string method name .... 1,890,185.90 op/s
Schedule & Flush - function ........................................................ 480,524.04 op/s
Schedule & Flush - target, function ................................................ 491,834.57 op/s
Schedule & Flush - target, string method name ...................................... 496,861.43 op/s
Schedule & Flush - target, string method name, 1 argument .......................... 482,918.45 op/s
Schedule & Flush - target, string method name, 2 arguments ......................... 476,500.03 op/s
DEBUG - Schedule & Flush - function ................................................ 112,396.69 op/s
DEBUG - Schedule & Flush - target, function ........................................ 118,414.02 op/s
DEBUG - Schedule & Flush - target, string method name .............................. 125,710.38 op/s
DEBUG - Schedule & Flush - target, string method name, 1 argument .................. 128,377.54 op/s
DEBUG - Schedule & Flush - target, string method name, 2 arguments ................. 129,048.85 op/s
Throttle - function .............................................................. 3,968,894.84 op/s
Throttle & Cancel - function, target ............................................. 1,743,115.59 op/s
fastest: Schedule & Cancel - function
```


I've put up a demo page on Github. Be sure to open Dev Tools and set a breakpoint in `done`.

http://adriancooney.ie/backburner.js/dist/demo/

Todo:

adriancooney commented 6 years ago

Thanks for the review @rwjblue! Can you elaborate on your thinking around the ordering guarantees? I had some suspicions myself, since Promise execution isn't synchronous, but the tests passed so I figured it might be alright. Do you have any ideas on how to design a test case for it?

ef4 commented 6 years ago

I don't think this will break anything as long as Promise.then is using proper microtask timing. But I don't think all our supported browsers have correct microtask timing when you use Promise.then. That's why backburner itself is using MutationObserver instead when it needs a microtask wait.
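The MutationObserver approach can be sketched as follows (a hypothetical reconstruction, not backburner's actual `platform.next` code; `buildNext` is an invented name, and the Promise fallback is added here so the sketch also runs outside a browser):

```javascript
// Hypothetical sketch: prefer MutationObserver for microtask timing
// (reliable even in browsers whose Promise.then was off-spec), and fall
// back to native promise microtasks elsewhere, e.g. in Node.
function buildNext(flush) {
  if (typeof MutationObserver === 'function' && typeof document !== 'undefined') {
    const observer = new MutationObserver(flush);
    const node = document.createTextNode('');
    observer.observe(node, { characterData: true });
    let toggle = 0;
    // Mutating the observed text node queues `flush` as a microtask.
    return () => { node.data = String(toggle = 1 - toggle); };
  }
  // Fallback: rely on the native promise microtask queue.
  return () => Promise.resolve().then(flush);
}
```

Either branch defers `flush` to the microtask queue rather than running it synchronously, which is the timing property ef4 is relying on.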

Does dev tools stitch together async frames across backburner's platform.next? If so, that would be great and you could just use that directly, and we'll know we have good timing on all browsers.

I also worry that it's risky to change the timing behavior when the debug flag is on vs off. It seems possible that merely turning on DEBUG would make your bug disappear, or make new bugs appear. For example, this change is enough to alter the behavior of a bug like https://github.com/BackburnerJS/backburner.js/issues/332. I would feel better if we just confirmed that there's no performance hit and made this the permanent behavior. I would not expect a performance hit as long as these are native microtasks.
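The timing hazard is concrete: routing a callback through Promise.then moves it from synchronous execution onto the microtask queue, so surrounding code observes a different ordering. A self-contained illustration (not backburner code; the function names are invented):

```javascript
// Demonstrates how a promise hop changes observable ordering: the same
// callback runs either before or after the code that follows the
// scheduling call, depending on which scheduler is used.
const log = [];

function runSync(cb) { cb(); }                             // plain call
function runViaPromise(cb) { Promise.resolve().then(cb); } // microtask hop

runSync(() => log.push('sync callback'));
log.push('after sync schedule');

runViaPromise(() => log.push('promise callback'));
log.push('after promise schedule');

queueMicrotask(() => {
  // The promise-scheduled callback ran *after* the surrounding code.
  console.log(log.join(' | '));
  // -> sync callback | after sync schedule | after promise schedule | promise callback
});
```

If turning on a DEBUG flag switches code from the first scheduler to the second, any bug sensitive to that ordering can appear or disappear with the flag.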

rwjblue commented 6 years ago

I had some suspicions myself since Promise execution isn't sync but the tests passed so I figured it might have been alright.

Tests do not currently run with DEBUG on and off (we should fix that though)...

rwjblue commented 6 years ago

I also worry that it's risky to change the timing behavior when the debug flag is on vs off. It seems possible that merely turning on DEBUG would make your bug disappear, or make new bugs appear. For example, this change is enough to alter the behavior of a bug like #332. I would feel better if we just confirmed that there's no performance hit and made this the permanent behavior. I would not expect a performance hit as long as these are native microtasks.

In general, I agree. I think the issue here would be that I don't think we can change schedule to always be async (we can only do it if we are being flushed by an autorun)...