SimenB opened this issue 4 years ago
There is nothing node can do to help performance here. The problem is that V8's global proxy design forces the embedder to indirect through c++ for every lookup, which adds a lot of overhead.
And there's no way to cache/circumvent that lookup in node? If not, do you think there's any chance of V8 making changes so we can work around it?
Or are there any workarounds, similar to the current one, that we can use for `SourceTextModule` (or whatever the ESM vm API ends up being)?
@SimenB You could do `const context = createContext(); context.console = console` instead of `const context = createContext({ console })`.
In the long term V8 might need to refactor this system anyway because of the upcoming realms API, so maybe one day `createContext({})` will be fast too.
@devsnek I mentioned that in the OP - it takes the time from 1600ms to 800ms (so halves the time), but it's still 2 orders of magnitude slower than injecting it (which is less than 10ms).
I see no difference between `createContext({ someGlobal })` and `createContext().someGlobal = someGlobal`: both are twice as fast as not assigning the global at all, but still suuuuper slow compared to running outside of vm altogether.
@SimenB it seems that at some point we started creating the internal proxy unconditionally, which is why you're not seeing a speedup with what I suggested.
Using the `vm.Context` constructor from #30709 also fixes this. Hopefully I can get that merged soon.
Ah, wonderful!
I just compiled your branch and can confirm it does indeed fix the issue 😀
🤞 you're able to land it! Might be a bit premature, but do you think it'll be backported to v10 and v12? Since it's a different API, graceful degradation should be fine, but the free performance boost for all release lines would be wonderful (and Jest could remove the option we added to allow people to work around this).
Fixed in 4725ac61c885b798fd3e1f8416bb1f6b8d0b6af4 :)
Sorry, @SimenB, is this issue solved?
I don't think so, #30709 was closed unmerged?
Please land it, it would make me very happy 😀
Also note that this issue is known as far back as 2016: https://github.com/nodejs/benchmarking/issues/75#issuecomment-262740252.
Considering it was 1.5 years ago that https://github.com/nodejs/node/pull/30709 was closed in favor of the realms API, has the realms API landed, or should @devsnek's PR still be worked on?
**Is your feature request related to a problem? Please describe.**
Accessing globals from within a `vm.Script` is guarded by interceptors, which makes accessing them slow (from my understanding; @fhinkel has an excellent comment explaining some of this here: https://github.com/facebook/jest/issues/5163#issuecomment-355509597). The suggested workaround is to evaluate and cache the global you want access to (`Math` in my example case below) and inject that into the `vm.Script`. This is an acceptable workaround for `vm.Script` and `vm.compileFunction`, but it cannot be done when using ESM `vm.SourceTextModule`, as there is no wrapping function call.

I've put together this script you can run to see the numbers:
Running this gives the following results on my machine:
So ~1600ms if not using the workaround, which reduces it to ~7ms.
**Describe the solution you'd like**
I'd like accessing globals from within `vm` to be as fast as, or as close as possible to, accessing globals outside of it. Would it be possible for Node's `vm` implementation to cache the global property lookups, so that the price is only paid once instead of on every single access?

**Describe alternatives you've considered**
As mentioned, caching and injecting the global manually works when there is a function wrapper, but this doesn't work with ESM. I've tried assigning the `Math` global to the `context`, and while that halves the time spent (~800ms), it's still 2 orders of magnitude slower than injecting a reference.