Open bengl opened 4 years ago
Just a heads up that:
Tangentially (although related to the "Alternatives" section above), I've put together a helper library, fast-native-fn (https://www.npmjs.com/package/fast-native-fn), to help folks use `v8-fast-api-calls.h` when available, while still compiling when it isn't.
Great to see some activity on FFI. I have been hoping for native FFI in Node.js since Node.js 0.10. I have a lot of packages that just spawn a native binary as that was more foolproof than a native addon. I have a lot of ideas for packages, but I put them on hold as they require certain system APIs and I don't want to deal with a native addon.
What does this mean? (Background)
Foreign Function Interfaces (FFI) are ways to call into native (i.e. C/C++/Rust/etc.) code from a higher-level language, in our case JavaScript. There are many reasons to call native code, such as performance and access to system APIs.
We currently have a few approaches available to us:

- Userland FFI libraries (`ffi-napi`, its predecessor `node-ffi`, and others). These typically depend on some C FFI library (such as `libffi` or `dyncall`) to build up the native function call using the appropriate calling conventions dynamically, which isn't otherwise possible in C/C++. Another approach (taken by DragonFFI, for example) is to JIT-compile function calls. The advantage of userland FFI libraries is that they don't require the user to write any wrapper native code in order to use a given native function: the user just identifies the function they're trying to call and its signature.
- Node addons. The advantage of node addons is that they are typically quite a bit faster than FFI libraries.
New developments
A new API, `v8-fast-api-calls.h`, is being added to V8, which enables V8 to call native code directly in optimized functions. While it's still incomplete, it's usable today on `node@master` or nightlies. It's currently used in `node@master` to call `uv_hrtime` as fast as currently possible. Since the fast calls happen only when the functions are optimized, the slow path must also be provided.
Experiments
In an almost quixotic quest for better FFI performance, I've done some experiments on this front. I've made a branch of my FFI library, `sbffi`, which uses `v8-fast-api-calls.h`, and the performance was measurably good. Keep in mind it's still using an FFI technique (in this case using `libtcc` to JIT-compile a function call), whereas a better implementation might pass the native function directly in as the fast path. There are benchmarks provided. I can post results here if desired, but YMMV, and I don't think they're representative of the best we can do.
Why in Node.js core? Why now?
Since the fast API calls are now available, it may be time to investigate whether they can be used to build an FFI system in Node.js core. I think it's possible to build it so that it approaches (or maybe even surpasses) the speed of node addons. With that kind of speed available, it makes sense to have first-class support for it, alongside the existing support for addons. This would enable end users to use native libraries without requiring them to write non-JS code to do so, or to have any extra compile step.
I can imagine this simplifying calling out to things like `libsass`.

Challenges

- […] `sbffi`, but that carries some performance cost.
- […] `libffi` or `dyncall`, or JIT-compiling a call (i.e. with `libtcc` or `clang`
or something like that), or building the call manually in handwritten assembly. I don't know if it's possible to work around this, but I'm pretty sure it would need to be handled by V8.

Alternatives
Obviously Node.js has been chugging along just fine without FFI in core, so not doing anything like this at all is certainly an option.
On the WG call during the June 2020 collab summit, @mhdawson brought up the idea of enabling node addons to more easily use the fast API calls. This could be done by offering a new C++ API to identify "fast call" versions of exported native functions. This would serve the goal of getting `v8-fast-api-calls.h` into the native-code calling path, but wouldn't give the ease of use that we get from more FFI-style libraries, where no Node.js-specific native code needs to be written by the end user. It's probably worth considering this approach whether or not some kind of FFI ends up being included in core.