nodejs/TSC: The Node.js Technical Steering Committee

Strategic initiative: future of Node.js build toolchain #901

Closed mmarchini closed 2 years ago

mmarchini commented 4 years ago

This was brought up several times, most recently on the Build IRC channel. Our current build toolchain is based on GYP, which is no longer the default toolchain for V8 and was discontinued by Google. As a result, we have to maintain our own GYP fork, and keeping dependencies (especially V8) up to date requires considerable manual labor. Furthermore, the build experience for Windows and Unix is extremely different, with different config files for those platforms (which makes it hard to keep them in sync feature-wise).

Switching to a modern, widely used build toolchain like CMake has been brought up a few times. There were also suggestions to move to GN, which comes with a completely different set of tradeoffs. In the recent discussion on the Build chat I suggested we should investigate a hybrid mode where V8 is built with GN and the rest of the project is built with CMake* (this would also come with a completely different set of tradeoffs). And this is only talking about building Node.js itself; we also need to evaluate how each toolchain will impact native modules, and we'll need to provide gyp + something else for some time on the native modules side.

I suggest we create a strategic initiative to start planning, exploring options and working on the future of our build toolchain. This could potentially become a working group in the future depending on the amount of work we have ahead as well as the number of collaborators interested in helping. I'm willing to champion this initiative.

* I would really like to explore this one because, if it works, we can provide pre-built V8 binaries which collaborators can download, significantly reducing the project's build time

mhdawson commented 2 years ago

On the GN front, what I remember is that Google never committed to it being used by other projects. When we asked if it was supported for use in our own projects, the answer was always no. That was a few years ago; I'm not sure if it has changed.

mhdawson commented 2 years ago

@bnb maybe you can ask the people you know who are doing GN work about Google's current stance/recommendation on its use in other projects?

mmarchini commented 2 years ago

FWIW my opinion stands that the best long-term solution from a maintainability perspective would be a hybrid approach where we build V8 with GN and everything else with CMake. That would allow us to provide prebuilt V8 static library binaries to be used by Node.js core developers, cutting the build time significantly and making our whole build process simpler. That's close to the approach Deno takes (although Deno takes advantage of Cargo for the prebuilt V8, since it's written in Rust). There will be trade-offs with this approach, but so will there be with any other. And according to Google developers, GN can "easily" be extended to other architectures and platforms.
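To make the hybrid idea concrete, consuming a GN-prebuilt V8 static library from CMake could look roughly like the sketch below (the deps/v8 layout, the v8_monolith artifact name, and the source list are hypothetical, not part of the actual proposal):

# Minimal CMake sketch: link a prebuilt V8 static library into the build.
cmake_minimum_required(VERSION 3.16)
project(node LANGUAGES CXX)

# Prebuilt V8 produced by a GN build, downloaded into deps/v8 (illustrative path).
add_library(v8 STATIC IMPORTED)
set_target_properties(v8 PROPERTIES
    IMPORTED_LOCATION "${CMAKE_SOURCE_DIR}/deps/v8/lib/libv8_monolith.a"
    INTERFACE_INCLUDE_DIRECTORIES "${CMAKE_SOURCE_DIR}/deps/v8/include")

# Stand-in for the real Node.js source list.
add_executable(node src/node_main.cc)
target_link_libraries(node PRIVATE v8)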

hashseed commented 2 years ago

Disclaimer: I no longer work on V8.

V8 currently maintains both a GN build and a Bazel build. The former is used for regular builds with Chromium. The latter is used mostly for Google-internal use cases of V8.

The V8 team also continues to use a GN port of Node.js to test V8 against Node.js in CI. There is also a Fuchsia build of Node.js based on this GN build if I'm not mistaken.

Before I created the partial GN port for Node.js, we used to have a hybrid build with V8 being built via GN and Node.js built via GYP, taking the V8 binary as a build dependency. That lived under the GYP flag --build-v8-with-gn. Maintaining that configuration was a nightmare, since GN does not translate directly to GYP, and while GN constantly introduced subtle changes, GYP stagnated.
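For reference, driving that old hybrid configuration looked roughly like this (from memory; the flag has since been removed, so treat it as purely illustrative):

# Historical hybrid build: V8 built via GN, the rest of Node.js via GYP.
./configure --build-v8-with-gn
make -j8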

@victorgomes is the authority on V8 team's investment in Bazel and Node.js these days.

Some additional reading material:

  1. The case for porting Node.js to Bazel
  2. A talk I gave in 2019 at the Node.js collaborator summit in Berlin on switching to the GN port of Node.js
  3. GYP deprecation and Node.js (this exact topic, and a few potential solutions, dated 2018)

victorgomes commented 2 years ago

As mentioned by @hashseed, the GN port of Node.js is only used to test V8 against Node in our CI, and it is currently pretty limited (it only supports Linux and Mac on x64). We have an experimental bot for Windows on x64, but we never really put in the time to make it work. The port still has one major flaw right now: it uses Node's internal GYP scripts to build native modules.

The build is only maintained by @pthier and me. It runs with V8 and Node ToT (tip-of-tree). It could be improved and extended with the help of Node folks. The advantage is that we could easily add more trybots in V8's CI (running Node on different architectures supported by V8, for instance). Unfortunately, V8 cannot promise to maintain a GN build if Chrome decides to change its build system again. GN is, however, pretty stable and currently used by Chrome and Fuchsia.

The Bazel port of V8 is also limited to Google's internal usage (it is a subset of V8, a few architectures, and a couple of OSes). It could also be extended, but that would require considerable changes to V8's CI infrastructure.

mhdawson commented 2 years ago

I talked to @miladfarca, who is part of the team that keeps V8 running on PPC and s390. As part of that work he helped get GN working on those platforms and their associated operating systems.

Based on our discussion, my current thinking is in agreement with:

FWIW my opinion stands that the best long-term solution from a maintainability perspective would be a hybrid approach where we build V8 with GN and everything else with CMake.

i.e. build V8 with whatever it uses (currently GN), then link the resulting static binary into Node.js itself.

To maintain the simplicity we have today, where you can just git clone/make to build and test Node.js, we would probably have to:

The feedback on Bazel from the team, who had worked on an issue in Envoy, was that it was not that easy to use and would likely add headaches on the operating systems it does not support directly. It could probably run/work anywhere Java runs; it was mentioned earlier in this thread that it bundles in/hides the JVM, but I doubt that is the case for a number of our platforms (SmartOS, AIX, Linux on Z, Linux on P, etc.).

yuriy-yarosh commented 2 years ago

I'm working on Node.js Bazel integration right now for my R&D project. In short, Bazel can be packaged as a single binary with GraalVM, so the JVM adoption concern is a bit misplaced. GraalVM has an optional LLVM backend.

The most common Bazel issue I have faced over the last two years was cache inconsistency, but Google fixes it "from time to time", and the cache is considered mostly stable nowadays. Everything else looks solid.

Kubernetes dropped Bazel because they literally had no contributors willing to get over the configuration and support complexity. The Skylark/Starlark Python dialect can be fairly cryptic, and the entry barrier should have been lower. There's also a slight lack of proper documentation and examples.

I'm thinking about providing a short Bazel course for the newcomers, so more people could adopt it and overcome their FUD.

manekinekko commented 2 years ago

Hi, I saw many mentions of Bazel in this thread, so I'd like to chime in and add @alexeagle and @gregmagolan to this conversation. Together with Alex and Greg, we worked on the Node.js rules for Bazel (myself as a contributor).

I'll let Alex and Greg share their insights (as former Googlers) about Bazel in this context.

@mhdawson if the decision has been made, then please disregard my comment.

mhdawson commented 2 years ago

@manekinekko we haven't decided on a path forward yet, so we're always open to more info.

anlexN commented 2 years ago

The Python team says Python will not be iterated on in the future, so please don't use Python for Node.js.

mmarchini commented 2 years ago

The plan is to steer away from custom solutions (like we currently have). That might mean an existing, well-maintained toolchain that uses Python (although that's unlikely based on the current candidates). Can you elaborate on "Python won't be iterated on in the future" (and, if possible, share a link to the announcement)?

anlexN commented 2 years ago

Python 4.0 will never arrive🤚😔

nschonni commented 2 years ago

Here is one that is easier to read: https://www.techrepublic.com/article/programming-languages-why-python-4-0-will-probably-never-arrive-according-to-its-creator/. But it's just saying they don't want a repeat of the 2-to-3 migration pains, not that Python isn't supported or that they aren't continuing to develop features.

ryzokuken commented 2 years ago

I think the discussion about the future of Python is at least somewhat derailing from the matter at hand: irrespective of the future of Python, we should investigate an alternative to gyp-next.

mmarchini commented 2 years ago

I compiled the information from this issue and a few other links into the PRs/files below:

I would love for people to share feedback and contribute to those. It's worth noting that these are all specific to building Node.js itself. I'll try to write something for native modules next.

Since this thread got big and took a few detours, I'll close this and move all further discussion to the https://github.com/nodejs/build-toolchain-next repository. If anyone thinks this issue should remain open, please let me know.

Mesteery commented 2 years ago

I'm a bit late, but what do you think of xmake?

mmarchini commented 2 years ago

We haven't reached a decision, so it's not late. Can you share a bit more about xmake: what makes it unique compared to the other options, how widely adopted it is, and what the pros and cons of using it to build Node.js would be?

SirLynix commented 2 years ago

xmake is still a bit young and not as widely adopted as other build systems; however, it sure is one of the most interesting build systems (IMO).

It has a way nicer syntax than, for example, CMake:

target("library")
    set_kind("static")
    add_files("src/library/*.c")

target("test")
    set_kind("binary")
    add_files("src/*.c")
    add_deps("library")

It has an integrated package manager which will try to find dependencies on your computer, but it is capable of building them from source itself if it doesn't find them, which means:

add_requires("libsdl")

target("test")
    add_files("src/*.c")
    add_packages("libsdl")

This will try to find libsdl via your installed package managers (apt, conan, and such), and if it cannot find it, it will download libsdl and build it from source.
In the C and C++ world, this is new (and awesome). xmake is to C++ what cargo is to Rust.

It is also capable of using other build systems (cmake, meson, etc.) if required, so switching to xmake doesn't mean "remake your whole build system".

Here's an article written by xmake author which introduces its other features.

As expected from a build system, it's capable of generating VS projects, Makefiles, CMakeLists.txt, and more, but it doesn't depend on them.

My personal point of view: I used CMake and Premake in the past, and switched to xmake a year ago, as soon as I heard about it.
It's not perfect, but it sure is way better than everything I tried before, and every issue I ran into has been fixed quickly.

Also, it's possible to use xrepo (xmake packages) in other build tools, such as cmake: https://github.com/xmake-io/xrepo-cmake.

mmarchini commented 2 years ago

Interesting, I think it's worth investigating. It claims to support GN, which, if it works properly, would be a huge advantage. I'll take a closer look tomorrow and might try playing with it.

eli-schwartz commented 2 years ago

xmake is also packaged by zero Linux or BSD distributions.

It has a name clash with http://apollo.backplane.com/xmake/index.shtml, which is packaged by FreeBSD, OpenBSD, Mageia and Rosa, so likely cannot be packaged there or will require non-portable renaming to solve the clash. OTOH, it's not like the original xmake (a make variant) is actually popular, so that may not be a big deal.

It is packaged by MSYS2 on Windows, solely because the xmake developer contributed a PKGBUILD for it.

xmake has a "who is using it" list; I don't recognize a single name on it: https://xmake.io/#/about/who_is_using_xmake

Perhaps nodejs can break new ground there, though. New build systems that compete against CMake's absolutely atrocious 1980s POSIX-bourne-shell-style "everything is a string, even arrays are ;-delimited strings" scripting language can only be a good thing.

waruqi commented 2 years ago

xmake is very friendly to other build systems and does not force us to use only xmake to maintain the entire project. We can locally integrate source code projects maintained by third-party build systems, such as cmake, meson, or GN.

For example, you can integrate a cmake-maintained project locally to achieve mixed compilation: https://xmake.io/#/package/local_3rd_source_library?id=integrate-cmake-source-library
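A minimal sketch of such a local integration, following the pattern from the linked doc (the foo package name and its source path are placeholders):

-- Declare a local package whose sources are built by CMake.
package("foo")
    add_deps("cmake")
    set_sourcedir(path.join(os.scriptdir(), "3rd/foo"))
    on_install(function (package)
        local configs = {"-DCMAKE_BUILD_TYPE=Release"}
        import("package.tools.cmake").install(package, configs)
    end)
package_end()

-- The package can then be consumed like any other dependency.
add_requires("foo")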

It can also support local or remote integration of a GN-maintained library, such as integrating the skia library maintained by GN: https://github.com/xmake-io/xmake-repo/blob/master/packages/s/skia/xmake.lua

Of course, compared to GN, xmake currently has more complete integration with cmake/meson. If you are interested in using xmake, I will further improve integration support for GN projects.

waruqi commented 2 years ago

I see that Node does not have many dependency libraries. Whether or not xmake is considered, I will try to add all the dependencies to the official xmake package repository, such as the uvwasi package I am adding, which is not very troublesome: https://github.com/xmake-io/xmake-repo/pull/964/files

jviotti commented 2 years ago

We (Postman) are working on a project that integrates Node.js as part of a larger project (for macOS, GNU/Linux and Windows), which led us to feel quite some pain at integration time given the current setup.

Reading through this thread and other related threads, most of the suggestions (and counter-arguments) revolve around which meta build system to adopt long-term: i.e. GN, CMake, Bazel, GYP, xmake, etc. I think it's interesting to look at the problem from a non-meta-build-system point of view too.

In particular, I think Ninja itself (https://github.com/ninja-build/ninja) offers a perspective that might be better aligned to Node.js than all the other proposed alternatives from a philosophical point of view:
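To give a flavor of how small the Ninja language is, a complete hand-written build.ninja can be as short as the sketch below (file names are hypothetical; in practice such files are generated rather than written by hand):

# Two rules and two build statements form a complete, runnable build.
cflags = -O2 -Wall

rule cc
  command = gcc $cflags -c $in -o $out
  description = CC $out

rule link
  command = gcc $in -o $out
  description = LINK $out

build node_main.o: cc src/node_main.c
build node: link node_main.o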

So my proposal is:

nornagon commented 2 years ago

@jviotti FWIW I don't think in Electron we would easily be able to import Ninja definitions directly. GN expects all build configuration to be in GN, not Ninja, and as far as I know has no support for depending on raw Ninja definitions generated by other tools.

frank-dspeed commented 2 years ago

I am at present working on many related projects, like a Fuchsia Node.js runner, and I have come to a final conclusion: why does Node.js not build itself via Node.js?

I am also working on language implementations like TypeScript and PHP on GraalVM, and from a language implementer's view every good language builds itself, so why not Node.js? In GraalVM, Node.js is also handled as a language that is compilable to a single binary; this way I got successful runs on Fuchsia OS + FIDL.

But the graal-node method does not use V8, so it is clear why that is simple.

I am also working on the concept of a V8 embedder framework that allows building apps directly against V8. A small Linux-only implementation of that can be seen in this project: https://github.com/just-js/just/

The resulting applications, bound directly to Linux epoll, outperform Node.js by up to 100 ranks; it is in first place in the composite score of a very reputable benchmark: https://www.techempower.com/benchmarks/#section=data-r21&test=composite

So it outperforms everything else.

At present my main project is to get a just-js-like SDK done for Fuchsia that easily allows binding the FIDL client directly into V8, with bundled JS as glue code. My PoC uses fuchsia.http.Loader, and the current state is adding a Fuchsia-based HTTP server implementation, using the Cloudflare Rust version of the QUIC bindings.

A side effect of this is a Node.js implementation of the Fuchsia Component Manager for mocking, but in the end, as a side project, I am porting all Fuchsia concepts to Node.js and similar platforms, as they overlap by design and these are low-hanging fruits.

cc: @hashseed

eli-schwartz commented 2 years ago

Why does Node.js not build itself via Node.js?

I am also working on language implementations like TypeScript and PHP on GraalVM, and from a language implementer's view every good language builds itself, so why not Node.js?

I'm not sure how you define "every good language" but I feel like I'd be hard pressed to find many languages where the build system is written in the language's own final runtime.

There's an obvious bootstrap issue there, so for starters if the runtime itself isn't written in that programming language, it seems highly strange to write the build system in it.

Also investing in a custom build system instead of an existing one is a bit of a hard sell to begin with. An (IMHO weak) argument could be made in favor of taking a language-specific build system + package manager that exists anyway and using it to compile the self-hosted language runtime. But this is very much not the case here anyway, because nodejs.exe's source code isn't written in nodejs!

frank-dspeed commented 2 years ago

@eli-schwartz I understand your point, but I will go for an ECMAScript build/make tool anyway and will use all needed workarounds to get that working, e.g. FFI magic with SharedArrayBuffers and their memory locations.

I guess I can even translate GN and other build tools relatively simply, and even use the original source files.

Then I guess the following will happen.

People will find the project and use it anyway because it works, and then we call it a day.

Then Node.js can build itself via an older version of itself that works, using the language it's written in (C).

Node.js is the successor to Python, as Python will not upgrade, and Python did nothing different; xmake with its Lua runtime also does nothing different.

OK, Node.js is maybe a bigger runtime, but who cares, it also offers more features out of the box. Combined with TypeScript, I guess this is a selling point, including the type definitions.

Also, the tooling will grow to translate and express native stuff in JS and generate the files needed for the other build tools, doing what GN does and handing over to Ninja.

All build systems share more or less a big common set of features; it will be easy to do mappings.

For performance there is a clear migration path:

ffi-napi: a successor to node-ffi, compatible with modern versions of Node.js.
sbffi: this library.
napi-addon: a very simple/normal Node.js addon using NAPI in C.
napi-addon-sb: a NAPI addon using the same shared-buffer technique as sbffi, but with a hard-coded function call rather than a dynamic/FFI call.
wasm: the adding function compiled to WebAssembly.
js: re-implementing the function in plain JavaScript.

Each function is called 100,000 times, in 1 repetition, timed with console.time(). Here are the results on my machine (2019 Lenovo X1 Extreme, running Ubuntu, Node v12):

ffi-napi: 1419.159ms
sbffi: 29.797ms <= easily achievable
napi-addon: 3.946ms
napi-addon-sb: 3.717ms <= easily achievable
wasm: 0.871ms <= sometimes easily achievable
js: 0.090ms <= sometimes easily achievable

So there is no question: combined with multi-thread support and the Node.js VM module, for example, we can build maybe the fastest xmake ever.
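For context, the sbffi call path benchmarked above looks roughly like this, going by sbffi's README (the library path and the add function are hypothetical):

// Load a native function through sbffi's shared-buffer FFI.
const { getNativeFunction } = require('sbffi');

// Hypothetical shared library exporting: uint32_t add(uint32_t a, uint32_t b)
const add = getNativeFunction(
  '/path/to/libadd.so',     // path to the compiled library
  'add',                    // exported function name
  'uint32_t',               // return type
  ['uint32_t', 'uint32_t']  // argument types
);

console.log(add(23, 34)); // 57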

eli-schwartz commented 2 years ago

I'm not sure what point you're trying to make?

A build system has two components. A configuration handler, and a command runner.

Common configuration handlers are: meson, cmake, autotools ./configure, gn, gyp

Common command runners are: make, ninja, msbuild, xcodebuild

Some conflate the two: waf, scons, setup.py -- these typically tend to be a lot slower than anything else, as they need to reparse the build configuration, and everything is basically a cold build.

Generating ninja files is a good idea, and meson, cmake, gn, and gyp all do this (either exclusively or as one of several options). You will generally not get faster than running ninja which is heavily optimized for speed and doing as little work at build time as possible. Reimplementing an excellent tool in JavaScript for reasons you haven't really explained doesn't seem to make any sense and I can't possibly overstate how bad an idea I think this is.

Writing a build system in JavaScript that generates ninja files can be done, I guess, but you haven't really explained why that's the superior idea either. It's not something I'd be wary of to the same level as reimplementing ninja, but...

Honestly? Speed for the initial build configurator doesn't seem like the best thing to focus on. There's a bunch of reasonably fast options, and your bottleneck will basically always come from the time spent executing subprocesses such as compilers, system info scraping tools, and suchlike. Compiler tests are the big time hog in any build system as you're totally dependent on the compiler itself. The good thing is that once you did this all once, you don't need to do it again as you have the configuration saved to disk and now you just need to run the command runner (like ninja).

What speed goals are you trying to hit with nodejs build systems?

Node.js is the successor to Python, as Python will not upgrade, and Python did nothing different; xmake with its Lua runtime also does nothing different.

I'm not sure what you're trying to get at with this language war, but I think you've misunderstood something about both Python and Lua.

waruqi commented 2 years ago

Why not consider xmake? xmake has the same build speed as ninja; it's very fast, and it doesn't regenerate the makefile/ninja.build file on configuration changes like cmake/autoconf/meson do, which is very time-consuming.

Generally, it is very fast to build projects directly with xmake, which also supports parallel builds and has built-in cross-platform build caching optimisations, similar to ccache but with support for MSVC. There is also built-in support for distributed builds and remote build caching, as well as support for unity builds to further speed up compilation.

If you use ninja directly, it is difficult to use distributed builds to further speed up builds, and the external ccache, which does not support MSVC, has a number of limitations.

There are also commands in xmake that allow us to generate build files such as ninja.build/makefile/vs/cmakelists. If we use xmake, we can still switch to ninja to build at any time.

Finally, xmake also provides built-in package management and also supports packages directly from any other package manager, such as conan/vcpkg/brew/apt/cargo/pacman/dub. In theory, we are able to directly integrate over 90% of packages in the C++ ecosystem, as long as any of the repositories has them.

waruqi commented 2 years ago

@eli-schwartz I understand your point, but I will go for an ECMAScript build/make tool anyway [...] So there is no question: combined with multi-thread support and the Node.js VM module, for example, we can build maybe the fastest xmake ever.

JS may be fast, but the performance bottleneck in build systems comes mainly from the compiler. Compiler performance can only be improved through better scheduling, e.g. parallel compilation, distributed compilation, build caching, unity builds, etc.

Switching to JS doesn't fundamentally improve build performance; instead, you waste a lot of time reimplementing the build system, only to find that it doesn't perform as well as make, because build systems are very complex and there are a lot of details to consider. I don't think it makes sense to build a new build system just to build your own project rather than using an existing, mature one.

Also, xmake can use both Lua runtimes, LuaJIT and plain Lua. LuaJIT is much faster than plain Lua, but I have tested many projects and found that they are basically the same for build performance; LuaJIT doesn't make the build much more efficient, and I think switching to JS would be the same.

eli-schwartz commented 2 years ago

Why not consider xmake? xmake has the same build speed as ninja; it's very fast, and it doesn't regenerate the makefile/ninja.build file on configuration changes like cmake/autoconf/meson do, which is very time-consuming.

Regenerating Makefile/ninja.build is not time-consuming. Reconfiguring on changes to the build system configuration may or may not be time-consuming, due to needing to re-run configure-time actions (some of which are cached, though).

It's entirely possible that xmake is very fast at processing the configuration, though!

and has built-in cross-platform build caching optimisations, similar to ccache but with support for MSVC. There is also built-in support for distributed builds and remote build caching, as well as support for unity builds to further speed up compilation.

Writing your own built-in version of ccache/distcc actually sounds bad. In fact, it reminds me of a recent xmake bug report in which xmake was erroneously failing to emit warnings for cached objects -- ccache is carefully designed to be equivalent to actually running the compiler and covers a wide variety of edge cases, and therefore, among other things, emits the same warnings that actually running the compiler would emit.

ccache and distcc are well tested by a vast number of people; your built-in comparable features probably are not. So even projects that use xmake should, IMO, use e.g. CC="ccache gcc" and sidestep the homebrew caching.

Finally, xmake also provides built-in package management

cool

and also supports packages directly from any other package manager, such as conan/vcpkg/brew/apt/cargo/pacman/dub. In theory, we are able to directly integrate over 90% of packages in the C++ ecosystem, as long as any of the repositories has them.

But this is just what literally every build system does, even the shell scripts that run hardcoded gcc $(pkg-config --cflags --libs libfoodep) myprog.c -o myprog, so I'm not sure that saying "xmake supports third-party package managers" provides significant insight into the state of the art in build systems?

waruqi commented 2 years ago

And distributed builds aren't exactly a unique feature, people have been using distcc successfully for a long time.

Right, but distcc does not support MSVC or a remote build cache. Also, it has not been updated for a long time and does not support load-balanced scheduling between nodes.

In fact, it reminds me of a recent xmake bug report in which xmake was erroneously failing to emit warnings for cached objects

xmake has only recently gained caching support, so it may not be as mature as ccache, but I will continue to improve it. That issue was fixed quickly, and I believe that within a few releases it will be more or less as stable as ccache.

I don't think it's a big problem; you'll encounter some issues with any tool, even with ccache, and that's fine as long as the maintainer can fix them quickly.

Also, even with xmake, we can still easily use ccache and distcc; they do not conflict.

waruqi commented 2 years ago

But this is just what literally every build system does, even the shell scripts that run hardcoded gcc $(pkg-config --cflags --libs libfoodep) myprog.c -o myprog, so I'm not sure that saying "xmake supports third-party package managers" provides significant insight into the state of the art in build systems?

We just need to configure:

add_requires("conan::zlib 1.2.12")
add_requires("vcpkg::zlib 1.2.12")

and you can quickly switch between different package managers to use them.

xmake automatically calls vcpkg/conan to install packages and then automatically integrates the links/include dirs. It is not as simple as just calling pkg-config to find packages.
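For completeness, attaching such a package to a target is then a one-liner (a minimal sketch; the test target is hypothetical):

-- Pull zlib via conan, aliased so the target does not care where it came from.
add_requires("conan::zlib 1.2.12", {alias = "zlib"})

target("test")
    set_kind("binary")
    add_files("src/*.c")
    add_packages("zlib")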

The user doesn't need to care about using conan and vcpkg, xmake will do everything.

If you use packages with xmake's built-in package manager, it can also provide even more useful features.

See https://github.com/xmake-io/xmake/wiki/Xmake-and-Cplusplus--Package-Management

frank-dspeed commented 2 years ago

My point was about reducing the learning curve and improving build flexibility.

I do not expect more speed in general while executing, but I expect many more coders who could improve the build.

I have hated it for years that everyone used Python because it's easy to package a .so file and do FFI.

I can now code polyglot, and I know what I am doing; that took me only 30 years. We can save other people's time.

eli-schwartz commented 2 years ago

The plan is to steer away from custom solutions (like we currently have). That might mean an existing, well-maintained toolchain that uses Python

I think the discussion about the future of Python is at least somewhat derailing from the matter at hand: irrespective of the future of Python, we should investigate an alternative to gyp-next.

@frank-dspeed I don't see the point of quibbling over some hatred of Python when the use of Python in a build system has no effect on the learning curve of Node.js, because it doesn't affect anyone, not even the people who build Node.js (they will run a CLI tool, and the language it is written in is totally irrelevant, whether that's Python or Ruby or Golang or Malbolge).

Improving the build would happen inside the build files. For example, by editing CMakeLists.txt (written in a language called cmake, not C++, even though CMake is written in C++), or by editing meson.build (written in a language called meson, even though Meson is written in python).

(Admittedly xmake seems to break this pattern by using lua files. That being said, the common syntax its docs recommend doesn't have any reliance on the lua language.)

frank-dspeed commented 2 years ago

@eli-schwartz Maybe you see it from another view, but for as long as Node.js has existed I have had build problems with it on different devices, platforms, and architectures; that's nothing new, and I needed to learn a lot to solve any of those issues and to even understand how all these systems work.

If we coded most parts in JS, we would create something flexible that a JS developer could maintain, and we would even teach them how to integrate with other languages and software.

As you already mentioned, every build system also introduces an additional DSL (domain-specific language) that you're free to learn on top :D so I see no point in learning all those details.

That's my point: we can reduce a lot of complexity here and get it right once and for all, as we can easily create good, readable conditional code when we write that build logic in ESM, or even use something like gulp and add missing plugins.

Everything that reduces layers also makes it easier to later switch to other build systems without changing as much code, because we can then use all the JS stuff like injection and loader interception patterns.

waruqi commented 2 years ago

Whether we use Python or Node.js, build systems that depend on external runtimes can introduce more uncertainty and problems. Even if a project previously compiled successfully, a Python version update that introduces a bug can break meson/scons, like in this case: https://github.com/xmake-io/xmake-repo/issues/1013

But xmake has a lightweight built-in Lua runtime, so as long as the current xmake version is stable, builds will at least not break due to Lua runtime version updates or stability issues.

frank-dspeed commented 2 years ago

@waruqi And we would have a lightweight V8 runtime, https://github.com/just-js/just, as light as that for example, but with V8 builds for more platform/arch combos.

I already use JS for system programming, and it is the best choice I have made in recent years. The V8/JS/ECMAScript event loop replicates exactly what Zircon (the Fuchsia OS kernel) does, and in general it matches what I do in the cloud. I know from experience that every big cloud architecture uses the event bus pattern, and all these concepts are not new.

But today we see that they are working. I am a big fan of the overall movement of the ecosystem, but there are some blockers, like the idea of reusing existing stuff and the fear of reinventing it.

All those systems were designed from a 1970s or maybe older view; it's time to change that. Today we have CPUs with 128+ cores, which is simply something totally different from what we had 10 years back.

frank-dspeed commented 2 years ago

just-js, by the way, ranks first in a well-designed benchmark suite that tests frameworks: https://www.techempower.com/benchmarks/#section=data-r21&test=composite

I can tell you this is amazing: my build speed is nice, the results are stable, and all of this is at least quickly understandable!

eli-schwartz commented 2 years ago

That's my point: we can reduce a lot of complexity here and get it right once and for all, as we can easily create good, readable conditional code when we write that build logic in ESM, or even use something like gulp and add missing plugins.

Everything that reduces layers also makes it easier to later switch to other build systems without changing as much code, because we can then use all the JS stuff like injection and loader interception patterns.

In fact, by reducing layers you make it harder to switch to other build systems, and simultaneously move all complexity into your own project instead of letting a dedicated "build system" project handle the complexity for you.

This has been specifically rejected in this ticket. In fact, it's the entire point of the ticket.

The plan is to steer away from custom solutions (like we currently have).

It's eminently reasonable to want to avoid custom stuff you don't specialize in, and the nodejs team specializes in JavaScript runtimes, not cross-platform build systems.

I already use JS for system programming

Build systems are not system programming. If build systems were system programming, the nodejs team would write their own build system where all configuration is done in config.cpp.

I am confident this is not what you want.

frank-dspeed commented 2 years ago

@eli-schwartz I guess I now understand where your view comes from, but I do not agree, though I know many people will agree with you. In the end it does not matter; the future is clear, at least for me. I see the whole ecosystem moving polyglot in the right direction, I see overlap in functionality everywhere, and I see an OS finally getting it right by simply offering a universal ABI to hardcode interfaces against.

All these concepts solve so many problems that they will get adopted; it is only a question of time, and I only wanted to speed up the overall process. In the end it will never matter for the future what the Node.js core project does or does not do. There is a reason why Node.js gets forked over and over without contributing back to Node.js core. There are fundamental problems that people will address, and they will sidestep RFC processes, as they simply have the ability to do so.

That's all I have to say. In the end, you and I are correct at the same time.

ryzokuken commented 2 years ago

I need to dig deeper into the replies here, but just FYI, I don't think we're considering switching from ninja or make here. We're just considering switching from gyp-next to a better meta build system.

Another important thing to note is that speed is not our biggest priority here, IIUC; our biggest priority is cutting down the work needed to:

  1. Maintain our own bespoke meta build system in Python.
  2. Translate V8's GN build files into gypfiles.

And while there are arguments to be made in favor of writing and maintaining our own in JavaScript, I don't believe it would make our lives much easier compared to the current situation, and it would come at a huge cost.

frank-dspeed commented 2 years ago

@ryzokuken Do not worry about that; once I am done I will do a PR. If you like it, take it; if not, that is totally OK.

I am coding a whole kernel in ECMAScript, so I need some tooling to do custom V8 builds and optionally other builds anyway. As part of that, I will come up with something that is GYP-compatible and a drop-in replacement.

I will take as much of the xmake implementation as needed and incrementally come up with my own ways.

I have already coded a lot of parser stuff in ECMAScript; parsing xmake's Lua code into ECMAScript is a no-brainer. That should be low-hanging fruit.

waruqi commented 2 years ago

we would have a lightweight V8 runtime, https://github.com/just-js/just, as light as that for example, but with V8 builds for more platform/arch combos

@frank-dspeed I saw it. I don't think it's lightweight as long as it still relies on V8, because no matter how you trim V8, it's still heavyweight: it's still 50 MB, and the whole just binary after linking is 17 MB.

But the whole xmake binary is only 1 MB. With LTO turned on, it is only 670 KB.

I have already coded a lot of parser stuff in ECMAScript; parsing xmake's Lua code into ECMAScript is a no-brainer. That should be low-hanging fruit.

I don't really think it's necessary to implement a build system based on JS for this, and right now just-js only supports Linux.

frank-dspeed commented 2 years ago

@waruqi You're partially correct: just-js out of the box does static linking, so the resulting binary can be smaller when we do dynamic linking like xmake does. Overall, I guess you have the situation a bit wrong; let me explain more.

There is a reason why it is Linux-only: it made things simple; there is no technical reason for it. The concepts are portable. In fact, it is already a composable Linux-only Node.js; there are repos that hold all the needed modules for that, and it is easy to add more modules, as this is simply C plus some conventions for building V8.

That's it!

Node.js example:

  1. Download the source.
  2. Run our own make script that builds V8 or downloads a prebuilt one.
  3. Build our build environment (creating some C code, not much, that throws our .js files into the v8::Isolate), using fast V8 FFI calls to integrate with the host system; V8 includes its own FFI call engine so that no round trips are needed to the host environment that embeds V8 (filesystem, network access, exec, ...).
  4. Build Node.js with the same V8.

Be happy! No extra dependencies, no extra wasted space, as we reused the same V8 in this scenario.

eli-schwartz commented 2 years ago

I do not understand why we are 21 comments deep into a discussion about taking a bespoke python meta-build system and basically bespoke gyp meta-build files and... reimplementing it in bespoke javascript...

... when the topic of this discussion is about moving away from bespoke build systems on the grounds that it "doesn't make things better, but does add a huge cost".

Writing a nodejs-specific build system in any language does not sound like it's on the table. Writing one that depends on having a previously built copy of nodejs or v8 is just additional pain by way of a bespoke "own make script that builds v8", which is exactly the thing that was declared a problem to move away from.

I think it probably makes quite a bit of sense to stop talking about something that's known to be a non-goal and will not happen.

Hi-Angel commented 1 year ago

We haven't reached a decision, so it's not late

@mmarchini Is that still the case? If not, I'd in turn suggest looking into Meson. I find it odd that Meson is mentioned everywhere in the discussion, yet nobody has actually proposed it.

I have never worked with xmake, GN, or Bazel, so I can't say anything useful about those. However, I see there is a suggestion for CMake, and I can say for sure that Meson is better. Basically, you can do all the same things, but it's more automated, and it has a waaay nicer syntax that is easier to read (IMO that's the most important difference: meson.build files are far more understandable than CMakeLists.txt ones). Also, it has an implicit type system, which lets it give much more useful error messages when one screws something up in the build files.
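For a taste of the syntax, a minimal meson.build might look like the sketch below (the project layout is hypothetical, and V8 itself would realistically need more than a plain dependency() lookup):

# Minimal Meson sketch; names are illustrative only.
project('node', 'cpp',
  default_options: ['cpp_std=c++17'])

# Found via pkg-config or CMake if available; not required for this sketch.
v8_dep = dependency('v8', required: false)

executable('node',
  'src/node_main.cc',
  dependencies: v8_dep)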

Meson is very popular; these days it's used by lots of well-known projects: GNOME-related ones, the Mesa graphics drivers, the X server, libinput, i3wm, etc. Mesa is an interesting user because it has lots of code generators and mixes different languages (e.g. there is assembly code which has to be compiled without LTO, otherwise it won't link), so I'd say they use it in non-trivial ways.

frank-dspeed commented 1 year ago

@Hi-Angel I guess you're correct. At least, I am already working on an ECMAScript-written Meson implementation, as the syntax is easily parseable and transpilable. But I also throw GN into the mix, and then users can choose what they want to use to build their Node.js distribution.

We are in a good time period where you can transpile anything fast. GN, Meson, or Python (GYP), it does not make much difference.

In the end the only important thing is feeding text into gcc or cmake or any esoteric compiler stack that you choose, and calling it a day.

eli-schwartz commented 1 year ago

I am already working on an ECMAScript-written Meson implementation, as the syntax is easily parseable and transpilable.

That's super cool. Also, we (Meson) have intentionally designed the syntax to be capable of reimplementing in other languages without undue difficulty, there's a FAQ entry about it even. :) So I'm happy to hear that that is working out for you.

Whenever you feel it's ready, I'd love to take a look at your implementation. We could list it in the Meson FAQ as well. Currently we list a couple alternative implementations of Meson, the most advanced one being a c99 version.

feeding text into gcc or cmake or any esoteric compiler stack that you choose

CMake isn't a compiler stack! :p

But yes, indeed, build systems are really just a pile of conventions for calling a compiler, with a few frills added on at the end.

frank-dspeed commented 1 year ago

Ah sorry, CMake is always automatically associated with LLVM somehow in my head.