Open victorb opened 5 years ago
If this solves this issue I am all for it https://stackoverflow.com/questions/48287776/automatically-build-npm-module-on-install-from-github
I just want to be able to fork a JS module, make any changes I need, open a PR to the upstream library, and use my forked branch while I wait for that lib maintainer to get it merged. Currently this is the most annoying process I have ever seen (I come from a native mobile dev background) and there is no solution, just a bunch of hacks.
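For what it's worth, npm does run the `prepare` lifecycle script when installing a dependency from a git URL, so a fork can declare its own build step there. A minimal sketch (the script contents are illustrative, not from any particular package):

```json
{
  "scripts": {
    "build": "babel src -d dist",
    "prepare": "npm run build"
  }
}
```

With that in place, something like `npm install github:myuser/some-lib#my-fix-branch` builds the fork on install. It only helps if the upstream package already has (or your fork adds) a `prepare` script, which is part of why this still feels like a hack.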
The `scripts` section is currently used for a lot of different use cases. The most common in my experience are starting different development environments, building, linting, various testing scenarios, and starting the app (while setting environment variables). But it's also often used to move/copy files, do git versioning (git tag/push and npm publish), and run post-installation setup. Scripts also often depend on one another.

I think removing `scripts` will require another mechanism for all of the above use cases.
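A typical `scripts` section covering those use cases might look like this (the tool and script names are illustrative, not from any specific project):

```json
{
  "scripts": {
    "dev": "NODE_ENV=development webpack-dev-server",
    "build": "babel src -d dist",
    "lint": "eslint src",
    "test": "jest",
    "start": "NODE_ENV=production node server.js",
    "postinstall": "node scripts/setup.js",
    "release": "npm run build && npm version patch && git push --tags && npm publish"
  }
}
```

Note how `release` chains other scripts via `npm run build` and mixes in git/npm publishing, which is exactly the "depend on one another" pattern described above.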
In general, an npm package falls into one of these categories:
Not having to pre-build a package would be great but requires cooperation of transpile tools like Babel/webpack/rollup to be successful.
Number 4 is a direct consequence of ~build tools not being able to build source code of dependents~ - see comment https://github.com/open-services/open-registry/issues/16#issuecomment-487006942 .
I would also like to add that JS libraries fall into different categories.
Last comment ;)
The real flexibility of `scripts` is that the binaries of all installed modules are automatically put on the PATH, npm wraps common shell programs cross-platform, every key/value in package.json is available, and there is a single CLI command to remember: `npm run`/`yarn run`/etc.
And it's nice that all project scripts are gathered in one place.
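A minimal sketch of the PATH behaviour mentioned above: npm prepends `./node_modules/.bin` to the PATH before running a script, so locally installed CLIs work without an explicit path. Simulated here with a fake local binary, so no npm is needed:

```shell
# Fake a locally installed CLI, as npm would place it in node_modules/.bin
mkdir -p node_modules/.bin
printf '#!/bin/sh\necho "hello from local bin"\n' > node_modules/.bin/mytool
chmod +x node_modules/.bin/mytool

# Roughly what `npm run` does before invoking your script:
PATH="$PWD/node_modules/.bin:$PATH" mytool   # prints "hello from local bin"
```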
> Not having to pre-build a package would be great but requires cooperation of transpile tools like Babel/webpack/rollup to be successful.
>
> Number 4 is a direct consequence of build tools not being able to build source code of dependents.
Rollup and webpack are able to compile dependents if they provide the non-standard `module` key in package.json.
A great collection of resources are found here: https://stackoverflow.com/q/42708484/205696
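For reference, the `module` field sits alongside `main` in package.json and points bundlers at an ES-module entry point (the values below are illustrative):

```json
{
  "name": "my-lib",
  "main": "dist/my-lib.cjs.js",
  "module": "dist/my-lib.esm.js"
}
```

Webpack's default `resolve.mainFields` for web targets checks `module` before `main`, and rollup's node-resolve plugin does the same, which is how those two can pick up untranspiled ES-module source from a dependency.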
AFAIK this is not true for Babel. Not sure about Parcel.
At this point it's important to make the distinction between bundlers (webpack, parcel, rollup, etc.), transpilers (Babel, the TypeScript CLI, the CoffeeScript CLI, etc.) and task runners (grunt, gulp, make, etc.).

While bundlers should know how to compile dependents, it's not clear whether transpilers should. Task runners are by nature open to doing whatever you want them to do, and the npm client kind of falls into the task runner category.
My number 4 point comes from a use case I have right now (but I've used `scripts` plenty of other times to copy/move files). We are using a paid third-party library, PSPDFKit, in different apps. On top of PSPDFKit we have built some common code that needs to run in all of our apps.
One app is a PHP Laravel app and the other is a PWA that runs in either Electron or a browser, with Node.js as the backend when running in browsers.
The issue is that the Laravel app uses a black-box Docker image to get the client-side browser PSPDFKit code, while the PWA uses an npm package that is transpiled with the project. In the Laravel app we cannot import PSPDFKit; we include it as a script file in the browser. In the PWA we import PSPDFKit into the project and build it together.

As a result, we cannot create a single build of our common code that both treats PSPDFKit as an external lib and has it imported into the bundle.

So we share the code as an npm package and copy it into the two apps' source code directories. The two apps then have different build processes, where one bundles PSPDFKit and the other does not.
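The copy step described above boils down to something like the following (all paths here are hypothetical, standing in for the two real app trees):

```shell
# Hypothetical layout: one shared library, two consuming apps.
mkdir -p shared-pdf-lib/src laravel-app/resources/js pwa-app/src
echo "export const version = '1.0.0';" > shared-pdf-lib/src/index.js

# Copy the shared source into each app; each app then runs its own
# build (one bundling PSPDFKit, the other treating it as external).
cp -R shared-pdf-lib/src laravel-app/resources/js/shared-pdf-lib
cp -R shared-pdf-lib/src pwa-app/src/shared-pdf-lib
```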
Excuse my naivete. I see reproducible builds as a combination of a build environment (e.g. a docker image hash) and complete lack of network access during the build. Will that be feasible?
@gritzko If you can narrow down what constitutes a build, then yes. Since build dependencies usually have daily updates, you would need daily Docker images if you want the latest bug/security fixes. That is a pretty big burden, both for developers to maintain and probably also to host.
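To make the "build environment as an image hash" idea concrete, a hypothetical sketch (the digest below is a placeholder, not a real image):

```dockerfile
# Pin the base image by digest rather than a mutable tag, so the
# toolchain is fixed for every rebuild.
FROM node:10-alpine@sha256:<placeholder-digest>

WORKDIR /build
COPY package.json package-lock.json ./
# `npm ci` installs exactly what the lockfile specifies.
RUN npm ci
COPY . .
RUN npm run build
```

Full network isolation during the build would additionally need something like `docker build --network=none` plus a pre-populated npm cache, which is where the daily-image maintenance burden comes in.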
`ls -g node_modules/ | wc -l` in a current project I'm working on gives me 751. This is counting only first-level dependencies. `ls -g node_modules/@babel/ | wc -l`, for instance, gives me 72 dependencies under the `@babel` namespace.
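As a side note, piping `ls -g` into `wc -l` over-counts by one because of the `total` line that `ls -g` prints; `ls -1` (one entry per line) avoids that. A small sketch with a fake dependency tree:

```shell
# Fake node_modules tree for illustration.
mkdir -p node_modules/@babel/core node_modules/@babel/parser node_modules/lodash

ls -1 node_modules | wc -l          # top-level entries (a scope counts once)
ls -1 node_modules/@babel | wc -l   # packages under the @babel scope
```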
I think a less ambitious scope will be more likely to succeed.
Perhaps reproducible builds for Node.js addons would be interesting to pursue?
@gritzko I assume you mean something like this for building with emscripten: https://developers.google.com/web/updates/2019/01/emscripten-npm
I really think it's a great idea to have a DSL for building modules. It would make my life easier.
But I fear what "get rid of the scripts field in the package.json" will entail...
> I've been wondering for a bit if we can't get rid of the `scripts` field in the package.json (only real use case for that seems to be building cross-platform binaries) and instead offer some DSL for providing instructions to a build server.
One worry today with the npm registry is the disconnect between what's in the registry and what's in the source. Sometimes there are hidden steps between the source and what gets published.
I've been wondering for a bit if we can't get rid of the `scripts` field in package.json (the only real use case for that seems to be building cross-platform binaries) and instead offer some DSL for providing instructions to a build server.

So when you publish, you wouldn't do `build > publish` but rather just tell the build server to pull down the latest copy and run the build in an isolated environment.

We could provide cross-platform builds for the community, and possibly also a testing server, all run by and provided for the community.
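No such DSL exists today, but as a purely hypothetical illustration of what build-server instructions could replace `scripts` with, it might look something like this (every key and value here is invented):

```yaml
# Hypothetical build-server manifest -- not a real format.
source: github.com/example/some-lib#v1.2.3
environment:
  node: "10.15"
  os: [linux, macos, windows]
steps:
  - install: lockfile-only
  - build: dist/
  - test: true
publish:
  artifact: dist/
```

The point of such a format over free-form shell in `scripts` would be that the server, not the publisher, controls execution, so what lands in the registry is verifiably derived from the source.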