Is your feature request related to a problem? Please describe.
In the JavaScript ecosystem, there are many problems that could, in theory, be fixed with custom user-defined module resolution configuration.
Aliasing a module
When aliasing a module, you must tell every tool you use how to find the aliased module. For example, to switch a project from React to Preact (a high-level, common use case), you must configure TypeScript (or alternatives, e.g. Flow) if you use types, Webpack (or alternatives, e.g. Snowpack) if you use a bundler, and **ESLint** (or alternatives) so you don't get IDE errors.
This divide in the ecosystem makes aliasing as a whole very unfriendly and error-prone.
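To make this concrete, a React-to-Preact alias has to be repeated in each tool's own configuration. Below is a hedged sketch of just the Webpack side (the exact keys depend on the project setup; TypeScript would additionally need a matching `paths` entry in tsconfig.json, and ESLint a matching import-resolver setting):

```js
// webpack.config.js - redirect React imports to Preact's compatibility layer
module.exports = {
  resolve: {
    alias: {
      react: "preact/compat",
      "react-dom": "preact/compat",
    },
  },
};
```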
Ecosystem unity
When attempting to import a resource, every tool must agree on how to find it (as mentioned previously). If you use svgr, a library that turns SVGs into React components (a high-level, relatively common problem), every tool must agree on how to import those SVGs. If you use TypeScript, you must create .d.ts files so that correct IntelliSense and typing can be provided. If you use a bundler such as Webpack, you must tell it to use a loader to transform the SVG files so that they can be bundled.
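For instance, the TypeScript side alone typically needs an ambient declaration; a minimal sketch, assuming a setup where an imported SVG's default export is the React component (the file name and exact typing here are assumptions about a typical setup, not something svgr generates for you):

```ts
// svg.d.ts - tell TypeScript what an imported *.svg module looks like
declare module "*.svg" {
  import * as React from "react";
  const ReactComponent: React.FC<React.SVGProps<SVGSVGElement>>;
  export default ReactComponent;
}
```

The bundler then needs its own loader rule (e.g. @svgr/webpack) to actually perform the transform, and other tools need their own equivalents.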
Better monorepo support
Attempting to build monorepos in JS (i.e. multiple projects sharing code) requires an immense amount of effort, and is not at all comparable to the beautiful experience you get from languages such as Rust or C#. In my experience, getting monorepos to work requires cooperation from the package manager (which Yarn thankfully supports) and from your tooling. In my specific use case, I have my source code in a src directory with package.json outside of it. Because I use symlinks that point to the directory containing the package.json, I had to add /src to my import paths to get tooling such as TypeScript and Webpack to recognize where the code was. I attempted to use export maps to avoid this, but that is not supported.
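As a rough illustration of the export-map attempt (the package name and file paths here are hypothetical), the idea was for the package's `exports` field to hide the /src segment from consumers:

```json
{
  "name": "my-shared-package",
  "exports": {
    ".": "./src/index.js",
    "./*": "./src/*.js"
  }
}
```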
Solving node_modules
With the ability to override the module resolution algorithm, node_modules would no longer need to be a pain point for developers.
In today's world, node_modules is a source of frustration for many developers. It is very I/O heavy to set up, it wastes a lot of disk space, and it can get out of sync with the dependencies listed in package.json. Making the module resolution algorithm configurable would allow massive improvements to the ecosystem as a whole, as individual solutions to the problem could be built on top of it, and would thus work with any and all infrastructure that supports the new configurable resolution.
For example, Yarn PnP is a solution to the node_modules problem, but it lacks the ecosystem-wide compatibility to take off. pnpm isn't a perfect solution either, but it does tackle the flattening of node_modules by using symlinks in a way that Node, and other projects that properly implement the Node module resolution algorithm, can still find the modules. However, this is not without drawbacks, as existing infrastructure that improperly implements the resolution algorithm will break if the modules are not laid out as it expects.
Describe the solution you'd like
My suggestion is a .resolve.js file inside node_modules that Node executes to resolve module paths. It would have a single export: a function that, given the path of the current script and the module name to be found, returns a promise for another path that is resolvable. For example, consider the following structure:
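A minimal sketch of such a layout (the names some-module, the .store directory, and loadModuleFromPath are illustrative assumptions, not an existing Node API):

```
/
├── index.js            (contains require("some-module"))
├── package.json
└── node_modules/
    ├── .resolve.js
    └── .store/
        └── some-module/
            └── index.js
```

A generated .resolve.js for that layout might look roughly like this:

```js
// node_modules/.resolve.js - illustrative only; a package manager would generate this
module.exports = async function resolve(callerPath, moduleName) {
  // Map every bare specifier into the hypothetical .store directory.
  return `/node_modules/.store/${moduleName}/index.js`;
};
```

Running node /index.js would roughly lead to the following code being executed (again a sketch, since no such hook exists in Node today):

```js
// Pseudo-code for what Node could do internally when index.js calls require("some-module")
const resolve = require("/node_modules/.resolve.js");
const resolvedPath = await resolve("/index.js", "some-module");
// resolvedPath === "/node_modules/.store/some-module/index.js"
const someModule = loadModuleFromPath(resolvedPath); // Node's normal module loading
```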
This solution offers 100% customizability of the code being required.
Describe alternatives you've considered
Ideally, the solution would be easily statically analyzable and thus easily implementable in other languages without requiring a Node.js runtime. However, because unforeseen innovations may require custom functionality, and because a static format would require other vendors to implement every change Node makes (which we cannot expect vendors to do), the simplest and most ecosystem-wide-compatible option is a JS file that executes to resolve module paths.
The resolver is asynchronous so that asynchronous calls can be made rather than synchronous ones (e.g. fs.readFile vs fs.readFileSync), but given that require itself is synchronous, this may have to be altered.
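In other words, the two shapes under discussion would be (neither exists today; this is only to show the difference):

```js
// Asynchronous resolver: free to use fs.promises.readFile, network lookups, etc.
module.exports = async (callerPath, moduleName) => "/some/resolved/path.js";

// Synchronous resolver: may be what require() actually needs; limited to fs.readFileSync etc.
module.exports = (callerPath, moduleName) => "/some/resolved/path.js";
```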
The reasoning for placing .resolve.js inside node_modules is that it is a file that shouldn't be committed; rather, package managers such as yarn or npm could auto-generate it. Perhaps it would be better to place it next to package.json so that it does get committed, however, as the current tooling ecosystem doesn't coexist very well and other projects would likely need their own boilerplate inserted into the file.
Passing the path of the current script as an argument to the function is suggested so that custom node_modules algorithms can be aware of the caller's context. The use case in mind is proper versioning support for modules. Consider the following scenario, where a project looks as such:
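For example (only is-number, its 6.0.0 path, and depends-on-is-number-6 are taken from the scenario described below; the rest of the layout is assumed), a flat node_modules keyed by version could look like:

```
/
├── index.js                          (requires depends-on-is-number-6)
└── node_modules/
    ├── .resolve.js
    ├── depends-on-is-number-6/
    │   └── index.js                  (calls require("is-number"))
    └── is-number/
        ├── 5.0.0/
        │   └── index.js
        └── 6.0.0/
            └── index.js
```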
The dependency depends-on-is-number-6 could call require("is-number"), and .resolve.js could yield ./is-number/6.0.0/index.js to Node.js, thus maintaining a flat node_modules structure. A more realistic example would be a package manager that stores several versions of each package in a cache and redirects imports to the correct location.
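A resolver for that layout could pick the version from the caller's path, for example (illustrative only; in practice the version table would be generated from the lockfile, and the fallback behaviour is left unspecified here):

```js
// node_modules/.resolve.js - hypothetical, generated by the package manager
const versionsByCaller = {
  "/node_modules/depends-on-is-number-6/index.js": { "is-number": "6.0.0" },
  "/index.js": { "is-number": "5.0.0" },
};

module.exports = async (callerPath, moduleName) => {
  const pinned = versionsByCaller[callerPath]?.[moduleName];
  if (pinned) {
    return `/node_modules/${moduleName}/${pinned}/index.js`;
  }
  return null; // fall back to the default resolution (behaviour to be bike-shedded)
};
```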
Overall, there is a lot of bike-shedding left to do, since this is something especially important to get right for ecosystem unity; however, it would all be for naught if this idea is deemed unworthy.