Closed: bmeck closed this issue 7 years ago.
FWIW, I am discussing an always-asynchronous `module.import('anything')` that would solve half of the issues raised here as opt-in, and I also agree `require` should be kept synchronous.
Just my 2 cents.
Initial implementation discussion continues in https://github.com/nodejs/node/issues/11233.
`.mjs` is a bad idea. The standardized version should be `.js`.
Counter-proposal: ES6 modules are a separate world from CommonJS modules. Bridges are explicit and synchronous: `import {require} from 'commonjs'` and `module.import('some-es6-module')`.
When you want to benefit from ES6 static guarantees, use and write code that has them.
Everything is still `.js`, and clear error messages tell you that you just tried to use the wrong import mechanism.
@flying-sheep the sole purpose of a file extension is to define how to parse a file. Modules are a new parse goal, thus they need a new file extension - otherwise why have extensions anywhere?
@ljharb sure, but in this case the extension doesn't define the file's purpose: it's a JavaScript module either way.
I just wonder: if the direction is to load new modules asynchronously, what's the real issue in taking "all the time you need" to pre-fetch the target file and statically understand whether it uses ES6 `import`/`export` or the `module` global?
I fully understand this is not suitable for `require`, but ... who cares? Why can't CommonJS implement a promise-based `module.import(path):Promise` that, once the top-level async behavior is in place, would make things as simple as `require` is today?
In that case, the import is resolved only once the module kind has been determined and the right import mechanism applied.
After all, we know the extension has never been a good guard for file types: everything needs to be fetched first, analyzed in terms of meta bytes, and eventually handled according to that analysis, not just the extension. I mean, a `.png` that is actually a `.gif` (or vice versa) will be rendered correctly by a browser anyway, so the extension is historically a not fully trustworthy hint.
We can do better, since we've decided the approach is async. No need to rush conclusions based on file names and extensions, IMHO.
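For illustration, the promise-based loader idea could be sketched in userland today. `moduleImport` below is a hypothetical name, not an actual Node API, and a real implementation would also need the parse-goal detection discussed later in this thread:

```javascript
// hypothetical userland sketch of a promise-based loader on top of require;
// `moduleImport` is an illustrative name, not a real Node API
function moduleImport(path) {
  return new Promise((resolve, reject) => {
    // defer to the next turn so resolution is always asynchronous
    setImmediate(() => {
      try {
        resolve(require(path));
      } catch (err) {
        reject(err);
      }
    });
  });
}

// usage: the core 'path' module resolves asynchronously
moduleImport('path').then((mod) => {
  console.log(typeof mod.join); // 'function'
});
```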
Agree. And on the browser side there are ways to do this, like systemjs, which follows WHATWG.
extensions are just hints for the mime type. mime types aren’t generally distinguished between “module” and “script”. if this were the case, why not use `.cjs` for all commonjs modules, which need `module`, `exports`, and `require`? or `.bjs` for scripts that have access to all the browser APIs?
as said: my main point is that i don’t think commonjs and ES6 modules should interoperate using the same syntax. and once “import commonjs module from ES6” and “import ES6 module from commonjs” aren’t using the same syntax(es) as “import ES6 module from ES6” or “import commonjs module from commonjs”, we don’t need to invent a new file extension.
shipping both variants could be possible via `package.json`; `some-hybrid-module`’s `package.json` contains `"main": "bundle-commonjs.js", "module": "bundle-es6.js"`
common.js:

```js
const es6 = await module.import('some-es6-module')
const hybrid = require('some-hybrid-module')
const commonjs = require('some-commonjs-module')
```

es6.js:

```js
import {require} from 'commonjs'
import hybrid from 'some-hybrid-module'
const commonjs = require('some-commonjs-module')
```
this:

```js
const es6 = await module.import('some-es6-module')
```

is exactly what I've proposed, but I've found a lot of skeptical/refractory opinions, while to me it is the easiest possible way to move forward, and it plays well as a migration pattern.
my proposal is that this function only works with ES6 modules, though, in order to separate the ES6 and commonjs module worlds.
like i've said, the moment you load something asynchronously is the moment you can also statically analyze what it is (a bit like reading mime type information).
A parser that drops strings and comments and checks whether there is an `/^export /m` in the code doesn't seem like such an expensive operation to perform, or is it? If possible, I don't see why we should have two ways to load CJS or ES6 modules through `module.import`.
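The heuristic described above might look roughly like this sketch. `looksLikeESModule` is a hypothetical helper name; it deliberately ignores template literals and many edge cases, and the replies below explain why no such check can be fully reliable:

```javascript
// naive sketch of the heuristic: strip comments and string literals,
// then look for a line starting with `export` or `import`
function looksLikeESModule(src) {
  const stripped = src
    .replace(/\/\*[\s\S]*?\*\//g, '')     // drop block comments
    .replace(/\/\/[^\n]*/g, '')           // drop line comments
    .replace(/'(?:\\.|[^'\\])*'/g, "''")  // blank single-quoted strings
    .replace(/"(?:\\.|[^"\\])*"/g, '""'); // blank double-quoted strings
  return /^\s*(export|import)\b/m.test(stripped);
}

console.log(looksLikeESModule('export default 1'));          // true
console.log(looksLikeESModule("module.exports = 'export'")); // false
```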
@WebReflection That approach (analyze the module source to determine the parse goal) has been discussed a lot in the past. The problem is that there are scripts that are ambiguous: scripts that are valid CommonJS & valid JS modules but would do different things when executed (e.g. `Object.freeze({}).x = 42;`).
there are scripts that are ambiguous

this looks like a developer issue, not a `module.import` one. If there is a single `export`, it's ES6; otherwise it's CommonJS. I don't see any ambiguity, but if it's already been discussed, I'm sure I'm missing something and I'd like to read more about it.
Hopefully transpilers are out of the equation, since these should solve problems, not create new ones.
There is - currently - nothing in the ES module spec that would force ES modules to import or export anything. The example I gave above was a complete module. It's possible to have a module that consists only of global side effects. And such a module would either run in strict mode (if interpreted as an ES module) or not (if interpreted as a CJS module). It's impossible to tell what the intended runtime behavior was just from looking at the source code. It's for the same reason that you wouldn't (shouldn't) run node with the `--use-strict` flag.
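The ambiguity can be seen directly with the earlier example: the same one-line source text is a silent no-op as a sloppy-mode script but throws as strict-mode code, which is what an ES module would use. A small sketch using `Function` to pick the mode:

```javascript
// the same source text behaves differently depending on parse goal:
// ES modules are always strict; classic CJS scripts are sloppy by default
const src = 'Object.freeze({}).x = 42;';

// as sloppy-mode code: the assignment is silently ignored
new Function(src)(); // no error

// as strict-mode code (what an ES module would do): throws
let threw = false;
try {
  new Function('"use strict";' + src)();
} catch (err) {
  threw = err instanceof TypeError;
}
console.log(threw); // true
```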
I take for granted that any code I run is strict, and if the idea is to fix the gap between ES6 and CJS modules with a new `module.import` API proposal, I wouldn't mind at all (I would actually expect) that whatever I'm importing runs in strict mode.
This wouldn't play 100% well in terms of backward compatibility, but I really would like to know the numbers behind the "don't want to run in strict mode" camp in the CJS panorama, especially considering that it is always possible to opt into non-strict mode via `Function`, when/if really needed.
So I still believe it's a developer issue. There could be some caveats and compromises, but I see 99% of use cases happily working all together and moving forward VS something as ugly as a new `.mjs` extension would be.
At this point I'd rather say the next version of node should always load `.cjs` modules as CJS and all new modules as ES6; that'd be a more sensible long-term solution for the future, since nowadays most developers transpile to target older engines and changing the dest file name would be quite simple (yet I think `module.import(anything):Promise` is a better solution).
my proposal was that `module.import` only loads ES6 modules (and gives a special, helpful error message when you try to load a commonjs module). and ES6 modules are always strict mode.
loading commonjs modules asynchronously is useless, so `require` is fine for those.
loading commonjs modules asynchronously is useless

It's about having a single entry point to load modules, no matter how they have been published. How useless is that?
Moreover, asynchronous import enables asynchronous export too.

```js
export default await Promise.resolve().then(doStuff);
```

This can be transpiled into `module.exports = Promise.resolve().then(doStuff)`. How useless is that?
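For what it's worth, a consumer of such a transpiled module would have to await the exported promise. A sketch with the module inlined as a plain promise (no real file involved, `fakeModuleExports` is an illustrative name):

```javascript
// sketch: a CJS module whose export is itself a Promise, as the
// transpilation above would produce (inlined here instead of a real file)
const fakeModuleExports = Promise.resolve().then(() => ({ answer: 42 }));

// a consumer cannot use the result synchronously; it must await it
fakeModuleExports.then((api) => {
  console.log(api.answer); // 42
});
```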
```
├── lib
│   ├── index.js
│   ├── module.js       // ESM
│   └── legacy
│       ├── index.js
│       └── module.js   // CJS
└── package.json
```

package.json:

```json
{
  "name": "package",
  "version": "1.0.0",
  "main": "lib/legacy",
  "module": "lib"
}
```
lib/index.js:

```js
import module from './module'

export default {
  module
}
```

lib/legacy/index.js:

```js
const module = require('./module')

module.exports = {
  module
}
```

No interop between CJS and ES2015: `require('lib/legacy/*')` or `import * from 'lib/*'`.
CLI:

```shell
node require.js
node --module module.js
```

Server:

```js
// ES2015
import modern from 'lib'

// CJS
const legacy = require('lib/legacy')

// ES2015
const module = import('lib/module')
module.then((module) => module())

// CJS
const module = require('lib/legacy/module')
module()
```

Browser:

```html
<script src="lib/legacy/index.js"></script>
<script type="module" src="lib/index.js"></script>
```
@michael-ciniawsky that is still comparing potatoes and tomatoes without considering that asynchronous import also means asynchronous export when you have asynchronous dependencies in your module.
This might be considered a non-issue in node-land, but it's impossible to load modules synchronously in the browser due to deprecated synchronous XHR on the main thread.
This direction will force everyone to bundle, and it will heavily penalise the Web.
@WebReflection Could you explain this a bit further please? :)
@michael-ciniawsky how are developers going to export modules that use dynamic `import`, which is asynchronous?

```js
export default await Promise.all([
  import(mod1), // parallel asynchronous imports
  import(mod2), // as opposed to N implicit awaits
  import(mod3)  // ideal for HTTP2 scenarios
]).then(([mod1, mod2, mod3]) => {
  const myModule = {
    doStuff() {
      return mod1() + mod2() + mod3();
    }
  };
  return myModule;
});
```
There's no way you can use `require` to import a module like that; you need something else that scales better, like the `module.import` I've discussed already.
If we keep transpiling ES.next modules into `require`, we'll fail everyone's expectations and the intent to ship asynchronous modules on the Web.
I know pure NodeJS devs don't care, but npm is used more and more as a universal repository, not only for server-side code and its synchronous, quite outdated `require` mechanism.
I hope I've better explained what I meant.
lib/*.js:

```js
// one default export per file (lib/module1.js, lib/module2.js, lib/module3.js)
export default function module1 () {}
export default function module2 () {}
export default function module3 () {}
```

browser/static.js:

```js
import module1 from 'lib/module1'
import module2 from 'lib/module2'
import module3 from 'lib/module3'

const use = fetch(url).then((res) => res.json()).then((json) => module1(module2(module3(json))))
```
browser/dynamic.js:

```js
const use = fetch(url).then((res) => res.json()).then((json) => {
  return Promise.all([
    import('module1'),
    import('module2'),
    import('module3')
  ]).then(([module1, module2, module3]) => module1(module2(module3(json))))
})
```
Why export an asynchronous module? Do I miss something here? Isn't the dynamic `import()` proposal supposed to return a Promise anyways/always?

```js
export default await Promise.all([...modules]) // Is this valid?
export default async () => await Promise.all([...modules]) // ?
```
If we keep transpiling ES.next modules into require we'll fail everyone's expectations and intent to ship asynchronous modules on the Web.

There should be no transpiling involved at all: if code is transpiled to CJS, you `require()` it, and require is synchronous ¯\_(ツ)_/¯. Static/dynamic import (should) work only with ES2015 modules. Also, there is no native support for CJS in browsers.

I know pure NodeJS devs don't care, but npm is used every day more as universal repository, not only server side and its synchronous, quite outdated, require mechanism.

I'm not a 'pure' NodeJS Dev 😛, I joined the discussion here bc I'm worried about the `.mjs` proposal, which would be incompatible with ES2015 modules (`.js`) in the browser forever.
There should be a migration path for node achieved with `pkg.module` > `pkg.main` over time, and, as you mentioned npm packages, packages that don't require node core modules like `fs` should be consumable in the exact same way in node && browsers without any setup.
Why export an asynchronous module? Do I miss something here? Isn't the dynamic import() proposal supposed to return a Promise anyways/always?

Exactly, and if you export a Promise you have an asynchronous export: it's an implicit consequence.
The following:

```js
import module1 from 'lib/module1'
import module2 from 'lib/module2'
import module3 from 'lib/module3'
```

is fundamentally different from the following:

```js
Promise.all([
  import('module1'),
  import('module2'),
  import('module3')
])
```

Is this valid?

It was an example; the point is in exporting something asynchronous that cannot be transpiled into a `require()`.
require is synchronous ¯\_(ツ)_/¯

hence not a suitable transpilation target for asynchronous modules.

Static/Dynamic import (should work) works only with ES2015 Modules.

No, why is that? It can be transpiled keeping its asynchronous semantics/nature.

Also there is no native support for CJS in browsers

It never mattered, did it? Browserify already works for `require`?
Today we have bundlers; tomorrow we could use this utility, which is already asynchronous, already solves the issue, and brings CJS to the browser, or anything similar, really, brought in by bundlers or transpiled.
I'm worried about the .mjs proposal

so am I

There should be a migration path

which is what I've explained already via `module.import`
Is fundamentally different

Yep, I understand, but isn't that handled by `import()`?

It was an example, the point is in exporting something asynchronous that cannot be transpiled into a require()

Why would somebody want to do that? :) When node supports `import()` => use `import()`. `require()` is 'legacy' 😛.
hence not a suitable transpilation target for asynchronous modules

Yep, so my proposal here is to not even try :D

It never mattered, did it? Browserify works already for require?

Because Browserify, webpack, and Rollup transform CJS to e.g. an IIFE. If you have a CJS codebase => keep using one of these tools. Otherwise enjoy `<script type="module"></script>` and forget about them in this context.

which is what I've explained already via module.import

BTW I'm not against `module.import` in particular, so no offense here, but if I hear words like transpile, I'm skeptical by default 🙃
so am I
👍 💯
We're not adding much, so I'll stop here, but:

if I hear words like transpile, I'm skeptical by default

Your proposal will be brought in by tools, and those tools will transpile and generate the legacy part, 'cause I don't think anyone will ever write that hierarchy by hand with duplicated modules.
Hence my point that `require` is not a good target for asynchronous modules; we need a better mechanism in CommonJS land.
I honestly don't understand what's so difficult to digest about it (not talking about you, but in general).
Your proposal will be brought in by tools and tools that will transpile and generate the legacy part 'cause I don't think anyone will ever write that hierarchy by hand with duplicated modules.

Well, I can't really tell (that would be speculative); maybe you're right. As long as `.mjs` is not going to happen / can be avoided, anything better is highly welcome and appreciated I guess. My repetition of the `package.json` proposal by Yehuda Katz && Dave Herman should only function as a reminder :D. It's by no means the only solution, but the most promising imho.
OK, so let me rehash my idea with new information.
Both CommonJS modules (no `.cjs`) and ES6 modules (no `.mjs`) keep the `.js` extension.
| Module type | Exports | Imports |
|---|---|---|
| CommonJS | `module.exports = CJS` | `require('CJS')`, `await import('ES6')`, `module.import('ES6')` |
| ECMAScript 6 | `export [default] ES6` | `import [... from] 'ES6'`, `await import('ES6')`, `import require from 'commonjs'; require('CJS')` |
`require()` can't import ES6 modules. (NodeJS does not get top-level `await`, so we add a synchronous `module.import()` that serves that role.) `require('mod')` will result in a syntax error if `mod.js` contains `export` or `import` statements.
`import [... from]` and `import()` can't import CJS modules. (They don't benefit from asynchronous loading anyway.) `import 'mod'` will result in a "name undefined" error if `mod.js` does not define `module` or `require` but uses them (e.g. if it's accidentally a CJS module someone tries to import with an ES6 mechanism).
If a file `mod.js` exists for its side effects and contains no exports or imports of any kind (or only `import()`), it's both a valid ES6 and CJS module and can be imported by both. (It could behave differently due to implicit strict mode in ES6 modules, but there's no way around it: the preceding sentence is no proposal, but already fact.)
An npm package's `package.json` can contain the `main` and `module` entry points for CJS and ES6, respectively, turning the package hybrid (but not its modules! `require('pkg')` will import a different file than `import 'pkg'`!)
Builtin node packages like `fs` will behave that way and be importable with every mechanism.
A consequence is that modules written in ES6 but relying on babel or so to transform `import` into `require()` (transforming them into CJS modules) can be converted into true ES6 modules.
For this, the build step will be removed, and each imported non-hybrid package has to be replaced manually or by a `package.json`-aware codemod: `import foo from 'cjs-only'` → `import require from 'commonjs'; const foo = require('cjs-only')`
`module.import` (or the same thing under another name) needs to be added; files run with (`import`) or without (`require`) strict mode.
) strict mode@flying-sheep
NodeJS gets top-level await, so import() serves that role
note that while import()
is coming to Script/CJS top level await can't since it changes run to completion. It would mean CJS would not run to completion in a blocking manner. With Node version 7.6 it is possible to change the .js
file wrapper to an async function to achieve the proper grammar, but it breaks how CJS works fundamentally since all CJS would return Promises.
import [... from] and import() can't import CJS modules

I assume that builtins could still be loaded even though they will stay CJS.

import 'mod' will result in a "name undefined error" if mod.js accesses module or require.

Unclear what this means: do you mean if they use variables named `module` or `require`? What would `var require = () => {};` in your source text mean wrt this error?

if mod.js exists for side effects and contains no exports or imports of any kind (or only import()), it's both a valid ES6 and CJS module and can be imported by both.

Note, this can change the behavior of a source text.
note that while import() is coming to Script/CJS top level await can't since it changes run to completion

damn, makes sense. i’ll add `module.import` again. /edit: done

I assume that builtins could still be loaded even though they will stay CJS.

i assume so too. i added a bit about hybrid packages, which would fit here.

Unclear what this means, you mean if they use variables named module or require?

i meant that if you accidentally use an ES6 import mechanism to import a CJS file, it will just encounter access to the undefined variable names `module` or `require` and throw the usual error. defining them of course works normally.

Note, this can change the behavior of a source text.

yes, you need to be aware of this pitfall (or `use strict` in all CJS), but there’s no way to avoid it AFAIK. files without imports/exports are valid CJS and ES6 modules already.
/edit: clarified everything
"require can't import ES modules" makes it a nonstarter imo, because that means I can't transparently refactor my CJS module to an ES module, which is a critical goal.
@ljharb that can be ok as long as poly packaging works so you can ship a CJS form as well.
@ljharb that’s not true.
require('relative-path')
calls to import ...
@ljharb that’s not true: `require('relative-path')` calls can be converted to `import ...` calls.
yeah, as you said: that’s easy due to hybrid packages.
@flying-sheep in order to support things like deep linking à la `require("lodash/chunk")` you need to have the full set of things, just like Defense of `.js`.
yeah, i always hated that `main` doesn’t work as a root for paths like this.
@bmeck Will something like `module.import()` or `require.import()` be implemented? 😕 I think they are adding even more noise and confusion. Nice work so far, the proposal is becoming much cleaner; still emphasizing to reconsider `.mjs` if possible 😛

as long as poly packaging works

Is `package.json` not 100% off the table, or am I not getting it 🙃?
@michael-ciniawsky any solution that encapsulates use cases and is not prohibitive to adoption or education is open for discussion. Defense of .js is a very well thought out solution, but remains complex in order to achieve high use case coverage.
As per `*.import()`, I don't think it will be shipped in light of `import()` coming down the pipe.
[edit] note that a goal is hopefully to allow a path where all new code can be written in ESM without causing a perpetual burden to either Node or Browsers.
As per *.import() I don't think it will be shipped in light of import() coming down the pipe.
👍
What parts of Defense for '.js' are prohibitive ?
but remains complex in order to achieve high use case coverage
?
What parts of Defense for '.js' are prohibitive?

- `package.json` changing how files are interpreted: checking `package.json` becomes something to do whenever you are in doubt. If code uses `import()` the intention is still unknown.
- `"module"` swaps the whole package unless you have a `"main"`; kind of odd if you add a `"main"` later on to support CJS.
- `"module.root"` changes pathing behavior in odd ways sometimes, like how `import("../")` wouldn't escape your package, it would escape the `"module.root"`.
- `import "foo/bar"` and it not mapping to disk.

As has been stated many times, the decisions here are about weighing cost/advantage of all the things. There are also downsides to `.mjs` that have been discussed as well. The prevailing thought has been that it is simple and does not leave a lasting persistent burden when moving between environments or learning for the first time.
"module" swaps the whole package unless you have a "main". kind of odd if you add a "main" later on to support CJS
IMO as things move forward, this scenario would be rare. 🌷
"module" swaps the whole package unless you have a "main". kind of odd if you add a "main" later on to support CJS.
That shouldn't be the case of course, module
applies where it is set. Everything else is treaten like main
(CJS)
"module.root" changes pathing behavior in odd ways sometimes, like how import("../") wouldn't escape your package it would escape the "module.root"
module.root
is awkward. Would it be mandantorily needed? e.g => module: [ entry.js, lib ]
etc.
doesn't allow easy mixing in the same directory (if porting a large app you have to place all ESM in the root and all CJS outside the root

kk, is separation not even better? Why in `root` only, is in different directories not enough?
File based CLI usage without `package.json`:

```shell
node --module module.js
```

or with a `$HOME/.noderc` (so that `node module.js` suffices):

```
{
  "module": true // node => node --module
}
```
@michael-ciniawsky

module.root is awkward. Would it be mandantorily needed? e.g => module: [ entry.js, lib ] etc.

There were discussions of using `glob`-like patterns to whitelist which files should be treated as ESM. This is the `"modules"` field I didn't bring up in the last comment.

$HOME/.noderc

Would be adding a file in a manner that we haven't seen before. Unsure how core would feel about making a settings file standard. You could bring it up though; it might be useful even w/o ESM.

kk, is separation not even better? Why in root only, is in different directories not enough?

This is complicated; it depends on what you set up in `"modules"` if you have `"main"`. Reading the proposal would explain this.
File based CLI usage without package.json

`--module` is also needed with `.mjs` for STDIN/`-e`/`-p`, but for `.js`, `package.json`-based approaches do not give information on how files are supposed to run. Reading the file to figure out if it is ESM or not is necessary; even then, if ambiguous, check the docs.
- On package.json: we add 3 new fields in package.json (which are interdependent), potentially a settings file, and still gotchas about figuring out if a .js is supposed to be one mode or the other.
- On file extension: we add a character to your file, and make sure to update code editors etc. to know about the MIME.
- On pragma / grammar change: we are similar to file extension (w/o the MIME question), but won't get it through standards. Questions remain about ambiguity if not mandatory.
if consumers have to know or care about what module format I'm using, it's going to be a disaster for everyone

As a module consumer (I've never written any npm packages), I would rather be aware of which module format I'm using than be bitten by unexpected bugs. For example, webpack supports the AMD module format and it has already caused issues for me (a leaking global).
If I'm maintaining an old codebase (`require`s all over the place), I would rather expect module authors to publish a commonjs version if there's a will to support older versions of node. If a module author doesn't support new versions of node, I'd still be forced not to upgrade to a new node, since there's no guarantee it will work the same.
If I'm maintaining a new codebase, I would rather expect a new version of an npm package to contain the es6 version of a module than have to append `-es` in every import line.
Would it be reasonable to expect a `--disable-commonjs` flag for people who want the cleanest and freshest version of node? Even if it means that their choice of libraries would be limited.
The above is from a consumer perspective. Producers would probably want their packages to work indefinitely without any maintenance.
web UI problem caused me to squash, old branch is at https://github.com/bmeck/node-eps/tree/rewrite-esm-bak
@targos please re-review
What about direct execution of ESM code from the CLI? Should we specify this here?
I'm thinking about `node -e "import foo from 'bar'; foo();"`
@targos I've left the CLI flags out of this EP as it is not something I think belongs in the EP itself. The general consensus is that you must include a `--module` flag to change the goal of source passed via argv or STDIN.
landed in 6cc060e94e56859bdb446a0820ef4704731ff0a8
some discussions and problems with supporting various behaviors on a VM level, races with browsers, and upcoming spec changes have led to a drastic change in direction for the interop bridge.
https://gist.github.com/bmeck/52ee45e7c34d1eac44ce8c5fe436d753 has some relevant notes
notable:

- `import` is URL based
  - `../`, `./`, or `/` are used as a base URL against the current script; otherwise `node_modules` is searched
  - no `../` escaping out of `node_modules`
- `import` ALWAYS unwinds the stack prior to any evaluation
  - `import('old.js')` (spec being written) would always unwind the stack prior to evaluating
  - `Promise<ModuleNamespace>` always, never synchronous
- `default` export
- removed: `__filename`, `__dirname`, `require`, `module`, `exports`, `arguments`
- different `this` value
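The "never synchronous" point can be observed directly, assuming a Node version where dynamic `import()` is available in CJS:

```javascript
// even for an already-loaded builtin, import() settles in a later tick,
// never synchronously in the current one
let resolved = false;
const p = import('path');
p.then(() => { resolved = true; });

console.log(resolved);                       // false: never synchronous
p.then((ns) => console.log(typeof ns.join)); // 'function'
```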