Closed: joliss closed this issue 9 years ago
I would have thought this module would already do the `module.exports = Foo` thing when it encounters an `export default Foo` in a package.

`export default` was included because Node and other package ecosystems proved the value of single-exports, and without it interop would have been really painful. I'm very +1 on converting `default` exports to `module.exports`.
The problem is that the ES6 semantic module and the Node/AMD semantic model are fundamentally incompatible, by design (X_x). Trying to reconcile them, e.g. by making a default export and the module instance object into the same thing like Node does, will lead to numerous problems of a much worse and more subtle sort than the existing ones.
The existing solution has the virtue that transpiled code will work the same as non-transpiled code. On the contrary, the proposed "daring suggestion" will break that property. You will build up a large library of code meant to work with the transpiler, and then try to use native ES6 support, and everything will break, because you used the transpiler's daring conflating-semantics instead of ES6's separation-semantics. This hazard seems, to me, unacceptable.
The best solution, I believe, is to work on modifications to Node's module loading mechanism to be ES6-aware. This could be done in user-space with `require.extensions[".js"]` hacks. This is the only forward-compatible path I see that does not blow up when actual ES6 support lands in V8.
I believe this is a solved problem. @wycats/@thomasboyt should likely confirm.

``` js
export $, { ajax } from 'jQuery';
```

transpiles to:

``` js
// tag the root exported object with the es6 additions
jQuery._es6_module_something = {
  ajax: jQuery.ajax
};
module.exports = jQuery;
```

imports should be mostly compatible
So (Stef, please correct me if I misunderstood) the idea is that the transpiler would tack `._es6_module_exports` onto the `default` export. This works because the `default` export must always be an object or a function, never a primitive type. Node supports `defineProperty`, so we can use that to make `_es6_module_exports` non-enumerable as well, to minimize the chances of the property getting in the way.

This seems reasonable to me.
This problem goes beyond the server use-case AFAIK. For example: when trying to do interoperation between an AMD module and a module transpiled to AMD, which is a common use-case if you have a large-scale project with a lot of legacy pieces where new pieces are going to be written in ES6. The AMD module will have to know the nature of the transpiler when importing stuff (e.g. know that `mod['default']` is the actual default export).
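To make that concrete, here is a minimal sketch of what an untranspiled consumer would be forced to write, with a stubbed module object standing in for a real `require`/`define` result:

``` js
// Stub of what the transpiler would emit for a module with a default export:
var mod = { 'default': function () { return 'hi'; } };

// The untranspiled AMD/CJS consumer must know the transpiler's convention
// and unwrap the default export by hand:
var theDefault = mod['default'];
```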
We have been thinking about this issue for a while, and the only solution that I can think of is the inverse of what @stefanpenner is proposing. Here is the pseudo-code for a module exporting a default function:
``` js
// es6
function Foo() {
  // default function to export
}
export default Foo;
```

``` js
// transpiled to CJS
function Foo() {
  // default function to export
}
module.exports = function () {
  return module.exports['default'].apply(this, arguments);
};
Object.defineProperty(module.exports, 'default', { value: Foo });
```
This approach will work just fine in CJS, AMD and YUI. We can define more properties on `module.exports`, in case we need more regular exports, and we can even freeze them to be more aligned with ES6. Of course, for older browsers we can use regular properties.
As for exporting a default object instead of a function, we can do the same by using the function as a shim for the actual object:
``` js
// es6
function SomethingElse() {
  // function to export
}
var foo = {
  bar: function () {},
  baz: 2
};
export default foo;
export SomethingElse;
```

``` js
// transpiled to AMD
function SomethingElse() {
  // function to export
}
var foo = {
  bar: function () {},
  baz: 2
};
module.exports = function () {
  // we could throw in here...
};
Object.defineProperty(module.exports, 'bar', { value: foo.bar });
Object.defineProperty(module.exports, 'baz', { value: foo.baz });
Object.defineProperty(module.exports, 'SomethingElse', { value: SomethingElse });
```
Does this make any sense? /cc @ericf
I would encourage everyone to re-read #66 and #69, where this all has been discussed before. In particular, they give concrete examples of the hazard I spoke of above, where transpilation strategies like this would allow you to write code that is not ES6-semantics compatible, even though it uses ES6 syntax.
@caridy not saying anything negative about your suggestion, but my example was not a proposal; rather, I believe it is the solution @wycats / @dherman / @thomasboyt and friends came up with. Although the transpiler does not implement it yet, the plan is for it to do so. If you have specific concerns, it is likely important to bring them up.
@domenic wrote:
> The best solution, I believe, is to work on modifications to Node's module loading mechanism to be ES6 aware. This could be done in user-space with `require.extensions[".js"]` hacks. This is the only forward-compatible path I see that does not blow up when actual ES6 support lands in V8.
It sounds like these hacks would have to be activated before require'ing the transpiled module in question. As a library author, if I'm putting an ES6-to-Node-transpiled module on GitHub (say like rsvp), I probably wouldn't want to ask users to activate hacks before require'ing my module, would I?
Also, would we be able to isolate this to ES6 modules? I'm also worried that this might make the module loader infrastructure brittle in subtle ways.
Can you elaborate a bit more on what you're envisaging?
Re @caridy's suggestion: @caridy, I'm not sure I understand your code, especially why the default export is turning into a function. Ping me on IRC maybe? I also believe @domenic's worry of creating a hazard when we move to untranspiled ES6 is quite justified: We wouldn't want `import defaultExport from "foo"; console.log(defaultExport.namedExport)` to work in transpiled code, because it doesn't work in ES6.
Re @stefanpenner's solution:
Stef, I wonder about the problem that @caridy alluded to: In untranspiled code, how are we planning to consume named ES6 exports? It seems to me that you'd end up with
``` js
define(["jQuery"], function(jQuery) {
  // This is not transpiled, but actual code that library users would have to write:
  var ajax = jQuery._es6_module_exports.ajax;
});
```

or

``` js
var ajax = require('jquery')._es6_module_exports.ajax;
```

This syntax is so awkward that, as a library author, you basically can't use the named exports feature for your external API. (Assuming you care about Node or AMD, which most of the time you do.) You could of course copy your library's named exports onto the `default` export object, but that seems cumbersome, at least for new modules.
Is this limitation acceptable? (Or am I missing some other solution?)
@domenic, am I correct that the forward-compatibility hazard you're talking about doesn't apply to Stef's example? I believe `module foo from "foo"` would transpile to `var foo = require('foo')._es6_module_exports`. So as long as code doesn't rely on having the clearly-transpiler-internal `_es6_module_exports` property on the default export, would this allow anything in ES5 that would be impossible in real ES6?
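For illustration, here is a simulation of that transpilation, using a stubbed module object in place of `require('foo')`. The `_es6_module_exports` name is this thread's proposal, not a shipped API:

``` js
// Stub standing in for require('foo') under the proposed scheme:
var _raw = { _es6_module_exports: { ajax: function () { return 'ajax'; } } };

// `module foo from "foo"` would transpile to:
var foo = _raw._es6_module_exports;

// and a named import like `import { ajax } from "foo"` would then read:
var ajax = foo.ajax;
```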
About the "In untranspiled code, how are we planning to consume transpiled modules?" scenario: what we do today in YUI is to have a feature flag (very similar to `_es6_module_exports`), plus a small modification to the YUI loader to be able to load those modules transpiled from ES6, and treat them differently. To be precise, if you look into the YUI transpiler, we add an `"es": true` token into the transpiled module meta to let the YUI Loader treat them differently in terms of default exports vs regular exports. We could perfectly use the `_es6_module_exports` flag instead. So, it seems that the problem that we are discussing here is a multi-dimensional problem that involves the transpiler and the corresponding loader used to load those transpiled modules.
My main concern, though, is the future, because today we have a two-way street with one way being blocked by construction :), and that's why we are looking for solutions to write ES modules and be able to use them in some of the existing loader implementations. But the problem of tomorrow (when the blocked lane gets ready) is going to be how to load legacy modules (AMD, CJS, YUI, etc.) through an ES Loader instance without changing them (or transpiling them), otherwise we will end up transpiling them into ES, lol.

That's why `var ajax = require('jquery')._es6_module_exports.ajax;` is just unacceptable today, because it is going to be a pain in the future when the ES jquery module gets loaded by an ES Loader, and gets used by a legacy CJS module that is trying to access the `._es_module_exports` member, and there is little that the loader can do to facilitate that, because it will not know how the legacy module is going to use the imported ES module.
Btw, if we end up choosing `_es6_module_exports`, let's make sure we use `_es_module_exports` instead; adding the version of the spec is an anti-pattern.

`es_module_spec` is a bit silly because technically Node and AMD are all ES. Because of 1JS, ES7 modules will just be a superset of ES6 modules. What about `_syntactic_modules_` or `_built_in_modules_`?
Another option would be for ES-friendly loaders to produce a generated symbol (a la jQuery) at `System.brand`; loaders use that one if they find it on boot, or generate a new one if not. Then all generated modules can do `exports[System.brand] = ...`
The package is called `es6-module-transpiler`; let's just call them `__es6_transpiled_module__`s.
@wycats: I think the generated symbol through `System.brand` is flexible enough.

But we have been trying out a few ideas with the loader, and there is one thing that is getting messy when using `System.import()` to import various modules. From what I can tell, `System.import()` will provide access to an object with all named exports, and it is the responsibility of the user to access the `default` export manually, which is different from all other loaders out there (CJS, YUI and AMD), which means we will end up doing something like:
``` js
System.import(['foo'], function (foo) {
  // how to access the default export in here? maybe `foo[System.brand]`?
  // and how to access a named export in here? maybe `foo.something`?
});
```
The same applies when loading a transpiled module into `amd`, `cjs` and `yui`. It seems that no matter what we do at the loader level, `System.import`'s callback is expecting a very simple API where `default` is, in fact, a member that you access intentionally in the callback.

The question is: does this mean that default exports are mostly for `module`-to-`module` interoperability, while named exports are more suitable for app-level code? It seems to me that that is the case, which I'm totally fine with. Which means you will do something like this:
``` js
System.import(['jQuery'], function ($) {
  $.ajax(); // which is nice and clean, but what about default exports?
});
```
which is going to be equivalent to using the import syntax for a named export at the module level:

``` js
import { ajax } from 'jQuery';
```

So far so good; it looks nice and clean, but that's only if you DO NOT use the default export for the `jQuery` module.
This is why we were playing around with the shim mechanism described in https://github.com/square/es6-module-transpiler/issues/85#issuecomment-30072115, to try to get to a middle ground where:

``` js
System.import(['foo'], function (foo) {
  // `foo()` to execute the default export
  // and `foo.something` references a regular named export
});
```

Is this ideal? Is this doable? I don't know :).
``` js
System.import(['foo'], function (foo) {
  // `foo()` to execute the default export
  // and `foo.something` references a regular named export
});
```
This is exactly the kind of code I am worried about. ES6 does not work this way. You need to do `foo.default()` in ES6 to access the default export. `foo` is a module instance object, always: i.e. a null-prototype, frozen object with a bunch of getters which live-get the binding value. It is not the default export, and is nothing like the AMD/Node semantics you (and everybody else) are wishing for.
@domenic alright, fair enough. I'm sold on solution 1 from @joliss then. If `.default()` is here to stay, I'm good with that. I guess the best way to educate people on this is to avoid the sugar for default when talking about ES6 modules. E.g.:
``` js
import { default as something } from 'foo';
```

instead of

``` js
import something from 'foo';
```

because it is easier to translate that (visually) to other systems:

``` js
var something = require('foo')['default'];
```

or

``` js
define(['foo'], function (foo) {
  var something = foo['default'];
});
```

and even when it comes to loading it at the app code level:

``` js
System.import(['foo'], function (foo) {
  var something = foo['default'];
});
```
I can't figure out how the `__es6ModuleExports` solution would support cycles for named imports.
Say you have module `a.js` reading

``` js
import { bar } from 'b'
export default function() { console.log('In a. bar = ' + bar) }
export let foo = 'foo'
```

and package `b.js` reading

``` js
import { foo } from 'a'
export default function() { console.log('In b. foo = ' + foo) }
export let bar = 'bar'
```

My understanding of the proposed solution is that the transpiled `a.js` would look something like:

``` js
var _b_module = require('b')
module.exports = function() { console.log('In a. bar = ' + _b_module.__es6ModuleExports.bar) }
module.exports.__es6ModuleExports = { 'default': module.exports } // perhaps
module.exports.__es6ModuleExports.foo = 'foo'
```

(`b.js` analogously.)

Because the `module.exports` object needs to be replaced with a new function object in line 2, the circular `require` won't work.
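The breakage can be demonstrated without Node's loader at all; the cyclic peer simply ends up holding a stale reference once `module.exports` is reassigned (simulated here with plain variables):

``` js
// 'b' captures a's exports object partway through the cycle...
var aExports = { __es6ModuleExports: { foo: 'foo' } };
var bsViewOfA = aExports;

// ...then a.js assigns a brand-new function to module.exports:
aExports = function () {};

// b is left holding the stale original object, so the cycle breaks.
```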
I have the same question, @joliss. If we have to deal with `default` anyways, how will `__es6ModuleExports` help, aside from the mere fact that it flags the module as an ES module so the loader can behave slightly differently?
@joliss in your final example, can you show me how you would expect that to work with existing CommonJS modules?
@joliss or is your question about the fact that the claim is that we could make the full set of circularity features work when compiling from ES6 modules, while maintaining compatibility with existing node practice?
@joliss here is a gist of how I imagine it would work: https://gist.github.com/wycats/7983305
@wycats, I think your gist to make cycles work in `a.js` and `b.js` looks good. (Yes, I only care about circularity within transpiled ES6.)
So to summarize for everyone, the idea would be: When ES6 modules are transpiled to CommonJS (and presumably AMD), their module object is the `default` export:

``` js
// Get ES6 `default` export from transpiled 'metamorph' module
var Metamorph = require('metamorph')
```

ES6 named exports are accessible from other transpiled ES6 modules (implemented via a hidden `__es6_modules__` property tacked onto the default export object), but are inaccessible from untranspiled Node and AMD code.
This would mean that ES6 modules that need to work on Node will generally have to expose their named exports on the `default` object:

``` js
export { foo, bar }
// foo and bar are part of this package's external API, so we add an
// artificial default export to make them accessible from Node:
export default { foo: foo, bar: bar }
```
Note: Therefore in practice, any code that needs to work on Node cannot have named exports and a separate `default` export. That's OK, because Node simply doesn't have the necessary module semantics.
Let's go the other way and ES6ify an existing Node module. Say you have a module with this interface:

``` js
exports.foo = 'foo'
exports.bar = 'bar'
```

If you wanted to migrate the source to ES6, but continue to provide the transpiled output to Node, then this is equivalent ES6 code:

``` js
export default { foo: 'foo', bar: 'bar' }
```

Presumably, you would also add named exports as a new API into your module:

``` js
export default { foo: 'foo', bar: 'bar' }
// As a courtesy to our fellow ES6 users, we add two named exports to our API:
export var foo = 'foo'
export var bar = 'bar'
```
I want to argue one more thing: To ES6 code, non-ES6 Node modules should look like they have one `default` export and no named exports.

First, observe that clearly this should work:

``` js
import mkdirp from 'mkdirp' // mkdirp is a regular Node module
```

as it's semantically correct. (I have seen people suggest `module mkdirp from 'mkdirp'; mkdirp(...) // yuck`, but as @domenic has pointed out, this breaks if `mkdirp` ever becomes an ES6 module, because there is no way to expose this kind of interface from ES6.)
Now it's tempting to try and make this work:

``` js
module fs from 'fs'
fs.stat(...)
```

But I want to argue that the `'fs'` module should only expose its exports on the `default` object, so you should write this instead:

``` js
import fs from 'fs' // yes, really
fs.stat(...)
```
Here's why:
First, to make `module fs from 'fs'` work, we'd have to guard against `fs` being a function, so we'd need to copy stuff around. I think we should avoid this. That's because `fs` is a foreign object not under the transpiler's control, so I'd anticipate there will be weird edge cases where the behavior is slightly wrong; mucking with foreign objects is fundamentally asking for trouble. (This is unlike tacking on the `.__es6_module__` property, which is acceptable mucking because it only gets tacked onto objects under the transpiler's control, not foreign objects. So if I use the transpiler, I know to expect this, and I can accommodate it in my own source.)
Second, if `module fs from 'fs'; fs.stat(...)` worked, it would mean we are now promising both of these to work in ES6 land:

``` js
import fs from 'fs' // works because 'fs' has a default export
fs.stat(...)

// *and*

import { stat } from 'fs'
stat(...)
```
So if you ever were to turn `fs` into an ES6 module and supply transpiled source, you'd have to have a `default` export with a `stat` property, and a named `stat` export.

That might be an acceptable promise for `stat`, but we'd be making this promise even for properties that happen to exist on the `default` export but don't necessarily make sense as named exports:
``` js
var someObject = new Foo
someObject.someProperty = 'I am internal to the Foo instance'
module.exports = someObject
```

If I ES6ify this, I technically have to do `export var someProperty = someObject.someProperty` as a named export, because some fool might be doing `import { someProperty } from 'module'` in their library, and I don't want to break people's code. That's not good.
So in summary, for a regular non-ES6 Node module `'fs'`, `module fs from 'fs'` should simply amount to `var fs = { 'default': require('fs') }`, and `import { stat } from 'fs'` should amount to `var stat = undefined`. Given the tradeoffs, I think this is an acceptable limitation.
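A small simulation of that rule, with a stub object in place of `require('fs')`:

``` js
// Stub standing in for require('fs'):
var _rawFs = { stat: function () { return 'stats'; } };

// `module fs from 'fs'` would amount to wrapping it as the module object:
var fsModule = { 'default': _rawFs };

// `import fs from 'fs'` reads the default export:
var fs = fsModule['default'];

// `import { stat } from 'fs'` finds no named export:
var stat = fsModule.stat; // undefined
```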
Finally, I believe all of the above should work the same way in AMD land.
@joliss thanks for starting a discussion around this - it is important stuff to work out.
I like the ideas, I'm just trying to work this out. Apologies in advance if I'm missing something, but let me know if this sounds about right with what you are suggesting:
ES6 Module:

``` js
import { q } from './some-dep';
export var p = 'hello';
export default 'test';
```

CommonJS Transpiled Version:

``` js
var q = require('./some-dep').__es6_module.q;
module.exports = 'test';
exports.__es6_module = {
  p: 'hello',
  default: module.exports
}
```
By adding this `__es6_module` metadata, we've now changed the underlying default export, surely? Or does the `require()` statement resolving to the default import only happen in NodeJS?
@guybedford Yes, that's the plan; we'd be using `defineProperty` to make the `__es6_module` property "invisible" to enumeration, so that it doesn't interfere. Note that this doesn't work on strings (`export default 'test'` in your example) -- I'll leave a separate comment about that -- but it works if your default export is a function or object.
Here is an example illustrating how `defineProperty` works:

``` js
var defaultExport = function () {}
defaultExport.regularProperty = 'foo'

// Like `defaultExport._es6Module = { namedExport: 'test' }`, but not enumerable:
Object.defineProperty(defaultExport, '_es6Module', {
  enumerable: false,
  value: { namedExport: 'test' }
})

// _es6Module can be read like a regular property ...
console.log(defaultExport.regularProperty) // => foo
console.log(defaultExport._es6Module) // => { namedExport: 'test' }

// ... but is invisible to enumeration
console.log(Object.keys(defaultExport).indexOf('regularProperty')) // => 0
console.log(Object.keys(defaultExport).indexOf('_es6Module')) // => -1
for (var key in defaultExport) if (key === '_es6Module') throw 'never happens'

// As an aside, note that hasOwnProperty is still true
console.log(defaultExport.hasOwnProperty('_es6Module')) // => true
```
When the `default` export has a primitive type (string, number, boolean, null, undefined), the `_es6Module` trick does not work, because `someString._es6Module = {}` silently fails. This becomes a problem when we want to have both a primitive `default` export and named exports. I see several solutions for this edge case:

1. Don't support this combination: silently fail to attach the `_es6Module` property.[1]
2. Check `typeof` the `default` export at run time. If it is primitive, and there are named exports, throw an error. This is similar to (1) in that primitive-default-plus-named-exports would in effect be unsupported by the transpiler, but it fails noisily instead of silently. It will require a bunch of extra code to do this.
3. Check `typeof` the default export at run time. If it is primitive, and there are named exports, make the default export invisible to untranspiled modules. In such cases the module would only export `{ _es6Module: { 'default': 'some primitive', namedExport: ... } }`. It seems to me that this is the most desirable behavior. It preserves functionality in ES6 land. You'll only have to know to avoid this when you are designing an interface to be consumed by non-ES6 code (e.g. the public interface of a Node module). But named ES6 exports are rare in public interfaces anyway. The disadvantage of this solution is that it adds complexity. I worry that there will be hidden problems that are only found very late, because this code path is used so rarely.

I'm uncertain. I was going to suggest (3), but this scenario is so edge-casey that perhaps we can go with (1) for now, and document it as a to-do.

[1] Technical side-note: The named exports would be briefly visible to cyclic imports, and then disappear once the primitive `default` export is exported.
Just had a bit of an "oops" moment. The `_es6Module` trick actually breaks here:

``` js
import defaultExport from 'otherModule'
export default defaultExport // this will blow away otherModule's _es6Module property
```
Yehuda seems to agree, and there's no immediately-obvious workaround. We'll do some brainstorming and report back here.
@joliss thanks for the examples, hadn't considered all these scenarios. Have been putting some thought to this too. How about having a meta export in the transpiled version that indicates when NOT to set a CommonJS or AMD export as the default export only?
``` js
import { q } from './some-dep';
export var p = 'first';
export default {};
```

transpiles to:

``` js
exports.__transpiled = true;
exports.p = 'first';
exports.default = {};
```
Then in the loader implementation, we would read the CommonJS module normally to represent a module like:

``` js
new Module({ default: module.exports });
```

But when a module exports with a `__transpiled` property set, we know to make the module object from the exports directly:

``` js
if (module.exports.__transpiled)
  new Module(module.exports);
```
This way, the ES6 module is the same in ES6 and transpiled CommonJS.
Sorry - my last comment was very much seeing this from an ES6 loader perspective, which is off topic. My apologies.
From the NodeJS side, another idea is to have a metadata property that implies that when loaded, one should use the `default` property only. For example, the NodeJS require could implement something like:

``` js
// internal NodeJS require function, wrapping the original
var originalRequire = require;
var require = function(name) {
  // normal require
  var module = originalRequire(name);
  // check for a metadata flag implying "useDefault"
  if (module.__useDefault)
    return module.default;
  else
    return module;
}
```
This way, module authors would have control over how their modules are treated in Node, simply with:

``` js
export var p = 'export';
export var __useDefault = true;
export default function() {
}
```

Transpilation stays the same in this scenario. Hopefully that gives some food for thought at least, and doesn't muddy these waters too much.
@guybedford Unfortunately this does not work, because we cannot change Node's `require` function. We obviously cannot change existing Node installations, but I also doubt that Isaac would even accept a patch to such a critical piece of Node's module infrastructure.
OK, new proposal: Support named and `default` exports on Node (and AMD), but not both at the same time. This allows for all of the ES6 module functionality between transpiled ES6 modules, and allows for a reasonable subset to be consumed by untranspiled Node modules.
To transpile imports, as before we check for an `._es6Module` property. For instance, `import { foo } from 'mod'; console.log(foo)` transpiles to:
``` js
var _rawModule = require('mod')
var _mod
if (_rawModule != null && _rawModule._es6Module != null) {
_mod = _rawModule._es6Module
} else {
_mod = { 'default': _rawModule }
}
// All uses of `foo` are replaced with `_mod.foo`
console.log(_mod.foo)
```
But to transpile exports, we check what kind of exports we have in the module:
(1) If only `default`:
`export default foo` simply transpiles to `module.exports = foo`. The `default` export will be available to both untranspiled consumers and ES6 consumers.
(On a very technical note, cycles between modules with `default` exports, which are supported in ES6, can be made to work: We attach `var _tmp = {}; module.exports = { _es6Module: _tmp }` at the very top, and when `export default foo` comes, we emit `module.exports = _tmp.default = foo`.)
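Simulated with plain objects instead of a real `require`, the trick from that note would behave like this:

``` js
// What the module hands out at the very top, before any exports run:
var _tmp = {};
var moduleExports = { _es6Module: _tmp };

// A cyclic consumer captures the _es6Module object early:
var capturedByPeer = moduleExports._es6Module;

// Later, when `export default foo` executes:
function foo() { return 'foo'; }
moduleExports = _tmp['default'] = foo;

// The peer can still reach the default export through its captured object.
```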
(2) If there are named exports (with or without `default`):
`export { foo, bar }; export default defaultObject` transpiles to
``` js
module.exports._es6Module = {}
module.exports.foo = module.exports._es6Module.foo = foo
module.exports.bar = module.exports._es6Module.bar = bar
module.exports.default = module.exports._es6Module.default = defaultObject
```
This means that untranspiled consumers see a plain old JavaScript object with `foo`, `bar`, and `default` properties, but ES6 consumers see named `foo` and `bar` exports and a `default` export.
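Simulated with a plain object in place of `module.exports`, the two consumer views under case (2) would look like this:

``` js
// What case (2) would leave on module.exports (simulated as a plain object):
function foo() {}
function bar() {}
var defaultObject = { hi: true };

var exported = {};
exported._es6Module = {};
exported.foo = exported._es6Module.foo = foo;
exported.bar = exported._es6Module.bar = bar;
exported['default'] = exported._es6Module['default'] = defaultObject;

// Untranspiled consumer sees plain properties:
var plainDefault = exported['default'];
// Transpiled ES6 consumer goes through the namespace copy:
var ns = exported._es6Module;
```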
Having a `default` property is clearly somewhat cumbersome on Node, but I suspect that in many cases such modules are internal to ES6 packages, so we are primarily concerned about having them work correctly between ES6 modules. When you are defining an interface that is to be consumable in Node land, you'd probably limit yourself to only default or only named exports. For the rare exception to this, we have 2b:
(2b) There are cases where we need to preserve a pre-existing Node/AMD interface with a default export _and_ named exports. For instance, if we wanted to ES6ify [mkdirp](https://github.com/substack/node-mkdirp), which has `mkdirp` and `mkdirp.sync`, on ES6 we'd want to map those into a `default` export and a named `sync` export. But for untranspiled consumers, we'd want to have the `mkdirp` function as the default export (not behind a `default` property), and the `mkdirp.sync` function as a property on it. It's probably easiest to tell people to write a Node wrapper in these cases. The Node wrapper just needs to preserve the `._es6Module` property of the original module; in other words, it would tack it on like so:
``` js
var transpiledMkdirp = require('./lib/mkdirp')
// Construct `mkdirp` for Node
var mkdirp = transpiledMkdirp.default
mkdirp.sync = transpiledMkdirp.sync
// Preserve ES6 semantics for transpiled ES6 consumers!
mkdirp._es6Module = transpiledMkdirp._es6Module
// Export on Node
module.exports = mkdirp
```
---
This proposal has the disadvantage that you have to be careful about what exports you add on existing public interfaces of your packages. For instance, if you accidentally added a named export to a public-facing file like [router.js](https://github.com/tildeio/router.js/blob/0a521d53e37253e54015306d83a45cd2489f9210/lib/router.js#L175), the default export would disappear behind a `.default` property on Node. That seems acceptable to me, since often there is only one such file per package.
---
Finally, a subtle note on (2): While named ES6 exports map to plain old JavaScript objects for untranspiled consumers, the same does not symmetrically hold when we import from untranspiled modules. For example, we would still write:
``` js
import fs from 'fs' // sic - not "module"
fs.link(...)
// This does not work:
import { link } from 'fs' // undefined or error
```
This is because trying to guess named exports from the `'fs'` module object seems like it's asking for a lot of trouble (see [my argument above](https://github.com/square/es6-module-transpiler/issues/85#issuecomment-30745525)).
This has the unfortunate effect that if we were to ES6ify the `fs` module in the future, and supply a transpiled version to Node, then writing `fs.js` as
``` js
export { stat, link, ... }
```
will work for existing untranspiled consumers, because those continue to see a plain old JavaScript object, but it will break for existing ES6 consumers, because we have now replaced the `default` export with a bunch of named exports. If we need to keep compatibility with existing ES6 consumers, we need to add a `default` export object with all the named exports as properties, and then also a Node wrapper because we have named and `default` exports.
Since ES6ifying existing modules is only a sub-use-case, and ES6 is also not widely used yet, I think that this is acceptable, but it's still a bit of a wart.
Not being able to consume `import { link } from 'fs'` does not seem very subtle to me. Everybody gets super jazzed about the "destructuring"; I can see having to explain the difference between an ES6 named export vs. the property of a default export getting tiresome, but maybe it's necessary.
Responding to @guybedford's issue #86 (see there for context):
I believe that the most recent proposal (two comments above) covers this use case adequately. `_es6Module` takes the role of `__module`, and instead of being a boolean, it contains a copy of all the ES6 exports (which allows you to do some fancy tricks when you need the Node/AMD and ES6 interface to diverge). Please take a look at the proposal and see if this covers it.

The only disadvantage compared to a boolean `__module` flag is that providing an ES6 interface from an untranspiled Node/AMD module is a bit more cumbersome:

``` js
exports.someExport = 'hello world';
//exports.__module = true
exports._es6Module = { someExport: exports.someExport }
```

Do you think this use case is common at all? My sense is that it isn't: When you have a module written in Node or AMD, and you want to provide an explicit interface to other ES6 modules, I think you'd generally turn it into actual ES6 and continue to provide transpiled Node and AMD files. Compared to tacking on `__module = true`, this also enables you to benefit from static import parsing. But perhaps I'm missing use cases here.
FYI all, I've simplified my latest proposal slightly to handle named exports with and without a `default` export the same way (both are case 2 now).
I'm not sure it is adequate to transpile ES6 modules back into CommonJS, just to get ES6 module support in NodeJS. And surely that is what this proposal is advocating?
Personally, if I were writing a NodeJS app using ES6 module syntax, it would feel very odd to run a build step just to load standard JavaScript modules.
It is very worrying that `require` is completely locked down for these changes, and one does get the feeling that Node will not easily move into this.
Is that really the best we can hope to achieve?
@guybedford I hope that once modules land in V8, that Node will build its module system as a loader on top of ES6 modules. In the meantime, transpilation works and will get us ES6 modules in node on a shorter timeframe for people who want to build JavaScript libraries using ES6 syntax that work both in the browser and node.
@wycats I just worry that we'll end up in a position where ES6 modules become a much harder thing than they need to be.
ES6-Module-Loader completely supports loading ES6 in NodeJS, with the syntax:
index.js:
var System = require('es6-module-loader').System;
System.import('./local-es6').then(function(module) {
module.namedExport();
});
local-es6.js:
import { dep } from './my/local/module';
export function namedExport() {}
Building addon functionality for this, it should be possible (and not that hard) to support node_modules
recursive handling etc. for full NodeJS support of existing CommonJS modules (there's a CommonJS layer in progress here already - https://github.com/guybedford/systemjs/blob/master/lib/system-format-cjs.js).
It would just be nice to work towards a workflow that really does match what we can expect with ES6 modules.
@joliss yes this does work with the __transpiled
flag ideas completely.
My only comment about the proposal would be if it is worth not having es6
in the name for ES7 support potentially.
I completely see the use case for this, have just been considering some other perspectives to see how it fits in to things.
When working with exports, one could use a module loading system like this if assuming promises:
module-written-in-es6/index.js:
var System = require('es6-module-loader').System;
exports.someMethod = function(args) {
return System.import('./some-method').then(function(module) {
return module(args);
});
}
some-method.js:
import './es6';
export default function() {
// ...
}
Then from outside the module:
require('module-written-in-es6').someMethod(args).then(function(result) {
});
It's one way of getting around the sync / async barrier between ES6 and Node. Working this stuff out too as I go!
Personally, if I were writing a NodeJS app using ES6 module syntax, it would feel very odd to run a build step just to load standard JavaScript modules.
The scenario I've been mostly concerned about is writing a library in ES6, and then publishing it in ES6 on bower, but also transpiling it to CJS and publishing it on npm. If you are writing a Node app, I think you'd generally use Node's CJS module system - having an extra build step just to run your app does seem rather cumbersome. Is this a problem?
If you really wanted to write a Node app in ES6, I guess you could use the transpiler (and it should work fine), but perhaps you'd just use node --harmony_modules
to enable native support? Or you might use the ES6-Module-Loader. In other words, if you don't want the extra build step, then the es6-module-transpiler is out of the picture anyway.
Re publishing libraries on npm, there seem to be two separate issues:
(1) Should non-ES6 Node modules use some special syntax to load ES6 modules? I believe you're suggesting something like this:
require('rsvp').someMethod(args).then(function(result) {
});
Unfortunately, I don't believe that this is really feasible for libraries. When we ES6ify existing Node packages, we'll generally want to keep compatibility for their interfaces. But even for new packages where we don't have to deal with compatibility, I'd worry that a "special" require syntax like the one above is too much of a hurdle. If I write a library that gets published on Node, I want it to have a normal-looking interface, and not be a second-class citizen. For most library authors, I'd guess that in order to adopt ES6, being able to generate normal-looking Node modules is a strict requirement.
(2) Should we publish transpiled modules, or publish real ES6 modules on npm? You seem to be suggesting that we ship ES6 modules, and that seems reasonable intuitively. However, for CoffeeScript at least, the prevailing wisdom in npm land seems to be to publish pre-compiled code. It seems prudent to follow this, unless we have a good reason to do otherwise.
I just worry that we'll end up in a position where ES6 modules become a much harder thing than they need to be.
Are you referring to keeping an easy adoption path to native ES6? I agree that's important. It seems that if/when native ES6 modules are supported in Node, native ES6 modules should be able to load transpiled ES6 modules just fine. Do you think that there might be problems with that?
Yes I agree - when writing an app, a dynamic System
loading approach can be used, or we wait till NodeJS adoption of ES6 modules, or we just use standard CommonJS requires. Having a build step within a NodeJS app was my worry in the previous comment, sorry it seems I misunderstood the primary use case.
Having a module return all methods as promises to allow ES6 loading is also then probably more suitable for application code than a library anyway.
So we kind of need to publish transpiled modules into npm for some time to come, until NodeJS has native support for modules. Polyfilling any other way is not possible due to the asynchronous nature of modules.
Update - sorry I missed out npm post-compilation techniques here. Are these being discussed?
As for interop, it seems to make sense to have two main transpile outputs - AMD and CommonJS. AMD would be for interop typically in the browser, and CommonJS in NodeJS.
The scenario I need to handle is the interop for AMD in the browser, and I'm currently going with a __module = true
flag on the transpiled module to ensure compatibility (as discussed in https://github.com/square/es6-module-transpiler/issues/86). I just don't want to complicate this use case unnecessarily, and would rather get something working sooner.
I'd suggest this proposal can continue to cover the CommonJS transpilation route as necessary.
If there are any issues with the concept of a __module
flag for AMD though, please do let me know.
Here's another problem: Cycles involving re-exports do not work.
Remember, the idea for making imports live bindings is that we re-write
import a from 'A';
function fn() { a };
into
var aModule = ...;
function fn() { aModule.a };
But this still breaks when we re-export imported variables:
import a from 'A';
export { a };
According to the current proposal this transpiles to:
var aModule = ...;
exports._es6Module.a = exports.a = aModule.a;
Note that we are binding to the concrete value of aModule.a
too early. (At least that's what I glean from skimming the spec, 15.2.5.7 ResolveExportEntries ff.)
How should we fix this?
Trying to rewrite identifiers across module boundaries seems like it would introduce a host of other issues.
It seems that we might be able to define an accessor for a
with defineProperty
, but (a) the solution excludes IE8, which will take another 12-18 months to die -- this might be OK -- and (b) we should test how it will affect performance.
Thoughts? Other ideas?
I had a chat with Yehuda, here's the summary:
Re-exports can obviously change, but relatedly, note that normal exports can change too:
var a = 1
export { a }
a = 2 // changes the "a" export
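A minimal self-contained sketch of why this matters for a naive transpilation ("cjsExports" stands in for the module's exports object):

```javascript
// Naive transpilation of: var a = 1; export { a }; a = 2
var cjsExports = {};
var a = 1;
cjsExports.a = a; // snapshot: copies the current value of "a"
a = 2;            // under ES6 semantics the "a" export is now 2...
console.log(cjsExports.a); // → 1  (the CJS consumer still sees the snapshot)
```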
(1) For default exports, it seems we cannot allow them to change, which means we disallow re-exports, and we disallow assignment down scope.
var a
a = 1 // ok
function fn() { a = 2 } // transpile error - we cannot support this
export default a
a = 2 // transpile error - we cannot support this
(2) For named exports, it seems we can allow them to change by defining an accessor which lazily retrieves the export. However, this breaks IE 8 compatibility, so it's a no-go. (We could in principle rewrite named exports similar to imports and avoid defineProperty
, but this seems tricky enough that presumably nobody will have the time to implement it.) So the easiest solution seems to be disallowing assignment down scope, like in (1).
(3) For default re-exports, we cannot support subsequent modification in Node, but we also cannot stop this from happening, so we should disallow default re-exports altogether.
(4) For named re-exports, we can define an accessor with defineProperty
, so export { a } from 'A'
becomes
var _aModule = ...;
Object.defineProperty(exports, 'a', { get: function() { return _aModule.a; }, enumerable: true })
Object.defineProperty(exports._es6Module, 'a', { get: function() { return _aModule.a; }, enumerable: true })
This breaks IE8 compatibility, but that seems OK in this case: If you need IE8-compatible output, you simply cannot use re-exports.
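To make the liveness concrete, here is a self-contained sketch of the accessor approach ("aModule" stands in for the required dependency's exports object; "reExports" for the re-exporting module's exports):

```javascript
// A defineProperty getter keeps the re-export live.
var aModule = { a: 1 };
var reExports = {};
Object.defineProperty(reExports, 'a', {
  get: function () { return aModule.a; },
  enumerable: true
});
console.log(reExports.a); // → 1
aModule.a = 2;            // the dependency reassigns its export later
console.log(reExports.a); // → 2  (the binding stays live)
```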
One more problem:
The whole reason why we're doing backflips with ._es6Module
is: When two ES6 modules are transpiled into separate Node packages, they need to continue working, by communicating through Node's require
.
But this only works if they are top-level modules in a package. For instance import ... from 'rsvp'
maps into require('rsvp')
. (This works since the module name == the package name == 'rsvp').
But import ... from 'rsvp/all_settled'
would map into, um, require('rsvp/dist/commonjs/rsvp/all_settled.js')
. That kind of require
call seems pretty much unacceptable.
So what do we do? Some ideas:
(1) We could try to somehow expose all the modules ('rsvp/all_settled'
, etc.) on the top-level 'rsvp'
Node module object, via a trick similar to ._es6Module
. This breaks down, however, if the top-level module has a single default export.
(2) When creating the transpiled files for Node, we could create a directory tree:
rsvp.js
└── dist
└── node
├── package.json (copied)
├── index.js (transpiled from lib/rsvp.js)
├── all_settled.js (transpiled from lib/rsvp/all_settled.js)
└── ...
The package is then published with cd dist/node && npm publish
.
This allows us to map import ... from 'rsvp/all_settled'
into require('rsvp/all_settled')
. However, it still requires that the package name matches the module name. One instance where this could break down is a package like ember-metal
, which might expose modules like 'ember/metal'
. Perhaps we would have to inform the transpiler to map import ... from 'ember/metal/observer'
into require('ember-metal/observer')
. It's clearly not optimal.
(3) Perhaps we have to resort to a global ES6 module registry after all. Then if a module depends on 'ember/metal/observer'
and we want to publish it on Node, we'd manually add a wrapper like so:
// Wrapper:
require('ember-metal') // for side effect: registers ember/metal/* modules
// in transpiled code:
... global.ES6_MODULE_REGISTRY['ember/metal/observer'] ...
It seems rather icky, though I haven't thought it through.
(4) Edit: Only allow separate packages to communicate via their top-level modules, so you cannot expect import ... from 'rsvp/all_settled'
to work. This would be sad, and has the same issues as (2) with having to match up package names and module names.
Thoughts? More ideas?
@stefanpenner I know. ;-) The issue I'm raising is, how do you transpile import ... from 'rsvp/all_settled'
(in another library, outside the RSVP code base).
Relatedly, we should talk about whether and how we're planning to make interop between modules transpiled to globals work. Say an ES6 ember imports an ES6 handlebars. Then when both are transpiled into globals, like window.Ember
and window.Handlebars
, they probably need to continue working with each other and preserve ES6 semantics somehow. This is pretty much unsolved as far as I see (we're doing it in an ad-hoc manner for each library), but it seems like a prerequisite to ES6ifying a library like Ember and turning its dependencies (Backburner, etc.) into genuinely separate packages, not just vendored modules.
Or maybe "we can't do it" is an acceptable answer -- but we should make sure beforehand that we understand the implications this will have once we get an increasing number of libraries that need to interoperate.
I'm throwing this suggestion for another compilation approach out there - I may be wrong, but I thought it might be worth a shot!
After playing around with interop quite a bit now for AMD, specifically using ES6 in an ES6 loader (see https://gist.github.com/guybedford/5622c9ed5c9ad4bc0417), I thought perhaps this same method can work for CommonJS here.
Basically the compilation output would be the following transformation:
// we can only ever import existing NodeJS modules as "default" style imports
import fs from 'fs';
// we can import our ES6 modules as we like
import { p } from 'q';
// we import our defaults as we like
import $ from 'jquery';
fs.readFile('x');
export var p = 4;
Would compile to:
var __fs = require('fs');
var __q = require('q');
var __jquery = require('jquery');
// normalize everything to the same shape:
// a plain Node module like fs becomes { default: fs }
if (!__fs || !__fs.__transpiledModule) __fs = { default: __fs };
// transpiled ES6 modules (q, jquery) already have the right shape,
// so these checks leave them unchanged
if (!__q || !__q.__transpiledModule) __q = { default: __q };
if (!__jquery || !__jquery.__transpiledModule) __jquery = { default: __jquery };
// now we extract our named exports
var fs = __fs.default;
var p = __q.p;
var $ = __jquery.default;
// now our code works as required
fs.readFile('x');
exports.p = 4;
// this itself is ES6, so do nice stuff
exports.__transpiledModule = true;
The key thing is the __transpiledModule
property, which lets us know when we can assume named exports. Other modules are then wrapped into the form { default: ... }
so we can pretend everything is ES6ish already.
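The normalization step above can be sketched as a small helper (self-contained; the objects stand in for require() results):

```javascript
// Wrap anything that isn't a transpiled ES6 module as { default: ... }.
function normalize(m) {
  if (!m || !m.__transpiledModule) m = { default: m };
  return m;
}

var nodeStyle = function readFile() {};            // plain CJS export
var es6Style = { p: 4, __transpiledModule: true }; // transpiled ES6 module

console.log(normalize(nodeStyle).default === nodeStyle); // → true
console.log(normalize(es6Style).p);                      // → 4
```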
Feedback welcome, I may well have missed something!
This is not a bug in the transpiler, but I thought this repo might be a good place to have this discussion:
I want ES6 modules to succeed, because they offer some clear technical advantages. But if we write our modules in ES6, we'll generally want to also transpile them to Node's CommonJS to publish them on npm, so good CJS interoperability will be important for ES6 adoption.
But there's an interop problem: Say you have module foo, and it has a single export (export default Foo, to be used like import Foo from 'foo'). Right now, this transpiles to exports["default"] = Foo, to be used like var Foo = require('foo').default. This extra .default is clearly suboptimal, in that it breaks the convention of the existing Node ecosystem. I worry that having .default will be unappealing and make our transpiled modules look like "second-class Node citizens".
"Oh," you say, "but we can simply wrap our transpiled modules in a bit of helper code to get rid of the .default." (See the handlebars wrapper for Node for an example.) Sadly, I believe this actually makes things worse: Say another package "bar", written in ES6 as well, has import Foo from 'foo'. When bar is transpiled to CJS, it will say (approximately) var Foo = require("foo").default. The transpiler cannot know that "foo" is specially wrapped on Node and doesn't need .default, so now we need to manually remove the .default in bar's CJS output. (Am I missing something here?)
I've also heard people suggest that Node could simply adopt ES6, so that these troubles would be irrelevant. But switching Node to ES6 is probably not a matter of just enabling the syntax in V8. Rather, the big hurdle is interoperability with the existing ecosystem. (Also keep in mind that Node doesn't have a pressing need to switch to ES6.) So if Node is to adopt ES6 modules at all, figuring out a good interop story is probably a prerequisite.
So here are some possible solutions, as I see them:
(1) Live with it: .default will be all over the place on Node.
(2) Manually wrap transpiled modules to get rid of the .default on CJS. As I point out above, this might get troublesome once we have an ecosystem of packages written in ES6 that need to also play together on Node.
(3) Change the transpiler's CJS output to not use .default. (Or add a "Node" mode in addition to CJS that omits .default.) So export default Foo would transpile to module.exports = Foo, and import Foo from 'foo' and module Foo from 'foo' would both transpile to var Foo = require('foo'). (If, transpiling to CJS, a module has both default and named exports, we might throw an error, or require some kind of directive for the transpiler, [update:] or tack the named exports onto the default object, see @caridy's comment below.) This change would acknowledge that .default is really something you never want on Node. It falls short when modules have both default and named exports. (Does this happen much at all?) I believe it also makes circular default imports impossible to support. This is fairly easy to work around though - intra-package cycles can use named imports, and inter-package cycles are very rare.
(4) Use default as the root-level object, and tack a property like ._es6_module_exports onto it for named exports. See @stefanpenner's comment below.
(5) Use default as the root-level object only if there are no named exports. Use an ._es6Module property to preserve ES6 semantics. See my comment way below.
What do you think about those? Any other ideas?