mathiasbynens opened this issue 10 years ago
My thought was that you author regexpu in ES6 format, with a prepublish step to convert it to CommonJS and/or AMD. Then when people do `npm install`, they get:

- `dist/regexpu.amd.js`
- `dist/regexpu.cjs.js`
- `lib/regexpu.js`

Traceur then consumes `lib/regexpu.js`, whereas other people consume their preferred format.

You could also do this the other way around (author in CommonJS, then transpile to AMD and ES6), but I do not know of any tools to do the conversion in that direction, whereas the ES6-to-other-formats direction is well established.
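To make the split concrete, here is a hedged sketch of what the authored ES6 entry point and the generated CommonJS build might look like; the `rewritePattern` name and its body are assumptions for illustration, not regexpu's actual code.

```javascript
// lib/regexpu.js — the authored ES6 source would export directly, e.g.:
//
//   export function rewritePattern(pattern, flags) { /* ... */ }
//
// dist/regexpu.cjs.js — roughly what a prepublish ES6->CJS step would emit.
// `cjsModule` stands in for the `module` object Node provides, so that this
// sketch is self-contained and runnable on its own.
var cjsModule = { exports: {} };

function rewritePattern(pattern, flags) {
  return pattern; // placeholder body, for illustration only
}
cjsModule.exports.rewritePattern = rewritePattern;

console.log(typeof cjsModule.exports.rewritePattern); // → "function"
```

Consumers of the CJS build then see a plain `require`-able object, while the ES6 source stays the single point of authorship.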
I agree with @domenic that ES6 -> CommonJS conversion is the solution we have been using, and it would unblock #1294. Being able to load node modules and emit our precompiled code would be great. Surely we have the technology; we just need to know the details for node modules. Some reader here surely does.
@johnjbarton an alternative approach, which would be more invasive to Traceur, would be to compile Traceur using (Traceur - 1)'s compile-to-CommonJS output, and then browserify the resulting CommonJS output. That is what I do in html-as-custom-elements, because it then allows me to use both `require` (for Node built-ins and CommonJS packages) and `import` (for ES6 files within my project, and ES6 packages like webidl-html-reflector). So with this approach Traceur could use code from any package that was authored in either CommonJS or ES6.
> We just need to know the details for node modules.
For CommonJS / Node modules it boils down to this:
```js
// To export an object with `foo` and `bar` properties:
exports.foo = foo;
exports.bar = bar;
// Or:
module.exports = {
  foo: foo,
  bar: bar
};

// To load an installed module’s exports:
var x = require('module');
var foo = x.foo;
var bar = x.bar;

// To load a local file’s exports:
var y = require('./path/to/file'); // extension optional

// If no exports are set but the file contains JSON data, Node loads the
// data and `JSON.parse`s it:
var object = require('./path/to/file.json'); // extension optional

// Note: regexpu doesn’t use this, but it might be useful to support it
// anyway in case other future CommonJS Traceur dependencies do.
```
That would need to be rewritten into the ES6 format as used by Traceur.
If such a transpiler can be added to the Traceur workflow, it means every npm package can easily be integrated. Requiring third-party modules to be authored in ES6 could work too, but it might be a hard sell (especially since their dependencies would then need to be authored in ES6 as well).
I think we need to:

1) Transcode functions named `require` into imports, converting the path according to the `require()` directory-traversal logic.
2) Apply the transcoding recursively at compile time.
3) Transcode each of the exports formats into our precompiled format.
4) Add a global `require()` function that throws, to cover rare cases where `require()` is aliased.
5) Throw an error if `require()` appears in a conditional or wherever `import` is not allowed.
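A minimal sketch of step 1, assuming a naive regex-based pass: a real transcoder would walk Traceur's parse tree instead, and the `resolveRequirePath` helper here is a made-up placeholder for the actual `require()` traversal logic.

```javascript
// Made-up stand-in for the require() resolution logic: bare module names are
// mapped into a hypothetical node_modules path, relative paths pass through.
function resolveRequirePath(name) {
  return name.charAt(0) === '.' ? name : './node_modules/' + name;
}

// Naive sketch: rewrite top-level `var x = require('m');` declarations into
// ES6 import statements. Regexes cannot handle aliasing or conditional
// requires, which is exactly why steps 4 and 5 above exist.
function transcodeRequires(source) {
  return source.replace(
    /var\s+(\w+)\s*=\s*require\(\s*['"]([^'"]+)['"]\s*\)\s*;/g,
    function (all, binding, name) {
      return "import " + binding + " from '" + resolveRequirePath(name) + "';";
    }
  );
}

console.log(transcodeRequires("var x = require('module');"));
// → import x from './node_modules/module';
```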
None of that seems necessary to me. Instead regexpu should just distribute the ES6 format, and then Traceur's code can `import` it directly.
That's a fine solution to #1294. This issue is about importing npm modules. Maybe we should force developers to recode their modules, but an import strategy would seem more convenient.
@domenic, you can't expect all modules to be written in ES6, and the ES6 loader is designed to load AMD and CommonJS modules for backwards-compat. Adding a Node-style resolution process into the Traceur ES6 loader should be relatively simple.
@guybedford

> you can't expect all modules to be written in ES6

right, for the modules written in ES5, you would use `require`.
@johnjbarton

> This issue is about importing npm modules.

in that case I would suggest using `require` for such modules. Then you need a tool that allows you to package files with `require` statements for use in the browser. Browserify is the best of these (as it handles the module resolution algorithm, works with the `__dirname`/`__filename`/`process` semi-globals, allows `require("url")` and other similar things packages commonly do, and has an ecosystem around it).
Consider things like running static traces or how to make code like that work in a browser - it's effectively making a new module format. The ES6 browser loader would struggle to support this as well.
So I don't really think it's a pattern we want to propagate for these reasons. But I may well be wrong, and it's important to discuss.
This would be my suggestion: check the `node_modules` folder, or step down a directory and check the `node_modules` folder there. There are some subtleties around the detection process.
In terms of builds, we have a way of building CommonJS and ES6 together with `System.registerDynamic` (see https://github.com/systemjs/systemjs/commit/5de7489c8872c59db32fc24f7f08553b90b1ce3d#diff-cee192c68839dc363c2617d94c76f65dR105).
Apologies if I'm imposing my views on this thread though, just tell me to shut up if appropriate.
(I thought I replied to this already... don't know where that message ended up.)
In Traceur we have a solution/hack for this already. Our source maps solution depends on Mozilla's source map code which uses universal module wrappers. Take a look at https://github.com/google/traceur-compiler/blob/master/src/outputgeneration/SourceMapIntegration.js-template.js and how we inline these files using the following build rule: https://github.com/google/traceur-compiler/blob/master/Makefile#L244-L245
Used the same hack here, for now: https://github.com/google/traceur-compiler/commit/27e9cf2600ed041ce1eb02b23790e4e6fa015163#diff-7
In an ideal world you would be able to mix ES6 modules and CJS modules when developing with node/npm. I raised this issue with SystemJS. In essence I've written some ES6 packages and I decided to try and see what it would be like to not build them as CJS but just import them from another ES6 package.
This is proving to be a bit painful, as I will need to configure the loader with the `node_modules` path not only for the ES6 packages but also for the CJS ones that worked just fine when used in the child package.

This does seem like a workflow that people will want, and it shouldn't require reams of configuration. All most devs would need is the normal `node_modules` resolution algorithm.
The tests in my child repo work just fine without configuring the SystemJS loader. So I import my ES6 modules, they require their CJS modules, and everything works perfectly. Once I try to consume this package within another ES6 package, everything gets complicated.
The workflow is almost there. It would allow transparent mixing of ES6 modules with CJS and no complex loader configuration.
Hacked a workaround in:

```js
module.paths.push('/home/brian/dev/global-compiler/node_modules');
```

inside the ES6 module that requires CJS modules. Of course that's not portable, and I can't seem to access information that would allow me to programmatically generate that file path.
`__moduleName` isn't accessible to me, and `process.cwd()` isn't enough; `__filename` and `__dirname` point at the ES6 Module Loader directory. The lack of programmatic APIs around CJS is a bit limiting. It shows the value of the ES6 module pipeline; even just access to `normalize` would be enough.
The other way to work around this issue, which I think I will use instead for this small project, is to install all the npm packages required in the child package in the parent package as well, so the `require` resolution will find them.
I’ll quote @johnjbarton (source):
> This blocks #1294.