Is it appropriate to discuss the ".jsm" extension proposal here? Or does it belong elsewhere? Or has that proposal been definitively rejected?
@jokeyrhyme yes, you can discuss various approaches such as `.jsm`, `package.json`, etc. here. `.jsm` is not rejected, and many still favor it.
I've tried to do a recap of the options for this at https://github.com/nodejs/node/wiki/ES6-Module-Detection-in-Node
Personally, I'm proceeding with the userland solution of editions. With the current implementation of editions, there are `import` and `require` syntaxes - https://github.com/bevry/editions/wiki/Syntaxes - that could be used for such detection; a `directory` property could also be added to each edition, if we go that route.
@bmeck thanks for splitting this up
@balupton lots of those are in the proposal. Please avoid putting things that have not been agreed upon in the node wiki, as it may confuse people. Feel free to paste all of that in here and we will discuss it. I disagree with a lot of the pro/con points and discussion should be had on them.
@bmeck the point of the wiki page was to summarise all points, not just those agreed upon, so as to avoid having to read the entire previous thread for such depth; it should be expanded rather than truncated. Discussion should happen here, yes, however I see no wrongdoing with maintaining a comprehensive, up-to-date, concise summary of all points.
I understand that there are a lot of objections to the idea of trying to inspect the source to detect the format, to the point that in the original thread people keep saying it's a dead issue, off the table. But yet it's still mentioned here, so as long as it's still in the mix I think the `export {}` idea I mentioned should be in the picture.
Something like a "use module"
pragma would impose an authoring tax on every file, even though most ES6 modules will already organically include at least one import|export
that unambiguously establishes its type. That'd leave probably just a tiny fraction of modules, like those that are only for side-effects, that require authors to opt-in with some explicit indicator of type. Instead of something invented like "use module"
that could just be module syntax: export {}
. @balupton Per your wiki, with this suggestion in mind I think the implementation side is the huge obstacle, not the authoring side.
@jmm @balupton updated the wiki, it was missing a fair amount of information. Noted things that caused rejections like the pragma "use module". Also linked to proposal to answer "still unclear" points. Please read the file in the proposal before making new statements.
warning: in 2 weeks the module proposal for node will be adopting the file extension approach unless the package.json complexity and workflow concerns are addressed. If those concerns are addressed we will happily delay the decision. I will be confirming with the @nodejs/ctc that this is not a problem in this time as well.
Which one? Or is that still to be determined?
@isiahmeadows https://github.com/nodejs/node/wiki/ES6-Module-Detection-in-Node#option-4-meta-in-packagejson
Notably:
As for the options, they differ in the various ways they put things in `package.json`. It should be noted that the file extension has a sizable impact as well. It is not with a happy heart that I choose between either.
Also, the `package.json` thing would be confined to node, and we have several people wanting a simple way to determine the goal for non-node projects (many of which do not have a package.json).
@bmeck "Is there any plans to support files that use both module systems (aka CJS+ES Modules)?" is specifically about a file that supports both, rather than a package that supports both, here is a contrived example:
import a from 'a'
const b = require('b')
export { b }
module.exports = a
Another phrasing would be: does each file have to be exclusively ES modules, or exclusively node modules, and never mix and match in a single file?
This arose once or twice in the other thread, but I couldn't find anything clear about it, hence why it is there; it doesn't seem to be raised in the proposal explicitly - perhaps it is implied somewhere and I didn't notice.
Currently, with Babel compiling, mixing and matching module systems in individual files is already being done.
great work with the wiki updates! :dancer:
@balupton that is not supported, @kittens tried that rabbit hole in babel, it is pure zalgo. If you are in the Module goal there is an `exports` object as specified in the proposal, but it is a ModuleNamespace. `require` continues to exist in the Module goal as well. Is there a reason you want a mixed goal in particular?
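To make that concrete, here is a rough sketch of a file in the Module goal as described above; this is an assumption about the proposal's eventual shape, not settled behaviour:

```js
// Hypothetical file parsed in the Module goal under the proposal: `require` is
// said to remain available, while exports behave as a ModuleNamespace rather
// than a freely mutable CJS `exports` object.
import path from 'path';

const legacy = require('./legacy-cjs'); // assumption: `require` stays in module scope

export const entry = path.join(legacy.dir, 'index.js');
```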
@bmeck
Is there a reason you want a mixed goal in particular?
Just raised it myself because I know of Babel-compiled projects that do such mixing and matching.
I've found myself having such quarrels too, especially with optionally required dependencies (a dep may only need to be required under special circumstances, e.g. for a specific function call which in the majority of runs will never be called), or with dynamic dependencies (e.g. `require(somevar)`) - both currently work beautifully with require.
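For concreteness, a minimal sketch of those two patterns (the package and path names are made up):

```js
// Optional dependency: only loaded if this code path is actually hit.
function renderChart(data) {
  const heavyChartLib = require('heavy-chart-lib'); // made-up package name
  return heavyChartLib.render(data);
}

// Dynamic dependency: the specifier is only known at runtime.
function loadPlugin(name) {
  return require(`./plugins/${name}`);
}
```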
I subjectively fear losing the `js` extension forever. It's iconic. There's a joke that if you pick a noun there's going to be a library with a JS appended. If we adopt the extension proposal do we also have to extend the joke to JSM? Some of the technical implications of the extension proposal really scare me, but what about the cultural implications? What about the new developer implications? File extensions are an aesthetic, and losing the `js` aesthetic worries me.
I think @jmm's suggestion of `export {}` is a really good idea and I agree with him that parsing should be discussed more. There are negative impacts, but the negative impacts of the extension and `package.json` approaches are much worse.
A parsing solution does not impact in-place systems. In-place systems can happily assume CommonJS. An extension approach, however, widely impacts multiple software ecosystems, whereas a parsing solution only affects new code.
How about `file.jsm.js` or `file.esmodule.js`? That would offer the same benefits as the extension option without impacting ecosystems as abruptly.
@balupton as discussed in the original PR thread, that is not how many toolchains work, they only read the final extension (this includes `node` itself).
@calebmer can you discuss the technical implications? There are many problems with parsing, and polyfills are a very common instance of side-effect-only modules.
@calebmer in-place systems being unable to use ES6 modules is a massive impact.
@bmeck ok, and I'll address the side-effect only concern and the other CONs listed in the wiki. But first I'd like to offer three observations:
First, we have to remember that no solution can be ideal. The ideal scenario would be that the ES module specification was around at node's conception so that modules could be supported from the start. That didn't happen, so we now have the tricky task of picking a way to add module support to node whilst not breaking the ecosystem. The parsing solution mirrors most closely what node would look like if there was module support from the start.
Second, theoretically the parsing could be as simple as a regular expression matching `export` statements. Yes, it loses the nuance of a full parser, but a regular expression would be an optimization which could address the cost on large files and the cost of performance-critical requires. This also addresses a few other CONs listed on the wiki, such as implementation complexity and toolchain detection.
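As an illustration only, such a preliminary check might be as naive as the following; as noted further down the thread, a regex alone cannot settle the question, so a real parser would still be needed behind it:

```js
// Quick pre-check: does the source mention import/export at a plausible statement
// position? Useful only as a fast filter before an expensive full parse.
const maybeModule = (source) =>
  /(^|[\s;}])(import|export)[\s{'"*]/.test(source);

// maybeModule('export {};')        // true
// maybeModule('const x = 1;')      // false
// maybeModule('// import nothing') // true: a false positive, hence the need for a parser
```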
Third, the extension, `package.json`, or any other approach may be implemented in parallel as an optimization to alleviate some of the costs. If the extension approach were also implemented I'd recommend `.jsmodule` and `.jsscript` to toggle the parsing mode.
Now to counter arguments.
Most of the time a side-effect-only module could easily be a script (optionally with a `use strict` pragma). If for some reason it has to be a module then, as @jmm suggests, the module could add an `export {}` to the file. Then node could identify the file as a module and treat it accordingly.
The intersection of side-effect-only files and files which must be a module is very tiny, and the tax of a single line is insignificant and comparable to the `use strict` pragmas of the past.
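As a concrete sketch of that pattern (a hypothetical polyfill file), the single opt-in line would look like this:

```js
// array-includes-polyfill.js: evaluated purely for its side effect.
if (!Array.prototype.includes) {
  Array.prototype.includes = function (value) {
    return this.indexOf(value) !== -1; // simplified; a real polyfill handles NaN etc.
  };
}

// Exports nothing, but unambiguously marks the file as an ES module
// for any detection-by-syntax scheme.
export {};
```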
Many tools which need to know the difference between a script and a module already have a parser; furthermore, many more tools don't even need to tell the difference. For the select few tools which need to know the difference and can't use a small parsing solution from npm, these tools can use out-of-band configuration. Take for instance the `package.json` proposals; these proposals fit well when it comes to specialized tooling.
Furthermore, tools that can't use a simple parser from npm often aren't in the node ecosystem (I'm thinking of Ruby, Python, or other ecosystem tools). These tools are dealing with client code, not node code.
Considering Babel and other parsers currently work with large files, I'm not entirely sure why this is a large concern. If we use a regular-expression parser and a parallel file extension like I recommend, these pains can also be alleviated.
Solutions may also include:
- Defaulting to CJS if no `export` statement was found in X characters (this allows the placing of an `export {}` at the top of a file to coerce the parser).

In a normal project, all dependencies will be imported/required at startup time, which is less critical for performance. However, in some scenarios this is not the case. In areas where dynamic requiring must be performant we can look into various algorithm-coercing approaches. For current packages we can also look at the `package.json`: if it specifies a node version before ES modules, we can default to the script goal.
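A rough sketch of that last heuristic, assuming the `semver` package is available and using `7.0.0` purely as a placeholder for whichever Node release first ships ES modules:

```js
const semver = require('semver');

// If the package's "engines.node" range cannot include any ESM-capable release,
// its files could safely default to the Script (CJS) goal.
function defaultsToScriptGoal(pkg) {
  const range = pkg.engines && pkg.engines.node;
  if (!range || !semver.validRange(range)) return false; // no signal either way
  return !semver.intersects(range, '>=7.0.0'); // placeholder version
}

// defaultsToScriptGoal({ engines: { node: '>=4 <6' } }) // true
// defaultsToScriptGoal({ engines: { node: '>=4' } })    // false
```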
@ljharb sorry for any miscommunication. I meant in-place systems won't be negatively impacted by a parsing solution; a parsing solution would be neutral. I was referring to the `.htaccess` and similar configurations argument against the extension approach.
@calebmer one common use case where a parser won't work is Airbnb's rails app - which uses Sprockets, which sends JS files to an entirely different box for compilation. It will have to know what is an ES6 module or not so it can send the right metadata to the compilation service - but it has no JS engine to do so. Also, perhaps you haven't tried reading JSON in POSIX, but "not needing a parser" is pretty critical in some places :-)
@ljharb A couple of questions, as I don't entirely understand the Airbnb-rails-sprockets-node relationship. Why can't Sprockets choose to implement its own format (maybe an extension)? Why can't the compilation service make the detection? If the detection of modules were as simple as a regular expression, would this alleviate concerns?
It seems to me that services, like Sprockets, have a little more freedom than node to make large breaking changes. If Sprockets wanted to assume everything was a module or everything was a script, it could.
@calebmer the problem is that it doesn't want to assume that - we need to be able to gradually migrate our codebase from script to module. We also don't want Sprockets to implement something that wouldn't work with `npm test`, or all our node-based tools, for example.
Do we need to know which mode we are in prior to parsing? Which mode-sensitive use cases do not ultimately involve parsing? Couldn't the airbnb-rails-sprockets-node use case be solved by simply not updating the version of Node.js used until the rest of the required changes (if any) have been made?
I don't have time to respond to everything here right now, but regular expressions are not adequate for detecting this, though like you said a regex or substring search could be useful as a quicker preliminary check to see if `import|export` is in the source at all before doing an expensive parse, like `detective` does.
Defaulting to CJS if no `export` statement was found in X characters (this allows the placing of an `export {}` at the top of a file to coerce the parser).
I think it's possible that placement in the source could be used to optimize this detection, as I mentioned in my original post:
especially if people are encouraged to place the code early in the source
But making it dependent on that would probably be too fragile, though I did try something like that with browserify for recognizing its own bundles. (That's a much more limited scenario though since it's designed to check its own output.)
@ljharb
We also don't want Sprockets to implement something that wouldn't work with `npm test`
What would be an example of that? (Probably makes no difference to the issue at hand, I'm just curious what you mean.)
@jokeyrhyme Yes, because they're different goal symbols and modules are implicitly strict. People might have expanded more in the original thread.
@jmm just responding to the question "Why can't Sprockets choose to implement its own format" - the answer being, because one-off snowflakes are not ideal for being cohesive with the rest of the ecosystem/toolchain.
@balupton as discussed in the original PR thread, that is not how many toolchains work, they only read the final extension (this includes node itself).
Isn't that exactly why `file.jsm.js` is awesome? It is exactly because everything (with few exceptions) just cares about the final extension, making it business as usual for everything - unless of course they are aware of the `.jsm.js` convention, in which case they can specifically opt in to the special handling of it, which is optional and up to them, without forcing anything - which seems to be exactly the point.
Consider the impact of `file.jsm`:
- everything needs to be updated to know that `.jsm` is a JavaScript file, e.g. Atom.io syntax highlighting, configuring build toolchains to now use `.jsm` files, likely with several pull requests to tools in order to do so
- it breaks existing `require('package/something.js')` calls

Consider the impact of `file.jsm.js`:
- it breaks existing `require('package/something.js')` calls

The only toolchain that needs to be aware of the file using ES Modules, in this use case and in this scope, is node... Using `file.jsm` makes everything abruptly aware of that, even when they have no need to be, even with unintended consequences: breaking syntax highlighting, requiring `.htaccess` files to change, etc. Using `file.jsm.js` means business as usual for everything except node. That to me is very powerful.
It seems forgotten that outside of node people already use ES Modules successfully with the `.js` extension; our solution should not have to impose on their existing success and conventions. The `.jsm.js` approach minimises the impact on everything that already works well, with the cost kept as minimal as needed, while achieving all the benefits of node needing to know.
Happy to be linked to the places this has already been discussed in case I have missed something. But it seems it solves all of @calebmer's objections to the `jsm` extension in https://github.com/nodejs/node-eps/issues/13#issuecomment-195024771 and works well with his points here:
Many tools which need to know the difference between a script and a module already have a parser, furthermore many more tools don't even need to tell the difference. For the select few tools which need to know the difference and can't use a small parsing solution from NPM, these tools can use out of band configuration. Take for instance the package.json proposals, these proposals fit well when it comes to specialized tooling.
Furthermore, tools that can't have a simple parser for NPM often aren't in the node ecosystem (I'm thinking of Ruby, Python, or other ecosystem tools). These tools are dealing with client code, not node code.
Given the sprockets argument... they can just adopt the `.jsm.js` detection and be done with it, the same cost as the `.jsm` extension, and less cost than implementing a parser or a `package.json` sniffer, it seems.
Again, the beauty of `.jsm.js` is that everything would just be business as usual; everything would continue to work as is, without breaking anything, and things that do not yet have a detection method and actually do care about it can just opt in to the `.jsm.js` detection, or a parser algorithm, or whatever they decide is actually best - without node forcing anything on them as the `.jsm` extension does.
@ljharb why can't babel be used until a reasonable amount of the codebase has been converted? Also, a comment like `// @module` could solve this problem. It can signal to Sprockets a different build mode and doesn't break `npm test`.
I'm not convinced that there is a big enough need in tooling to know the difference between a script and a module, and further, I'm not sure the "one-off" snowflake detection is a bad thing, considering most build tools already have a snowflake configuration format.
@balupton
Isn't that exactly why file.jsm.js is awesome? It is exactly because everything (with few exceptions) just cares about the final extension, making it that for everything it is just business as usual
That's actually a really cool point, but it still has negative precedent-setting impacts. What happens when developers start writing `file.jsm.jsx`, or other people start adding their own sub-extensions like `file.jsm.a.b.c.js`? Does node care about the ordering? Would `file.a.b.jsm.c.jsx` still parse as a module? What about a `jsm.js` file?
What happens when developers start writing file.jsm.jsx, or other people start adding their own sub-extensions file.jsm.a.b.c.js? Does node care about the ordering? Would file.a.b.jsm.c.jsx still parse as a module? What about a jsm.js file?
Good points. Possible ways they could be addressed:

1. Detection of the `.jsm` marker could either:
   i. check for `.jsm` anywhere in the filename, e.g. `filename.split('.').indexOf('jsm') > 0`
      - supports both `file.jsm.jsx` and `file.jsm`
      - can be considered a `.jsm` decorator proposal
   ii. check only for the final `.jsm.js` extension, e.g. `filename.substr(-7) === '.jsm.js'`
      - does not support `file.jsm.jsx` and `file.jsm`
      - can be considered a `.jsm.js` extension proposal
2. Extension resolution for `require('./index')` could either:
   i. `for ( const extension in require.extensions ) { /* check if "${path}.jsm${extension}" exists, otherwise check if "${path}${extension}" exists */ }`
      - would also match `.jsm.json` and `.jsm.coffee`, so seems absurd for the cost it introduces
   ii. add `.jsm` and `.jsm.js` with highest preference to `require.extensions`
      - if `.jsm.jsx` also wishes to be a default, it could be added to `require.extensions` as easily as `require.extensions['.jsm.jsx'] = require.extensions['.jsx']` - however, the need for custom extensions coupled with jsm here seems quite the exception

Point 1.i is nice as it can take a "business as normal" approach to custom extensions like `jsx` too, something impossible with the `.jsm` proposal. Point 2.ii is nice and simple compared to 2.i, while still allowing custom extensions the ability to opt in to default loading, something impossible with the `.jsm` proposal.
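A minimal sketch of the two checks in 1.i and 1.ii (the function names are made up, purely for illustration):

```js
const path = require('path');

// 1.i: `.jsm` anywhere in the filename (the "decorator" reading):
// matches file.jsm, file.jsm.js and file.jsm.jsx alike.
function hasJsmDecorator(filename) {
  return path.basename(filename).split('.').indexOf('jsm') > 0;
}

// 1.ii: only a literal `.jsm.js` final extension:
// matches file.jsm.js, but not file.jsm or file.jsm.jsx.
function hasJsmJsExtension(filename) {
  return filename.slice(-7) === '.jsm.js';
}

// hasJsmDecorator('component.jsm.jsx');   // true
// hasJsmJsExtension('component.jsm.jsx'); // false
```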
@calebmer
(ruled out below)
`//@module` seems a lot like the `use module;` pragma discussed elsewhere. Either option may still be necessary to clarify weird edge cases, such as files with unclear modes.
@balupton
My own experience with the `.jsx` extension is that libraries and tools I used were updated to support it very quickly. Some of the slower moving libraries and tools were simply abandoned. Surely we could expect this with a `.jsm` final extension, too, without needing `.jsm.js`.
Regarding sprockets, if there is demand for `.jsm` then I'm sure it'll be updated quickly.
Regarding `.htaccess` and other HTTP server MIME type settings, etc, I imagine we'll still be deploying ES5 code to production for a long time yet, even with HTTP/2. So surely we have enough lead time for the necessary server software to be updated.
Assuming multiple proposals for mode-flagging are not incompatible, why don't we just do everything and see which ones the community embraces? Surely we could enhance Node.js to do the following:
- if the file has a `.jsm` extension, treat it as a module
- if the file has a `//@module` or `'use module';` pragma, treat it as a module
- otherwise respect out-of-band settings (package.json, `node` CLI flags, etc)

If it's hard to get consensus around a single approach, is it possible to implement the top 2 or top 3?
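Purely to illustrate the layering being suggested above; every name and the priority order here are hypothetical, and this is not an endorsement of any particular signal:

```js
// Hypothetical combined detection for a single file.
function detectParseGoal(filename, source, outOfBandHint) {
  if (filename.endsWith('.jsm')) return 'module';             // extension signal
  if (/^\s*(['"])use module\1/.test(source)) return 'module'; // pragma signal
  if (/^\s*\/\/\s*@module\b/.test(source)) return 'module';   // comment-pragma signal
  if (outOfBandHint) return outOfBandHint;                    // package.json field, CLI flag, etc.
  return 'script';                                            // default to CJS
}
```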
guys, please, stop thinking about the pragma detection. we have exhausted that conversation via different channels. the number one goal of ES Modules is portability, and not all environments have the ability to analyze the code before it is parsed and evaluated like node can, and implementers will NOT support a parser that mixes modules and sloppy mode scripts. please, stop it, and focus on realistic solutions.
Generally file extensions are limited to 3 letters ( https://en.wikipedia.org/wiki/8.3_filename ) and do not include a `.` inside of them. I did some scraping and could only find 1 other IANA file extension that includes an internal `.`: `.1905.1`. So apparently it is possible to register one, but it would be a very odd duck and would lead people to think `.jsm.js` is using some kind of transform, just like `.tar.gz` is a tar file that has been gzipped. Will need to think on this; it doesn't look pretty, but may be viable.
@ljharb you might have opinions on this?
@caridy
guys, please, stop thinking about the pragma detection. we have exhausted that conversation via different channels [...] please, stop it, and focus on realistic solutions.
That's what I thought from the original thread, but it's not remotely clear that this issue which was opened 2 days ago is saying that. This issue appears to invite that discussion by specifically referencing determining the mode given a source string and alluding to the same drawbacks to that concept that you're referencing, that were brought up in the original thread. If that's just your personal opinion, ok then. Otherwise please clarify the messaging.
@jmm it is dead. no parsing. it was dead in the original PR comments as well. The edge cases mentioned and added burden for toolchains is too much.
@bmeck Please clarify that in the OP here. This appears to invite that discussion.
The whole discussion here is due to the fact that we might have components that are mixing CJS and ES Modules, and our estimate (from a few of us) is that this is actually a very minor use-case. To help with that, let's try to describe when you need to mix.
When we talk about mixing CJS and ES modules, we are explicitly talking about code in the context of the node runtime, and that does NOT include:
i. files used by a runner, which can have an out-of-band configuration when invoking node (e.g. `node --module ./path/to/file.js`)
ii. files that are consumed by a tool that enforces a certain format (e.g. tests that are evaluated in the context of the test runner, etc).
iii. supportive files that are not intended to be required by other pkgs; in fact, those files are probably not in the npm pkg itself.
iv. an ES Module that contains `require()` or `module.exports` for some reason or another.
v. fat packages to support other runtimes (a package that contains a transpiled version of the original source for old versions of node, for the browser, for nashorn, for bower, etc.); the reason why this is not important is because this is a mechanical process that can produce out-of-band configuration for each generated file.
vi. files without `package.json`. this is a very edge case, and if your code is not supposed to be shared, you probably have full control over how those files are going to be digested by the node runtime.
We might have other cases that fit into this bucket, but I think you get the idea. You, as the author of the pkg, have full control over how the file is used, and when it is used, it is easy to solve that with an out-of-band configuration, and we should not care much about that for now.
In the middle of a refactor, developers might end up in a situation where they have part of the module using ES Module format, but still using CJS for some pieces of the package.
In this case, there is another important question: will those modules be in the same folder structure?
note: I haven't seen evidence of mixing files in the same folder when transpiling with Babel and co.
ES Modules are supposed to be a superset of CJS, but at this early stage we might have missing capabilities, things that can only be achieved when using CJS; this will force early adopters to keep some CJS modules in order to achieve certain tasks that would otherwise be impossible using the ES Module format only.
The problem we are trying to solve here is how to signal the format of those files that should be parsed and evaluated by the node runtime, while trying to avoid a huge tax on authors. And as of today, we have two buckets on the table:
i. detection by path (a decorator on the filename, a decorator on a folder name or a custom extension)
ii. out-of-band configuration in `package.json` (e.g.: `"module": "path/to/module.js"`)
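For concreteness, a sketch of what option ii might look like in a manifest; the `"module"` field is just the example floated above, and keeping `"main"` as the CJS entry is my own assumption:

```json
{
  "name": "example-pkg",
  "version": "1.0.0",
  "main": "path/to/cjs-entry.js",
  "module": "path/to/module.js"
}
```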
vi. files without package.json. this is a very edge case, and if your code is not supposed to be shared, you probably have full control over how those files are going to be digested by the node runtime.
Like half of the times I run `node` I don't have a `package.json`; not an edge case. It doesn't cover places that use files for config outside of your dir tree, like `~/.app/config.js`. Please stop calling it a very edge case. I will disagree heavily on this, and with increasing vigor.
note: I haven't seen evidence of mixing files in the same folder when transpiling with Babel and co.
Most likely since this is green code; the real problem is large existing code combined with the "Missing capabilities in ES Modules". In particular with externally mutable exports, circular dependencies, and top-level await.
@caridy
guys, please, stop thinking about the pragma detection. we have exhausted that conversation via different channels, the number one goal of ES Modules is portability, and not all environments have the ability to analyze the code before it is parsed and evaluated like node can, and implementers will NOT support a parser that mixes modules and sloppy mode scripts. please, stop it, and focus on realistic solutions.
- Portability: ES modules are portable now for people using Babel. With a parsing solution the only change which needs to be made is to stop Babel from transpiling import/exports (along with any other changes in the require algorithm). And I'm not convinced an extension or `package.json` solution maintains portability.
- Analysis: How many environments really need to know the difference in the exact same way node does? As I've mentioned before, I think the intersection of environments without a parser and tools which need to know the difference in the same way as node is small. For the small intersection, what @ljharb describes as "snowflake" solutions should be sufficient, considering these tools already have their own unique configuration formats.
- Mixing: I don't entirely understand this argument; it may be a blocker but I can't be sure at the moment.
The argument for a parsing solution is that it is the best out of many bad solutions. Edge cases for a parser can be solved, performance hits can be optimized, and there are adequate alternatives for the tooling which needs it. Compare this to the far-reaching negative technological and cultural impacts of `jsm` and the complete isolation of certain use cases with the `package.json` approach.
I want to reiterate that the parsing solution mirrors most closely what modules would look like in node if implemented from the beginning. This is important for developer experience and for our eventual migration to an ES-module-only ecosystem without any vestigial structures.
@calebmer the 2 parsers have ambiguities if done on the same source text, that is the edge case. It cannot be solved. Tribal knowledge to unroll ambiguity is not acceptable. End of discussion on parsing from a technical perspective. As @ljharb mentions parsing is prohibitive to toolchains. End of discussion for ecosystem. Do not continue discussion on parsing, if you wish to please comment on the original PR but it cannot produce ambiguity and it must be non-prohibitive to toolchains at the very minimum.
@calebmer
And I'm not convinced an extension or package.json solution maintains portability.
elaborate
How many environments really need to know the difference in the exact same way node does?
all
aside from that, I recommend you look into all the TC39 notes related to `ParseModule` and `ParseScript`, then talk to implementers and get feedback from them about a unified parsing process that mixes sloppy mode, strict mode and module semantics all together; after all that, if you think it is still doable, let us know! :)
@caridy
detection by path (a decorator on the filename, a decorator on a folder name or a custom extension)
What is a decorator here?
@zenparsing
What is a decorator here?
I believe this refers to something like `.jsm` being somewhere in the file path. "Extension" is a very specific place in the file path (the end).
@caridy
I apologise for my part in continuing the pragma discussion.
As for file path decorators and out-of-band settings (package.json, node CLI flag, etc):
We could wait until Node 8 to deprecate the less popular one, and Node 10 to drop it, if we had evidence that a winner had finally emerged.
RE: out-of-band: this does seem to be what the browser folks will be offering, with `<script type="module">`. A node CLI flag would cover cases where a glob in a package.json wasn't appropriate.
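For reference, the browser signal mentioned above is declared on the tag rather than inside the file (the filenames here are made up):

```html
<!-- Parsed with the Module goal -->
<script type="module" src="app.js"></script>
<!-- No type attribute: parsed as a classic script -->
<script src="legacy.js"></script>
```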
@zenparsing decorators: `foo.m.js` or `foo.jsm` or `path/m/foo.js` or anything else that, by looking at the path, can hint what parser should be used.
@jokeyrhyme asking for both is fair enough; in fact, we have discussed that in the past briefly. I asked for a mechanism that allows a pkg author to hook into the loading mechanism to specify what parser to use per file. This mechanism does not exist today; the only mechanism that exists today is a loader extension which affects the entire process. We will need an artifact, per pkg and/or folder, that can be used by node, and whenever a file inside that folder structure needs to be inspected, a function call of some sort will have to be executed, passing the path, and as a result it returns the type. Something along those lines, which means people will likely create abstractions for the cases where they mix CJS and ES. It might be worth exploring it.
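To illustrate the kind of hook being described here; nothing like this exists today, and every name below is invented:

```js
// Hypothetical per-package artifact, e.g. shipped next to package.json and
// consulted by the loader for every file inside the package.
module.exports = function parseGoalFor(filePath) {
  // Return 'module' or 'script' for the given path; a package that mixes CJS
  // and ES modules can encode whatever convention it likes here.
  return filePath.endsWith('.jsm.js') ? 'module' : 'script';
};
```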
@jokeyrhyme all solutions will require the cli flag, which is why there is a note in the issue head about it not being terribly relevant; it is only there to patch a specific case, it does not work as an interop at scale.
@caridy Gotcha.
It feels like both solution candidates have unresolvable issues:
- `package.json` requires the file system to be "marked up" with flags, and for the platform to traverse the file system in order to gather those flags.
- `.jsm` requires cultural acceptance of (what boils down to) deprecating `.js` for new code.

Seems like a stalemate. Maybe it's time to take another look at the `default.js` solution? ; )
@zenparsing not so much a stalemate if concerns about the `package.json` approach aren't addressed :-/ it will be much better for everyone if this has consensus, but consensus isn't necessarily required to force the issue.
@ljharb Understood, but forcing the issue would prolly be not so good. : )
Anyway, at the risk of being annoying, here's an example of a "fat package" when using the `default.js` approach: https://github.com/zenparsing/zen-observable
Regarding my earlier proposal: I've added a note in it to verbosely explain that 1.i would still support `file.jsm`, as well as `file.jsm.js` and `file.jsm.jsx`. Considering recent terminology, it can be considered a `.jsm` filename decorator, rather than a `.jsm` filename extension.
@jokeyrhyme
@balupton
My own experience with the .jsx extension is that libraries and tools I used were updated to support it very quickly. Some of the slower moving libraries and tools were simply abandoned. Surely we could expect this with a .jsm final extension, too, without needing .jsm.js.
Regarding sprockets, if there is demand for .jsm then I'm sure it'll be updated quickly.
Regarding .htaccess and other HTTP server MIME type settings, etc, I imagine we'll still be deploying ES5 code to production for a long time yet, even with HTTP/2. So surely we have enough lead time for the necessary server software to be updated.
I fail to see how this argument is suitable. My point was that the `.jsm` decorator proposal imposes the minimum burden on the ecosystem while hitting all the goals. Your point seems to be that while the `.jsm` extension doesn't impose the minimum burden on the ecosystem, it may not be that bad based on your experience. I fail to see how arguing against a point of "minimum burden" with "more burden" makes sense.
@zenparsing
.jsm requires cultural acceptance of (what boils down to) deprecating .js for new code.
Not correct, my earlier suggestion means `.jsm` can be used without deprecating the `.js` extension, without the need for a `package.json` modification, and without the need to force immediate tooling changes.
This is the place to discuss specifics of how node will determine the mode of a given source (file / source string). Source string examples include stdin or `-e`; they do not include JS `Function` or `eval` calls. The proposal itself can be seen in PR #3. Discussions here should regard:
Note: CJS will be the default target of `node` for the foreseeable future; "normal" here means what developers write daily. This is a constraint that can easily be worked around for a single source via a CLI flag to note that the source is an ES module. This flag should not be seen as relevant to the discussion in this thread.
Note: We are only discussing choices that do not require parsing of the source file.
It should not discuss import/export conversion, evaluation ordering, or module path resolution.