nodejs / NG

Next Generation JavaScript IO Platform

Versioned core modules #9

Open domenic opened 9 years ago

domenic commented 9 years ago

Part of the point of the NG brainstorming is to see how we can make backward-incompatible changes to core.

One early proposal was to use ES6 modules as a "switch": const fs = require("fs") gives you the existing fs, and import fs from "fs" gives you "fs v2" with back-compat breakages. However, this suffers from the "this time we'll do it right!" fallacy, and assumes we won't ever want to break back-compat again. (Or at least, it doesn't have a strategy for doing so.)

I think a better approach would be to start versioning the core modules. The version numbers that are used in code would be a single number, i.e. 1 or 2, not 1.3.4; we only need to signal back-compat breakages, not additions or fixes. There are a variety of ways you could envision this working then:

(I think I like the second better; however, it does mean we block progress on this idea until ES6 modules ships in V8, and that could be a long time.)
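For illustration only (these specifiers are hypothetical; the @-suffix is just one possible spelling), usage might look something like:

const fs = require("fs")      // the fs everyone depends on today
const fs2 = require("fs@2")   // hypothetical: opt in to the back-compat-breaking revision
// or, once ES6 modules land in V8:
// import fs2 from "fs@2"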

Relatedly there comes the question of how we distribute these modules. Here is one scheme that I like:

mikeal commented 9 years ago

One early proposal was to use ES6 modules as a "switch": const fs = require("fs") gives you the existing fs, and import fs from "fs" gives you "fs v2" with back-compat breakages. However, this suffers from the "this time we'll do it right!" fallacy, and assumes we won't ever want to break back-compat again. (Or at least, it doesn't have a strategy for doing so.)

This isn't entirely accurate. import fs from "fs" or import fs from "core" could give you the version of fs, or the version of stdlib, noted in your package.json.

There are some technical limitations we should acknowledge. I don't think it's practical to ship with every version of every stdlib module, especially if we want to continue bundling them in the way we do now. If we decide to ship with a single version then we also need to acknowledge that many core modules depend on each other.
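As a rough sketch of the package.json route (the "core" field name below is invented purely for illustration):

{
  "name": "my-app",
  "core": {
    "fs": "2",
    "stdlib": "3"
  }
}

With something like that in place, import fs from "fs" inside this package would resolve to fs v2, while packages that didn't opt in would keep getting v1.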

domenic commented 9 years ago

This isn't entirely accurate. import fs from "fs" or import fs from "core" could give you the version of fs, or the version of stdlib, noted in your package.json.

Ahh, that makes much more sense, now I understand.

If we decide to ship with a single version then we also need to acknowledge that many core modules depend on each other.

Yeah, that's the tricky part. The import { fs } from "core" idea (or maybe import fs from "core/fs") sidesteps it quite nicely.
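Roughly, the whole stdlib would then move together, so its internal cross-dependencies stay consistent (hypothetical specifiers, nothing here is implemented):

import { fs, http, url } from "core"   // one versioned bundle: http and url are built against each other
// or the per-module path form:
// import fs from "core/fs"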

chrisdickinson commented 9 years ago

This isn't entirely accurate. import fs from "fs" or import fs from "core" could give you the version of fs, or the version of stdlib, noted in your package.json.

Treating core modules like npm modules feels like overapplying an abstraction that works in one case to another case where it fits uncomfortably (sort of a "when you have a hammer, screws are just difficult nails" situation).

Core modules are the only modules in io.js that can declare a global abstraction that is invariant across packages. That means those abstractions are safe to return or accept across package boundaries. For example, http requests, event emitters, streams, crypto hashes, etc., are all okay to accept or return from your module.

Userland is not able to make these guarantees – the classic example is voxel.js's issues with vectors, which was solved by downcasting parameters and returns to a shared type (array) between packages before upcasting back to "ndarray" internally. It's also seen in http middleware: adding attributes to an existing type creates a new type, of sorts, which at best silos a package's use to a given framework ecosystem and at worst ends up in peerDependency hell.

Allowing userland to "pick" core library versions means giving up the ability to make global, cross-package abstractions in core. That means that global, cross-package abstractions will be difficult to enforce anywhere in the ecosystem, even when they start to become necessary.

I worry that versioning "core" runs into the same problem, also – when you require('url@2'), you're using the second version of the url parser. Can you pass the result of url.parse to http.request? What if http.request is using url@1? What if you don't control the version of http in use, because you're passing the url object to a different package?
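To make this concrete (the versioned require below is the hypothetical syntax under discussion, not a real API):

const url2 = require('url@2')   // hypothetical: second major version of the url parser
const http = require('http')    // suppose http is built against url@1 internally

const target = url2.parse('http://example.com/path')
http.request(target, function (res) { /* ... */ })  // is a url@2 object still the shape http expects?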

timoxley commented 9 years ago

Minor point, but the @ sign will make imports look like a syntax shitshow when combined with npm's scoped packages syntax:

import express from "@strongloop/express@2"

timoxley commented 9 years ago

Being able to specify aliases/mappings up in the package.json could solve this, along with a bunch of other issues related to being tied directly to filesystem for dependency lookups:

{
  "aliases": {
    "url": "url@2",
    "express": "@strongloop/express@2"
  }
}
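Application code would then keep using the bare names and let resolution apply the mapping (sketch only; this kind of resolution doesn't exist today):

import url from "url"           // resolved to url@2 via the alias map
import express from "express"   // resolved to @strongloop/express@2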

Another alternative could be to attach versions to the core namespace:

import { http } from "core" // current version
import { fs } from "core@3" // previous
import { url } from "core@2" // legacy io.js

trevnorris commented 9 years ago

+1 on @timoxley's suggestion. I like the '<module>@<version>' syntax; it seems the most intuitive.

iamstarkov commented 8 years ago

One negative outcome of defining a dependency as fs@2 is that you need to rewrite all code that uses it once you decide to upgrade Node.js in your environment. From this perspective it is useful to have aliases in a centralized package.json.

iamstarkov commented 8 years ago

Another thing bothers me: even if you can control your own code, you can't do the same for 3rd-party node modules. For example, suppose I use some lib that decided to use fs@2, and fs@2 becomes the usual fs in the next major Node.js version. All of that lib's dependents will then be unable to update to the new major version until the lib itself is fixed.

In this case consumers can benefit a lot from package.json aliases, provided aliases in the app root can override a lib's aliases. That seems like the natural approach, because an app is hosted on only one Node.js version at a time.

If the app can be run on different Node.js versions, the aliases idea can be extended by binding aliases to Node.js versions:

{
  "aliases": {
    "v5": { "url": "url@2" },
    "v6": { "url": "url" }
  }
}

mikeal commented 8 years ago

For example, suppose I use some lib that decided to use fs@2, and fs@2 becomes the usual fs in the next major Node.js version. All of that lib's dependents will then be unable to update to the new major version until the lib itself is fixed.

This isn't an issue because require('fs') won't be changing. About a hundred thousand modules already rely directly or indirectly on this and we won't be breaking them all any time in the future. That's why we need something like require('fs/2') in order to evolve a new API without breaking compatibility.
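In other words, roughly (the fs/2 specifier is the proposal here, not something that ships today):

const fs = require('fs')      // today's fs, unchanged; existing modules keep working
const fs2 = require('fs/2')   // opt-in specifier for the evolved API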

iamstarkov commented 8 years ago

That's why we need something like require('fs/2') in order to evolve a new API without breaking compatibility.

It will become a problem for the npm ecosystem when the evolved fs/2 becomes the normal fs and the previous fs becomes obsolete in a following major Node.js update. That's why I prefer the aliases approach: it avoids this problem.