automerge / automerge-classic

A JSON-like data structure (a CRDT) that can be modified concurrently by different users, and merged again automatically.
http://automerge.org/
MIT License

Publishing the library to npm #370

Closed scotttrinh closed 3 years ago

scotttrinh commented 3 years ago

We've had a long conversation in Slack to discuss the best way to publish the library to npm, and I wanted to provide that context in a more visible public forum.

The problem

Automerge can run in the browser and in Node; however, it uses some language features that are not supported by all browser and Node versions. The current setup imposes an implicit minimum required environment by shipping untranspiled code, which can lead to confusing issues and "bugs" for downstream consumers.

There is also an issue with polyfills degrading performance (see #368 and #360, for instance); we need to pick a target environment that doesn't penalize modern environments that already support the features used here.

Solutions

The current state of publishing packages to npm has evolved a lot lately. npm packaging support is more sophisticated, node now supports ES modules in addition to CommonJS, and bundling for web applications is more controversial with HTTP/2 becoming ubiquitous.

The two parts of a solution here involve: transpiling the code as-written to a reasonable (and explicit) target environment, and exporting useful module paths (and bundles?) for different environments.

Proposal

I propose that we publish the following artifacts:

  1. A transpiled, unbundled ES module path
  2. A transpiled, unbundled CommonJS module path
  3. (Maybe?) A transpiled, bundled UMD module
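To sketch how these three artifacts might be wired up, a hypothetical `package.json` could use conditional exports alongside the older `main`/`module`/`browser` fields. All paths and filenames here are illustrative, not actual build output:

```json
{
  "main": "./dist/cjs/index.js",
  "module": "./dist/mjs/index.js",
  "browser": "./dist/automerge.umd.js",
  "exports": {
    ".": {
      "import": "./dist/mjs/index.js",
      "require": "./dist/cjs/index.js"
    }
  }
}
```

Node resolves the `"exports"` map directly; bundlers like webpack and rollup also understand `"module"` and `"browser"`.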

That should cover the following environments:

  1. Modern browser consumers can choose to use ES modules directly without a bundling step, or use the UMD bundle if they prefer.
  2. Consumers that wish to support older browsers can use the prebundled UMD.
  3. Node can use either ES modules or CommonJS, depending on the consumer's preferences and tooling support.

Remaining question: What target environment to specify?

We will likely want to keep the existing tooling around transpilation and bundling, so that leads us toward using babel with the babel-preset-env plugin for specifying the transpilation target, and using Webpack to bundle the UMD module.

In order to use the babel-preset-env plugin, we need to specify our targets. I believe the consensus so far is to pick a target that requires as little polyfilling and transformation as possible while still allowing for a nice development experience. Let's discuss!

scotttrinh commented 3 years ago

My current suggestion for the babel-preset-env targets would be:

```js
{
  targets: [
    "defaults", // alias for "> 0.5%, last 2 versions, Firefox ESR, not dead"
    "not IE 11",
    "maintained node versions"
  ]
}
```

But it would be good to see what polyfills that ends up adding.

skokenes commented 3 years ago

Funny enough, this is already biting me today in a UI project because automerge uses node's builtin string_decoder, which is not available in browsers

scotttrinh commented 3 years ago

@skokenes

Do you think either the UMD module or the ESM module would solve your issue if we set up the transpilation based on the targets I mentioned above?

skokenes commented 3 years ago

Probably?

At the same time, I was able to resolve my own issue with my rollup setup for my UI project. But I did not need to do this with AM 0.14

scotttrinh commented 3 years ago

@skokenes if you have time to share your solution (gist or just a comment here), it might be illuminating for us to ensure that it's a use case that we cover in our testing of this packaging update. Otherwise, we can certainly ping you on the PR and maybe you can test with npm link to see if it allows you to rollback your change.

skokenes commented 3 years ago

Sure thing: we use rollup for this project with the nodeResolve plugin, and with AM 0.14 we were just doing:

```js
nodeResolve({
  browser: true,
}),
```

This stopped working with AM 1.0 because of string_decoder; changing it to the following fixed it:

```js
nodeResolve({
  browser: true,
  preferBuiltins: false,
}),
```

ept commented 3 years ago

@scotttrinh Could you clarify what "transpiled, unbundled" means? New language features converted to older equivalents, but keeping the original source file structure (i.e. not putting everything in one big file)? If so, I'm not sure what this buys us. What language features are we currently using that are in need of transpiling?

Could you also suggest a concrete configuration to use? I'm not sure how to translate these goals into a working config for webpack or whatever other tools are used.

@skokenes Hmm, it's supposed to use string_decoder only as a fallback in case the TextEncoder/TextDecoder APIs are not present. Modern browsers should have TextEncoder, and Node also has it since version 11 or 12 (I think). Since Node ≤10 are now unsupported, we could probably drop the use of string_decoder and rely exclusively on TextEncoder.
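The fallback described here can be sketched as a simple feature check (illustrative only, not Automerge's actual code):

```javascript
// Prefer the standard TextEncoder/TextDecoder APIs; fall back to Node's
// string_decoder only when they are missing (Node <= 10). It is this
// require() that drags a string_decoder polyfill into browser bundles.
let utf8Encode, utf8Decode

if (typeof TextEncoder !== 'undefined' && typeof TextDecoder !== 'undefined') {
  const encoder = new TextEncoder()
  const decoder = new TextDecoder('utf-8')
  utf8Encode = str => encoder.encode(str)
  utf8Decode = bytes => decoder.decode(bytes)
} else {
  const { StringDecoder } = require('string_decoder')
  utf8Encode = str => Buffer.from(str, 'utf8')
  utf8Decode = bytes => new StringDecoder('utf8').write(Buffer.from(bytes))
}
```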

ept commented 3 years ago

@skokenes I made PR #381 to drop support for Node 10 and remove the use of string_decoder.

scotttrinh commented 3 years ago

Could you clarify what "transpiled, unbundled" means? New language features converted to older equivalents, but keeping the original source file structure (i.e. not putting everything in one big file)? If so, I'm not sure what this buys us.

Transpiling allows us to not have to think about this next question:

What language features are we currently using that are in need of transpiling?

No idea! 😅 And that's not a sarcastic remark, it points at the underlying problems we're trying to address here:

  1. What are the environments that we support, precisely? (Not "modern browsers and newish node", but a specific set)
  2. What language features are used throughout this codebase and by any dependencies? (I think you would know best, but definitely hard to keep track!)

Another approach that has been discussed on Slack recently is to add linting with eslint-plugin-compat that fails whenever we use a language feature unsupported by the targets we specify. In essence, this enforces whatever minimum platform targets we claim to support. That feels like a good compromise: it doesn't require transforming the code at all, but lowers the burden of tracking language feature support while working on the code.

Could you also suggest a concrete configuration to use? I'm not sure how to translate these goals into a working config for webpack or whatever other tools are used.

Yeah, I think once we get a good direction on the goals, specifically what targets to support and whether or not we care about publishing a web-bundle to npm, I'm happy to help get a PR together that delivers on those goals using the existing tooling we have in the project (eslint, babel, webpack, etc).

pvh commented 3 years ago

Adding eslint-plugin-compat to make explicit what platforms we're targeting seems like a good idea. We've recently adopted eslint anyway, so it seems not too difficult.