Closed KnorpelSenf closed 7 months ago
Just my thoughts...
First point, types aren't runtime code; they reflect how runtime code behaves. If your implementation actually detects the shape of things at runtime, you are always supporting that code, even when running under Deno or Node, therefore your types should be a superset of that.
Actually swapping out runtime code is a different story. In both oak and acorn I have to swap out runtime code to run under Node.js, Deno and Bun and have created abstractions that are dynamically imported at runtime. Also there are certain polyfills that need to be loaded as well.
In these cases where I am dependent on types that are runtime specific, I have to reimplement the types in the code. You can see that pretty clearly in Bun on acorn.
Things like dnt used to make that easier, and it is a bit of pain at the moment to swap out code just via dynamic imports.
Also things like #179 are also friction in the current process.
If your implementation actually detects the shape of things at runtime
It doesn't. If you take a look at grammY (linked in the original post), it ships a different implementation of `InputFile` per runtime. There are essentially two different packages. If you install grammY from npm, you get the Node implementation of `InputFile`; if you use it from Deno/x, you get the Deno implementation.
grammY uses deno2node to achieve this, building the Node package by swapping out `*.deno.ts` imports for `*.node.ts` imports; but really the question here is, how do we do either of these with JSR?
With oak prior to JSR, I was using dnt, which does effectively what deno2node does, and there were two things that dnt did for oak that I had to rework to move to JSR.
While it isn't super low friction at the moment, I was able to accomplish both and ended up dropping dnt for now. The first was accomplished by simple feature detection, and the second via dynamic imports.
While the DX isn't as straightforward and easy as it could be, it is possible.
If your implementation actually detects the shape of things at runtime
There is no way we can detect at runtime whether the passed argument is of type `Deno.Reader` if the code is run by Node.js. (Unless we first shim `Deno.Reader` for Node.js, which would be pretty pointless.)
Swap out runtime modules
So your recommendation is to ship all code for all platforms and rely on platform detection to switch implementation and for types to be a superset of all platforms, is it?
the DX isn't as straightforward and easy as it could be
For the sake of discussion, what would you imagine the easier DX would look like? Perhaps that'd make a case for something JSR could implement.
the second was accomplished via dynamic imports.
Since I want to export the function, this would cause slow types, right?
I will make the obvious suggestion that nobody (including me) will like.
```jsonc
// jsr.json
{
  "name": "@luca/greet",
  "version": "1.0.0",
  "exports": {
    "deno": "./mod.ts",
    "node": "./mod.node.ts",
    "bun": "./mod.bun.ts"
  }
}
```
I don't like it because it restricts the package to having one entry-point. I have packages that would like to expose `@scope/pkg/file.ts` in addition to the primary entry-point, for example.
Yeah, I'm aware that this exact syntax would clash with entrypoints; it isn't really an option. Also, the promise of JSR is to remove the complexity around targeting multiple runtimes. Adding more complexity to the configuration torpedoes this a bit.
Swap out runtime modules
So your recommendation is to ship all code for all platforms and rely on platform detection to switch implementation and for types to be a superset of all platforms, is it?
At the moment, yes... as that is what works with oak. Yes, code that never gets loaded gets shipped as part of the package when installing via the npm ecosystem, but it won't get loaded unless the runtime needs it.
For the sake of discussion, what would you imagine the easier DX would look like? Perhaps that'd make a case for something JSR could implement.
My thoughts on how to improve the multi-platform DX in general are already in #179. I love the simplicity that dnt offered in making those two things easy: polyfilling and module substitution.
the second was accomplished via dynamic imports.
Since I want to export the function, this would cause slow types, right?
I haven't experimented with exporting dynamic imports. Having different API surfaces always feels like a problem. For example, `@std/path` has to swap out runtime implementations depending on the runtime OS, but you keep the API surface consistent.
While it might seem a bit counter-intuitive, having an implementation that supported both `fs.ReadStream` and `Deno.Reader` would ultimately be better, because, for example, on Deno it is quite trivial for someone to use an `fs.ReadStream`-like API. As there is an increasing number of runtimes out there with various levels of interoperability, assumptions about what is "actually" supported on a runtime can be wrong (and not age well).
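One way to type such a superset without importing either runtime's declarations is to describe the shapes structurally (a sketch under that assumption; these are not oak's real types):

```typescript
// Structural stand-in for Deno.Reader: anything with a compatible read().
interface ReaderLike {
  read(p: Uint8Array): Promise<number | null>;
}
// fs.ReadStream (and many other stream types) satisfy AsyncIterable.
type StreamLike = AsyncIterable<Uint8Array>;
type FileSource = ReaderLike | StreamLike;

// Narrow by capability at runtime, matching the superset type above,
// without ever referencing Deno or Node type declarations.
function isReaderLike(x: FileSource): x is ReaderLike {
  return typeof (x as ReaderLike).read === "function";
}
```

Because the checks are purely structural, the same compiled code and the same types work on any runtime that passes in a compatible object.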
Having different API surfaces always feels like a problem.
I agree. I am currently thinking that this would probably be the thing that I have to change about the library if I wanted to support JSR. We have already unified the API surface significantly in the past, and we might have to go another step now.
Perhaps the JSR take on the cross-runtime topic needs to be “one package, one codebase, many runtimes” but this still feels like a time bomb. I might be able to unify the builds for my packages, but sooner or later there will be people building stuff that can't do this. I hope I'm wrong, though.
I will make the obvious suggestion that nobody (including me) will like.
```jsonc
// jsr.json
{
  "name": "@luca/greet",
  "version": "1.0.0",
  "exports": {
    "deno": "./mod.ts",
    "node": "./mod.node.ts",
    "bun": "./mod.bun.ts"
  }
}
```
I know this is the current entrypoints feature, but something similar could be made for runtimes. JSR could understand where the package "starts" for each runtime and, in the same way it transpiles TS for Node, provide the correct main file for each runtime. That way maintainers wouldn't have to worry about detecting runtimes or swapping code; they could do something like:
```jsonc
// jsr.json
{
  "runtimes": {
    "deno": "main.ts",
    "bun": "main.bun.ts",
    "node": "main.node.ts"
  }
}
```
Or, yet:
```jsonc
// jsr.json
{
  "runtimes": {
    "deno": {
      "/": "main.ts",
      "/types": "types.ts"
    },
    "node": {
      "/": "main.node.ts",
      "/types": "types.node.ts"
    }
    // ...
  }
}
```
That would allow support for multiple entrypoints on multiple runtimes while keeping imports identical between runtimes, meaning end-users would also benefit from this.
And then deliver different builds based on the user agent header?
And then deliver different builds based on the user agent header?
Precisely.
You're all getting awfully close to conditional exports here. I don't want to introduce those to JSR - conditional exports are too complex for users, and are a worse way of doing conditional imports than just doing it in code using `if` statements.
My take on this:
- Use `if` statements or ternaries to load code conditionally.
- Bundlers are very smart and, if configured to do so, can elide code that is not relevant to a certain platform.

Disappointing. I imagine JSR already delivers different bundles for Deno and npm-like clients; a little more control over the process might have supported this use case much more nicely.
I will write a blog post on this one day, but "conditional exports" (or more recently "conditional imports") are, and always will be, a bad idea. You should express conditions in code. Conditions in metadata files essentially bring us back to the days of user-agent sniffing in browsers, where instead of checking if `fetch` exists globally and polyfilling it if it doesn't, we polyfill `fetch` (or `node:fs` or whatever) regardless of whether the runtime supports it. This is bad, and hinders progress in runtimes.
(Note that this is somewhat pseudo-code to make a point; here are the actual types for Node and Deno.)
In this very specific case, you could consider using web/JS standards only; though it would be a breaking change, I don't think it would be a big inconvenience in the API.
e.g. If you change your signature to
```ts
| string
| Blob
| Response
| URL
| URLLike
| Uint8Array
| ReadableStream<Uint8Array>
| Iterable<Uint8Array>
| AsyncIterable<Uint8Array>
```
It's already going to be broadly compatible with almost anything. It wouldn't accept `Deno.FsFile` nor Node's `fs.ReadStream` directly, but instead of passing in such a file handle `fh` directly, users can write `fh.readable` (Deno) or `fh.readableWebStream()` (Node 17+). (There is a subtle difference in that one stream will auto-close the file handle while the other won't, but I don't think that's a big deal in this case.)
Edit: Actually, it occurs to me that you can also accept the types `| { readable: ReadableStream } | { readableWebStream: () => ReadableStream }` and automatically call the stream converter at runtime. Then your API will accept `Deno.FsFile` and Node's `FileHandle` directly, with proper typing, but with no need for runtime-specific types. (Supporting Node's `stream.Readable` in a similar manner would be slightly trickier but also doable, though users can also just wrap it in `Readable.toWeb()`.)
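A sketch of that structural approach (type and function names are illustrative, not from any real package):

```typescript
// Accept web streams plus anything that structurally exposes one:
// Deno.FsFile has `.readable`, Node's FileHandle has `.readableWebStream()`.
type StreamSource =
  | ReadableStream<Uint8Array>
  | { readable: ReadableStream<Uint8Array> }
  | { readableWebStream: () => ReadableStream<Uint8Array> };

// Convert any accepted source to a plain web ReadableStream at runtime.
function toReadable(src: StreamSource): ReadableStream<Uint8Array> {
  if (src instanceof ReadableStream) return src;
  if ("readable" in src) return src.readable;
  return src.readableWebStream();
}
```

Since the checks are purely structural, neither Deno's nor Node's type declarations are needed at compile time or at runtime.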
There is no need to specify `ReadableStream` or `import("node:fs").ReadStream`, since they're `AsyncIterable`.
And if `URLLike` was changed to `{ href: string }`, it'd match `URL` instead of accidentally matching `Response`: https://github.com/grammyjs/grammY/pull/434
```ts
if ("Deno" in globalThis) {
  // @ts-ignore Deno-only
  const file = await Deno.open(path);
  return file.readable;
} else {
  const fs = await import("node:fs");
  return fs.createReadStream(path);
}
```
bring us back to the days of user agent sniffing in browsers
Actually, JSR delivers npm-like packages over npm.jsr.io, does it not? So there's no need for UA-sniffing; we'd just need a little control over what files are delivered for the different endpoints. I disagree that you should ship all code for all runtimes and have a dynamic switch at runtime. If you can know ahead of time that you will be shipping dead code, you can always choose to avoid this.
Bundlers are very smart
We don't want to start off shipping code designed for bundlers and crippled for native web imports. So essentially what you're saying by that is, "We completely discourage importing from JSR in the browser. Always have a build step."
bring us back to the days of user agent sniffing in browsers
Actually, JSR delivers npm-like packages over npm.jsr.io, does it not? So there's no need for UA-sniffing; we'd just need a little control over what files are delivered for the different endpoints. I disagree that you should ship all code for all runtimes and have a dynamic switch at runtime. If you can know ahead of time that you will be shipping dead code, you can always choose to avoid this.
I think the point is not everything that consumes packages over npm.jsr.io is the same runtime, and the analogy of browser UA sniffing is that it is folly to make assumptions about the runtime's capabilities, because any assumptions made today about what a runtime can do will invariably be invalid in the future.
Bundlers are very smart
We don't want to start off shipping code designed for bundlers and crippled for native web imports. So essentially what you're saying by that is, "We completely discourage importing from JSR in the browser. Always have a build step."
Again, the point is that dynamic imports work effectively, but if shipping all the possible code to the runtime is a concern, bundlers are smart enough to optimise that code for you. Importing 100+ modules into a browser is inefficient; even with the promise of HTTP/2, the reality is that you build for browsers not because of their capabilities but because of the need to be efficient over the wire, and that is unlikely to change. There are those who still want to tilt at those windmills, and yeah, loading directly from a package repository should still be supported.
I will make the obvious suggestion that nobody (including me) will like.
```jsonc
// jsr.json
{
  "name": "@luca/greet",
  "version": "1.0.0",
  "exports": {
    "deno": "./mod.ts",
    "node": "./mod.node.ts",
    "bun": "./mod.bun.ts"
  }
}
```
Yes, it is already the way a lot of packages do it: `/es` for ECMAScript module imports, `/deno` for Deno types, `/clients` for browsers...
Reading this discussion, it seems there is the general question of whether JSR packages should support platform-specific code in the first place. If yes, there should be some way to control it. If not, stuff like `Deno.open` or `node:fs` shouldn't exist on JSR.
Currently, being able to specify platform support for JSR (or being gamified into it by the score) strongly suggests that the first is what JSR wants.
Also, with `@ts-ignore` there is not much point in trying to detect such code, as people will find ways to work around such detections.
In order to be platform-independent, code that uses only ES specifications (like `Uint8Array` over the Node.js-specific `Buffer`) would be best, but not all things are easily doable that way. Thinking about the example of the `InputFile` constructed from a given path… either the user experience is worse, as `Deno.readFile(path)` needs to be specified all the time, or platform-specific adapters need to exist but cannot be put on JSR (as they are platform-specific). Then platform-independent code from JSR could be used with platform-specific adapters. Sounds doable but horrible.
But this problem is not even a problem for backend runtimes alone, as issues like #164 suggest…
I agree that inline conditions and dynamic imports can solve many simple cases, but there are other cases where you'd want to switch one implementation completely for another based on runtime or some other environment consideration, and in such a situation you don't want the baggage or risk of every possible implementation being dragged in.
Import maps provide a neat way to shift this problem entirely out of the code path. In an application you can abstract away your runtime/environment specific variants to a set of modules, and utilise the import map to remap to one of these variants. Then at runtime the unused variants are completely out of the picture.
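For illustration, a variant override of that kind could be as small as one remapped specifier (the specifier and file names here are hypothetical):

```jsonc
// import_map.deno.json — the Deno build variant
{
  "imports": {
    "#file-ops": "./src/file_ops.deno.ts"
  }
}
```

A Node variant would map the same `#file-ops` specifier to something like `./src/file_ops.node.ts`, so application code imports `#file-ops` everywhere and the unused variants never enter the module graph.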
JSR could potentially support publishing/building of package variants based on variant import maps, where this kind of isolation is more desirable than the inline condition or dynamic import.
There unfortunately is no official spec for import map composition though, but I think the simplest case of a common import map and a single variant override could suffice. The issue then would be how to distinguish between the variants during module resolution, I'm not going to suggest a solution to that here, but I don't think it's insurmountable.
I think without this kind of first-class support for variants, we'll find many packages working around it in various differing manners, maybe publishing separate packages (`@scope/foo-deno`, `@scope/foo-bun`, etc.) or providing a sub-package per runtime (`@scope/foo/deno`, etc.).
It would be great if JSR nailed this from the outset rather than compromising as an afterthought or leaving it up to the ecosystem to diverge on their own solutions.
I think this has been answered.
TL;DR: Publish a single code base, branch at runtime. Do not detect the platform, just check for the existence of the respective functions.
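In code, that capability check might look like this (a sketch; the function name and fallback choice are illustrative):

```typescript
// Branch on the existence of the function itself, not on platform identity.
function readTextFile(path: string): Promise<string> {
  const deno = (globalThis as Record<string, any>).Deno;
  if (typeof deno?.readTextFile === "function") {
    return deno.readTextFile(path);
  }
  // Any runtime with node:fs compatibility (Node, Bun) takes this branch.
  return import("node:fs/promises").then((fs) => fs.readFile(path, "utf8"));
}
```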
Thanks! Closing.
I need to resurrect this as I'm struggling to get conditional exports to work. My case is a library that adds `HTMLRewriter` to runtimes that do not support it.
Only Bun and Cloudflare Workers have `HTMLRewriter` built in; the rest need the implementation.
The naive approach, as mentioned above, is to check for the existence of `globalThis.HTMLRewriter` and shim it when it's not present.
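That naive approach would be something like the following sketch (the wiring is illustrative; `html-rewriter-wasm` is the package named below):

```typescript
// Naive feature-detection shim: only pull in the polyfill when the global
// is missing. The dynamic import is still seen by bundlers, which is the
// problem described next.
async function ensureHTMLRewriter(): Promise<void> {
  const g = globalThis as Record<string, unknown>;
  if (g.HTMLRewriter !== undefined) return; // built in on Bun / Workers
  // @ts-ignore — no type declarations assumed for the wasm shim
  const mod = await import("html-rewriter-wasm");
  g.HTMLRewriter = mod.HTMLRewriter;
}
```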
Unfortunately, when doing so, the shim code is still run by `wrangler`/Cloudflare Workers, failing the build, as the shim depends on `html-rewriter-wasm`, which does not prefix Node's built-in modules with `node:`.
I got it to work across runtimes with conditional exports defined in `package.json`, but `jsr.json` only lets me define one.
Context
I currently build my library in two different ways, once for Deno and once for Node. Since my library is able to work with files from various sources, the Node build should accept the Node-specific `Buffer` objects. It also still supports `Deno.Reader` (although this might be dropped for Deno 2).
For Deno, it makes no sense for me to support the old Node-specific things from the Node compat layer (I only want to support web standards and/or Deno-specific things). For Node, I cannot support the Deno-specific things.
In the end, I have a function that is something like this:
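(The original snippet did not survive here; a purely illustrative shape, based on the types discussed elsewhere in this thread and not the actual code, would be:)

```typescript
// Illustrative only: the shared, web-standard part of the input union.
// The Node build would additionally accept Buffer, and the Deno build
// Deno.Reader, which is exactly the typing conflict described above.
type SharedSource =
  | string
  | Uint8Array
  | ReadableStream<Uint8Array>
  | AsyncIterable<Uint8Array>;

class InputFileSketch {
  constructor(public readonly source: SharedSource) {}
}
```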
Question
I need a way to export two different type signatures, depending on the platform from which my library is consumed.
Is JSR for me? If yes, how can this be done?