Closed brainkim closed 3 years ago
For Deno, you don't need npm at all. Having the code located somewhere on a CDN that serves the `.ts` source directly is enough. An example of a rich library would be something like oak.
The biggest "problem" with dual support is that module specifiers have to end in the extension in Deno. Also, Node.js doesn't support TypeScript directly, and doesn't support ESM (without enabling experimental modules). So you would have to build a version that works frictionlessly in Node.js and have that be your distributable to npm. I'm personally not aware of anyone who has created a good build script for Deno `.ts` to CommonJS `.js`, but it should be fairly straightforward. The builder would need to provide a Compiler Host that resolves modules in line with Deno's module resolution and also strips out the `.ts` extensions on emit.
If you are using `Deno` namespace APIs, almost the reverse of #2644 will be needed: almost a "browserify" layer that translates Deno APIs to Node.js ones.
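I don't know of such a build script either, but the extension-stripping part of the emit step could look roughly like this untested sketch (the function name and regex are mine, not from any real tool); it only handles static `from "..."` specifiers, not dynamic `import()`:

```javascript
// Hypothetical emit step: strip ".ts" from relative import specifiers
// so Deno-style sources can be published for Node.js resolution.
function stripTsExtensions(source) {
  // Only touches static `from "./…"` / `from "../…"` specifiers.
  return source.replace(/(from\s+["'])(\.{1,2}\/[^"']+?)\.ts(["'])/g, "$1$2$3");
}

const input = [
  'import { join } from "./path/mod.ts";',
  'export * from "../util.ts";',
].join("\n");

console.log(stripTsExtensions(input));
// → import { join } from "./path/mod";
//   export * from "../util";
```

A real builder would do this on the TypeScript AST rather than with a regex, but the transformation itself is that small.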
Ahhh yeah, the big issue is the requirement of file extensions for Deno. I like your suggestion of `moduleResolution: "literal"` (`preserve`?) in the related TypeScript issue. If the TypeScript team is going to stick with following the Node module resolution algorithm, maybe Deno can offer a competitive alternative to `tsc` which can compile to Node-compatible code?
Regardless of how successful deno becomes, I see dual-supporting both node and deno as pretty much necessary for most library authors.
Related: https://github.com/Microsoft/TypeScript/issues/27481
@brainkim regarding module resolution, there is also microsoft/TypeScript#33437, which would be more of an ideal, but getting Node.js to support it... Though Node.js is moving towards extension-required modules under the experimental ES support, there would still be a need to change the extensions on the emitted code, and if TypeScript (`tsc`) touches it, they will get it right for some and wrong for a lot of others.
I think for the purpose of creating "isomorphic" Deno libs and CLIs it makes sense to implement a sort of Node.js polyfill for all of Deno's built-in APIs, then bundle your tool using `deno bundle` and after that prepend it with the `require` from the polyfill, like:
const { Deno, TextDecoder, TextEncoder /* , ... */ } = require('deno-polyfill');
/// deno generated bundle
Probably it also makes sense to convert the AMD modules in the original bundle to CommonJS.
allegedly this is what @YounGoat's deno package on npm is supposed to do... https://www.npmjs.com/package/deno
Okay, so `deno bundle` is a piece of the puzzle that I wasn't aware of. According to the website, `deno bundle` is a tool which compiles your Deno code to AMD (😬). Does this mean I can use TypeScript or webpack to compile to AMD as well and have Deno modules consume this?
it makes sense to implement a sort of Node.js polyfill for all of Deno's built-in APIs,
This is interesting, but I’m also interested in a simpler use-case where I use 0 deno builtins and 0 node builtins (pretty much vanilla typescript).
@brainkim Deno can consume a single JS file or ES modules if you need imports. It doesn’t know about AMD except as an output from the bundle command.
(The AMD bit is really throwing people off - we should modify deno bundle to use https://github.com/denoland/deno/blob/master/deno_typescript/amd_runtime.js to execute the main module.)
@ry we talked about it for Bundling v2 (#2475), but it does feel like embedding the runner directly in the bundle so it is a totally self contained script is warranted and knock a few features off the v2 list. That way we can just say "it generates a standalone script" and let people forget about the implementation.
I can start working on this.
The whole thing would be easier if `deno bundle` output a better format. I'm curious why AMD was chosen instead of UMD? Or at least it would be pretty trivial to let users output a different format through a flag?
@tunnckoCore again, people are obsessing about something they shouldn't obsess about. 😁
UMD specifically would be useless. That is, depending on the implementation, it supports 2 different module formats plus the ability to load as a global script, but we only need one module format, loaded into an environment that doesn't support CommonJS for good reason.
If you really want to know the reason though, it goes something like this:
`.js` file. So we were left with the scenarios of:
We opted for the last. The only value in the first two is to be able to say we weren't using AMD, which seems like a very odd thing to do.
AMD came into being for some good and valid reasons, and ESM doesn't solve all of them. There are very good reasons why ESM isn't concatenatable, and I respect that, but it does mean that in a lot of cases we will be stuck with some form of bundler in perpetuity (and likely AMD).
@kitsonk I would consider System.js as a bundle output as well. I think TypeScript supports it as an output (though last I checked the support is not great and System.js can be tricky to set up), it's more closely aligned with ES modules, and `System.register` calls can be robustly concatenated. It just feels more future-proof/standards-aligned IMO.
But most of this is unrelated to this issue, which is, given a pure typescript codebase without dependencies, how do I add support for deno consumers? The two approaches would be:
1. …maybe `babel` can help us here?
2. …a `d.ts` pragma to find the typings.

Also I need some guidance about what Deno as a compilation target looks like. Do I need to compile to AMD require modules? Can I use UMD or System.js? Should modules be concatenated/minified?

@brainkim, for the last week or two I've been thinking about the second too.
Write Node.js (js/ts), so no need for Deno vscode/typescript plugins. Output ESM and CJS formats (e.g. `dist/index.js` and `dist/esm/index.js`), and put the typings files in `dist/esm` (e.g. `dist/esm/index.d.ts`), so later when Deno users want to use your module they will use some CDN (preferably UNPKG), like https://unpkg.com/@scope/pkg?module. Not sure if Deno will pick up the separate `index.d.ts` file?
import koko from 'https://unpkg.com/@tunnckocore/kokoko?module';
koko('sasas', 200);
// should error, because both arguments must be `number`
I have two dummy modules that I'm playing with, @tunnckocore/kokoko and @tunnckocore/qwqwqw; check them out on unpkg.
The interesting thing is that UNPKG is helping us when using the `?module` query param. The only problem, I think (cc @mjackson), is that it strips the sourceMapping comment when using this query param. I'm not sure if this is a mistake or if it should be that way, but anyway.
Ha... I already wrote to that issue, haha.
Just thought about another idea that spawned from looking around the other "compat" issues and discussions. I was mainly triggered to look at them today because I just saw that denolib/node was archived with a note saying it's merged into Deno. So, first, what's that, what does that mean?
And second, the other possible idea:
/**
* ## Deno <-> Node.js Compat (a CLI)
*
* - Deno source, two build targets: nodejs (esm & cjs) & browser (esm)
* - For nodejs target:
* + ** rollup babel as very first plugin (with babelrc: false)
* to transform url (deno apis) to bare specifier (nodejs apis)
* + output cjs and esm rollup "format"s
* - For browsers target:
* + detect if URL specifier ends with `.ts`
* + then rollup url import/resolve plugin (as very first),
* + then rollup babel (as very first) transform (with babelrc: false)
* the typescript to javascript, then allow user-land (allow babelrc)
*
*/
** another variant of that step is to directly URL import/resolve from Deno and transform that TS to JS, BUT that way consumers of that Node.js code may have problems if there are any significant differences, bugs, or behaviors between what Deno gives, for example, for `path.join` and what Node gives for `path.join`.
Write mainly in Deno. And if you want to support both browsers and Node (and Deno, hence the source) you can use a small smart CLI that can be created, nothing fancy, just a thin translation layer using @rollup so we can output whatever format we want. It would probably be best written in Node.js and then bundled as a single-file binary.
I think the best course of action is just to convince typescript peeps to allow explicit file names.
Don't know if this is the best place to post this, but I wanted to share the pattern I came up with for running code on both Node and Deno. I append this to the end of an `.mjs` file, which I can run in Node 12+ (using `--experimental-modules`) or in Deno.
function main (script, args, exit, host, handler) {
if (args.length !== 1) {
console.log('Usage: '+script+' FILE');
exit(1);
}
const filename = args[0];
if (host === 'deno') {
const decoder = new TextDecoder('utf-8');
const source = decoder.decode(Deno.readFileSync(filename));
return handler(source);
} else {
return (async () => {
const fs = await import('fs');
const path = await import('path');
const source = fs.readFileSync(path.normalize(filename), 'utf8');
return handler(source);
})();
}
};
if (typeof process === 'object' && process.argv[0].endsWith('/node') && import.meta.url == 'file://' + process.argv[1])
main(process.argv[1], process.argv.slice(2), process.exit, 'node', MYFILEHANDLER);
else if (typeof Deno === 'object' && import.meta.main)
main(location && location.pathname, Deno.args, Deno.exit, 'deno', MYFILEHANDLER);
Not up to the needs of library authors, I know, but it's useful to have such a recipe for quick scripts that might be run on either host.
I expanded that recipe a bit, folding in the options parsing from `std/flags`, to be usable on both Node and Deno. The result is here.
I’ve finally made my first foray with Deno. The unpkg `?module` strategy almost works, except now I have a bunch of TypeScript errors showing up for ES5-transpiled code.
If I transpile to ESNext I worry that I will mess with dependents who don’t transpile their sources (for things like async generators). And ideally I would be able to reference my modules locally for local development, but this second library I’m working on has a single dependency (also owned by me). Anyone have any updates on best practices?
Coming up with a concrete story for dual publishing to npm and deno would get me to switch. unpkg might work for a while but it would suck to have to publish for local development.
@brainkim take a look at your package on Pika.dev: https://www.pika.dev/npm/@bikeshaving/crank
Try loading it from there... It "should" serve up the `.d.ts` files to Deno using the `X-TypeScript-Types` header, which Deno will then use when type-checking the code, instead of trying to type-check the JavaScript. If it doesn't work, I suspect they would very much like to work with you to get it working.
@kitsonk Tried out the link and got a 404. The module which pika points to doesn’t seem to exist. If you’re saying pika can allow me to dual publish to npm and deno then I’ll check it out and see if I can get it to work. I was initially turned off by references to some all-in-one editing solution but I’m willing to give it a shot.
Figuring out local development might still be a pain though.
Go to that page, there is a different URL listed on the page. Pika CDN is a good CDN for both browsers and Deno for packages that contain ESM modules and types.
Hopefully @axetroy can help with the development environment end for Vscode. The type information merging is a relatively new feature and I don't know if his plugin understands it well. The information is in the Deno cache, so the plugin could support it.
https://cdn.pika.dev/@bikeshaving/crank@^0.1.0-beta.4 responds with a file like this:
/*
Pika CDN - @bikeshaving/crank@0.1.0-beta.5
https://www.pika.dev/npm/@bikeshaving/crank
How it works:
1. Import this package to your site using this file/URL (see examples).
2. Your web browser will fetch the browser-optimized code from the export statements below.
3. Don't directly import the export URLs below: they're optimized to your browser specifically and may break other users.
Examples:
- import {Component, render} from 'https://cdn.pika.dev/preact@^10.0.0';
- import {Component, render} from 'https://cdn.pika.dev/preact@10.0.2';
- import {Component, render} from 'https://cdn.pika.dev/preact@next';
- import {Component, render} from 'https://cdn.pika.dev/preact';
- const {Component, render} = await import('https://cdn.pika.dev/preact@^10.0.0');
Learn more: https://www.pika.dev/cdn
*/
export * from '/-/@bikeshaving/crank@v0.1.0-beta.5/dist=es2019/crank.js';
export {default} from '/-/@bikeshaving/crank@v0.1.0-beta.5/dist=es2019/crank.js';
https://cdn.pika.dev/-/@bikeshaving/crank@v0.1.0-beta.5/dist=es2019/crank.js responds with a 404 with the following message:
Not Found: This package doesn't match hash "beta.5".
This probably has something to do with the dash or something.
This is a maze with lots of possible paths, and I’m not sure how pika helps over something like unpkg, but I’m happy for any assistance. Like I said, creating a concrete story for devs who want to dual publish is of paramount importance. Especially if you want to pick up authors who author “universal” modules, deno offers so much in its reproduction of standard DOM apis.
Try without the caret: https://cdn.pika.dev/@bikeshaving/crank@0.1.0-beta.4
Same thing: https://cdn.pika.dev/-/@bikeshaving/crank@v0.1.0-beta.5/dist=es2019/crank.js
The problem probably is the dash is being parsed weirdly by pika.
For what it’s worth, Deno finding types via an `X-TypeScript-Types` header seems no better to me than `package.json` `types` fields or `d.ts` sibling files. I want module resolution to be obvious, and having to inspect responses for a header seems worse (to me).
@brainkim
What strange module is this?
export * from '/-/@bikeshaving/crank@v0.1.0-beta.4/dist=es2019/crank.js';
I want module resolution to be obvious, and having to inspect responses for a header seems worse (to me).
You don't, Deno does it automatically. The header is an option for a CDN to be able to provide it to Deno without modifying code. You can be explicit too... it is covered in the documentation.
@axetroy it’s code generated by pika.
@brainkim Yeah. I have solved the problem in `vscode-deno`.
Really well-phrased question in the original post. It's sad that such a banality is preventing nice interop, and even more sad that even if the library author wants to go the extra mile, there is just no way to do it.
My wish for a solution would be that:
Since I have this problem with a couple of libraries that I would like to use in Deno, I threw together today:
Basically it's just a proxy that replaces `import './foo'` with `import './foo.ts'` (or `.js`). I believe www.pika.dev is doing similar things, but way more advanced. This is my code.
It's surprising how many things "just work", e.g.:
import * as buntis from 'https://stupid-extensions.com/raw.githubusercontent.com/buntis/buntis/master/src/buntis.ts';
import * as acorn from 'https://stupid-extensions.com/raw.githubusercontent.com/acornjs/acorn/master/acorn/src/index.js'
import * as moment from 'https://stupid-extensions.com/raw.githubusercontent.com/moment/moment/develop/src/moment.js';
import * as runtypes from 'https://stupid-extensions.com/raw.githubusercontent.com/jakajancar/runtypes/master/src/index.ts'
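The core rewrite such a proxy needs is small. The following is my guess at the idea (not the service's actual code): naively append an extension to extensionless relative specifiers before serving the file:

```javascript
// Guess at the proxy's core rewrite: append ".ts" (or ".js") to
// extensionless relative specifiers so Deno accepts them.
function addExtensions(source, ext = ".ts") {
  return source.replace(
    /(from\s+["'])(\.{1,2}\/[^"']+?)(["'])/g,
    (match, pre, spec, post) =>
      // leave specifiers that already end in an extension alone
      /\.[a-z]+$/.test(spec) ? match : pre + spec + ext + post
  );
}

console.log(addExtensions('import { foo } from "./foo";'));
// → import { foo } from "./foo.ts";
```

A production version would need to check which file actually exists upstream (`.ts` vs `.js` vs `/index.ts`), which is presumably where most of the proxy's real logic lives.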
@jakajancar This is cool! The one issue I have with these solutions is that they aren’t amenable to local Deno development. In other words, you have to publish online to consume these local libraries, implying a Node-first development workflow.
With GitHub’s purchase of npm I must admit my interest in Deno has waned somewhat, but I’m still eager to see what people come up with. I’m still holding out for the possibility that TypeScript files can be referenced locally from both Node and Deno, but this might require some magic on the Node side.
It's sad that such a banality is preventing nice interop
I take offense to this comment. It is a complex issue, but we have delivered lots of features over the past few months that help support this. If we simply threw up our hands and cloned Node.js, that would not help advance anything. Trying to keep true to the architecture and principles of Deno, while making it easy to consume in multiple runtimes, is a hard chore.
I believe www.pika.dev is doing similar things, but way more advanced.
Pika.dev/cdn uses Rollup to generate single file ES Modules that are minified. Totally different strategy.
@kitsonk No need to take offense, it was not directed at Deno (or any other) team. Just seems unfortunate that there are projects that have dozens of files and many thousands of lines, and all that prevents them from being used are extensions 😖 I love the philosophy and simplicity of Deno's imports and am happy that it interoperates with the browser (which is more important target than Node.js).
I think my situation fits this issue as well. I maintain the `yaml` package on npm, and would like to make it available in Deno as well. The source code is written as ES modules, uses `.js` for all imports, and has no external dependencies. Basic testing would indicate that the sources work as-is in Deno.
For backwards compatibility, the files I publish on npmjs.org are bundled and transpiled for use in Node.js and browser bundlers. My test suite uses Jest, and does include browser tests as well. I provide manually crafted d.ts files for TypeScript users.
It would be useful to have a documentation page somewhere targeted at a person like me, with some of the following instructions:
1. …`yaml`, `yaml/types`, and a couple of others in Node.js. In the sources, I'm using a named export from `src/index.js` while the public Node.js endpoint uses a default export. What's the most Deno way of resolving this difference?
2. …`"exports"` and `"type": "commonjs"` in my package.json for Node.js support. Does that or will that matter at all for my Deno users?

I've been thinking about this a lot as well.
Similar to @brainkim's original post, I too would like to author a library as Deno and be able to use it with Node.
I've been working on a CLI tool for downloading import-urls in Node. My implementation isn't lined up with Deno's and supports node modules and a whole bunch of other Node-specific stuff, like supporting no extensions (a real catch-all), but I think the key might be a Node post-install hook that downloads the Deno modules if you publish on npm.
I personally really want to do `npx ts-node-run <url>` and have an equivalent for `deno run <url>` in Node. Ideally I don't want to have to use or publish npm modules for Node.
But I'm starting to think I jumped over a step. The main issues are the `.ts` import extensions and, more minimally, top-level await. We need something that handles these Deno features. I'm interested in creating a set of `babel` or `ttypescript`(?) plugins that can accomplish this if there's really no other way to do it for Deno.
Also, people have been using this in tsconfig to mimic Deno in Node, but it doesn't handle ports correctly, so we need a proper plugin to transform URLs to match Deno as well:
"paths": {
"http://*": ["../../.deno/deps/http/*"],
"https://*": ["../../.deno/deps/https/*"]
},
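A proper mapping would have to parse the URL rather than glob it. Here's a hedged sketch of what "handling ports" could look like, assuming the cache folds the port into the host directory name as `host_PORT<port>` (worth verifying against your local `$DENO_DIR` layout):

```javascript
// Sketch: map a remote specifier onto a Deno-style on-disk cache path.
// Assumption: ports are encoded into the host directory, e.g.
// "localhost:4545" → "localhost_PORT4545", which tsconfig "paths"
// globs cannot express.
function toCachePath(specifier, denoDir = ".deno") {
  const url = new URL(specifier);
  const host = url.port ? `${url.hostname}_PORT${url.port}` : url.hostname;
  const scheme = url.protocol.replace(":", ""); // "https:" → "https"
  return `${denoDir}/deps/${scheme}/${host}${url.pathname}`;
}

console.log(toCachePath("http://localhost:4545/std/http/server.ts"));
// → .deno/deps/http/localhost_PORT4545/std/http/server.ts
```

A babel or ttypescript plugin could call something like this per specifier instead of relying on the lossy `"paths"` globs above.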
I wrote this earlier today on stackoverflow.
I too just looked at `deno bundle` and had some hope. I was also looking for a way to export `.d.ts` files from Deno and didn't see anything. I know nothing about how Deno or V8 works, but I'd expect Deno has to convert the TS into JS somehow? Obviously it would be wonderful for a `deno <magic-to-node>` command to convert Deno code into npm packages for us.
We basically need a way to convert `deno-ts` into `node-compat-ts` and rely on the Deno vscode plugin. I don't believe it's Deno's responsibility, and it's not gonna happen without all the separate pieces, including the Deno global Node shim.
Anyone feel free to msg me directly if you want to chat more and brainstorm.
Update: I created a repo here for some deno-compat testing ideas: https://github.com/reggi/deno-compat. I just put together a babel plugin for removing or adding extensions.
Hello,
We are in a very similar situation as described in the original post.
What we discovered is that you can somewhat use babel in Deno: `transformFile` and `transformFileSync` give errors about browsers because the `deno` types don't match `node`, but `transform` works.
What we ended up doing was programmatically using `Deno.bundle()` to make the AMD output and passing that as our source to `babel.transform`:
import babelCore from "https://dev.jspm.io/@babel/core";
import presetEnv from "https://dev.jspm.io/@babel/preset-env";
const [, source] = await Deno.bundle("./mod.ts");
const { code } = babelCore.transformSync(source, {
filename: "dist/index.js",
presets: [presetEnv],
babelrc: false,
configFile: false,
});
console.log(code);
then we can redirect that like we would with `deno bundle mod.ts` anyway. We then created a test project and imported the above output as a JS file:
// @ts-ignore
import { Serializable, SerializeProperty } from "./ts_serialize/index.js";
class Test extends Serializable {
@SerializeProperty()
testName = "toJson";
}
// @ts-ignore
console.log(new Test().toJson());
// @ts-ignore
console.log(new Test().fromJson(`{"testName":"fromJson"}`));
set up a `tsconfig.json` for the test project:
{
"compilerOptions": {
"target": "es5",
"module": "commonjs",
"experimentalDecorators": true
}
}
then ran `tsc` against that project and `node test.js` on that output.
We're still figuring this whole process out. We have to support types in the Node environment to remove all the `// @ts-ignore` comments in the test project, which we might have to do by hand. That said, we're investigating generating types from the output of `deno doc --json ./mod.ts`.
Our project is hosted here: https://github.com/GameBridgeAI/ts_serialize and the WIP branch `node_deploy` is here: https://github.com/shardyMBAI/ts_serialize/tree/node_deploys
Okay, another thing, if we didn't mention it already.
I think that if we have a repo with `src/` that uses URLs (jspm, unpkg, jsDelivr, pika, skypack) for external deps and compiles/bundles for npm with Rollup/Webpack + `import-http` by @egoist, then we will be able to use the raw untouched js/ts source directly with Deno (pointing to the GitHub repo or some service), and at the same time we will be able to use it in Node through npm/unpkg.
Oh wait, yeah, the different globals and APIs. But it will work for small libraries. The thing I'm bouncing around constantly is: should we concentrate on writing Deno and compiling to Node, or the opposite :laughing: Which seems easier? Hm :thinking:
I don't know if it's helpful, but I just started playing around with an approach yesterday where I can write Deno and bundle it with polyfills for a portion of the Deno runtime for quickjs. The resulting bundle works with either runtime.
Early experiments at jkriss/quickdeno.
I've created make-deno-edition to make npm packages written in TypeScript compatible with Deno. A working usage proof of this (with badges) is here: https://repl.it/@balupton/badges-deno. It has been used so far to make 32 Node packages compatible with Deno. You can use the project to automatically generate the readme instructions for the Deno edition, and you can use boundation to automatically scaffold your projects to automate the entire process. start-of-week is an example where different entries are used for Node, Deno, and web browsers.
Here are some techniques I’ve discovered for node -> deno publishing while trying to make my frontend framework deno compatible.
Method 1: `@ts-ignore` the import module specifier

import {
Children,
Context,
Element as CrankElement,
ElementValue,
Portal,
Renderer,
// @ts-ignore: explicit ts
} from "./crank.ts";
You can add a `@ts-ignore` directive on the line directly above the module specifier. This technique works surprisingly well, and if you do this, your source TypeScript files will be directly importable from Deno. It messes up `d.ts` files, but I have another hack later on to address this.
Method 2: `@deno-types` comments and triple-slash directives
// @deno-types="https://unpkg.com/@bikeshaving/crank@0.3.0/index.d.ts"
import {createElement} from "https://unpkg.com/@bikeshaving/crank@0.3.0/index.js";
// @deno-types="https://unpkg.com/@bikeshaving/crank@0.3.0/html.d.ts"
import {renderer} from "https://unpkg.com/@bikeshaving/crank@0.3.0/html.js";
You can import modules without cooperation from the package author by importing the JS equivalents and referencing the `d.ts` types using a `@deno-types` directive. Many complex libraries have `d.ts` files which do not actually work with Deno, so your mileage may vary.
If you are the package author, you can use the following rollup plugin snippet, which uses the package "magic-string", to prepend a triple-slash reference (`/// <reference types="index.d.ts" />`) to your build artifacts.
rollup.config.js
import MagicString from "magic-string";
/**
* A hack to add triple-slash references to sibling d.ts files for deno.
*/
function dts() {
return {
name: "dts",
renderChunk(code, info) {
if (info.isEntry) {
const dts = "./" + info.fileName.replace(/js$/, "d.ts");
const ms = new MagicString(code);
ms.prepend(`/// <reference types="${dts}" />\n`);
code = ms.toString();
const map = ms.generateMap({hires: true});
return {code, map};
}
return code;
},
};
}
export default {
/* other rollup options */
plugins: [/* plugins */, dts()],
};
This allows Deno consumers to import your rollup build artifacts and have them be typed, without any extra effort on their part. This again assumes the d.ts files work out of box, which is not guaranteed.
Between `@ts-ignore`-ing module specifiers and this method, I’m not sure which technique is better. The former technique allows you to use your TS source files directly, but you might want to funnel both Deno and Node users to the same build output for consistency, especially if you have source transforms or other compilation magic going on in your codebase. Additionally, you have to add `@ts-ignore` directives to every import in your source code, which can be a hassle.
The latter technique of relying on rollup or similar is nice, but it precludes serving your modules from raw.githubusercontent.com or deno.land/x because we typically don’t check in build artifacts into source control. I found that this technique works okay when using unpkg, although I am getting weird type errors when trying to use semver redirects.
Method 1 causes TypeScript to produce `d.ts` files with module specifiers that have file extensions (`export * from "./crank.ts"`), which will throw off TypeScript when used from Node. Method 2 will produce `d.ts` files with bare module specifiers (`export * from "./crank"`), which will work depending on server configuration. For instance, unpkg will redirect that bare specifier to the JS file, which according to method 2 should have a triple-slash reference, so everything will work out fine. However, if you reference the build artifact locally, you will get errors from Deno about the missing file extension.
Method 3: rewriting `d.ts` module specifiers

You can use the TypeScript transform `ts-transform-import-path-rewrite` to rewrite module specifiers in `d.ts` files. The TypeScript custom transformer API is basically undocumented and works differently according to the build tools you use, but with a bit of trial and error I got the above package working with `rollup-plugin-typescript2` as follows:
rollup.config.js
import typescript from "rollup-plugin-typescript2";
import {transform} from "ts-transform-import-path-rewrite";
/**
* A hack to rewrite import paths in d.ts files for deno.
*/
function transformer() {
const rewritePath = transform({
rewrite(importPath) {
// if you use method 1, you need to replace the `.ts` with `.js`
return importPath + ".js";
},
});
return {afterDeclarations: [rewritePath]};
}
export default {
/* config options */
plugins: [typescript({transformers: [transformer]}), /* other plugins */],
};
You can refer to the full `rollup.config.js` file to see what I did in context, which is a combination of methods 2 and 3. I might add method 1 later, except I found that unpkg works okay. Like I mentioned, I’ve discovered weird errors when unpkg redirects URLs for semver reasons, but then again deno.land doesn’t seem to have semver features yet, so I don’t really care.
Needless to say, anyone who is even partly responsible for this omnishambles should feel bad. I will not be turning these techniques into a library/module because I don’t want to maintain it, and in an ideal world, the TypeScript team will stop stonewalling on explicit file extensions in module specifiers, and all these techniques will be obsoleted. In that respect, I want to say good job to all the people writing constant “Any progress on this” comments in the related TypeScript issues and pinging the TypeScript maintainers directly with rambling, bad-faith arguments. With persistence we shall win.
In the meantime, these techniques seem to work for smaller libraries, and you should feel free to copy-paste the code in my rollup config into your own libraries.
@tunnckoCore
The thing I'm bouncing around constantly is, should we concentrate on writing Deno and compile to Node or the opposite 😆 Which seems easier? Hm 🤔
I think writing for node and compiling to deno is unfortunately much easier than writing for deno and compiling for node as of today (July 30th). The big difference is that node tooling is just much further advanced (bundlers, testing). I think at the end of the day, a tool which turns deno modules into real packages would be great, but I don’t think the deno ecosystem is there yet for dual publishers.
My solution: YouTube - Integrating Deno and Node.js
I just came across this: https://github.com/garronej/denoify
It’s a tool to author for Node and convert to Deno.
I made another tool, inspired by denoify (which didn't meet my requirements): https://github.com/marcushultman/denoc
It's smaller (a single dependency) and more explicit, with a simpler setup.
I'm still very interested in authoring in Deno and a way to convert it to Node (deno -> node). ❤️❤️❤️
Maybe seeing how https://github.com/ebebbington/context-finder is setup might help
@reggi :wave:
Hello, I have written a simple project in Deno called `ts_serialize`. It is all vanilla TypeScript with no dependencies (from either Deno or anything else), and in our CI I generate a build for npm (an npm package) and run some tests on it.
We manually keep a `.d.ts` file that we copy over for types, but everything else is automated.
These are the files I need to create the npm build.
First, we use `babel_ts_serialize.ts`, which uses `deno bundle` to make the AMD file, then we pass that into `babel.transformSync`.
The build file contains the steps to make the folder, and the publishing happens in the release CI.
To test the Node package before deploying, we set up an example Node project that uses `*` (the latest) in its package file, then we just `npm link` the recent build and run the standard test command `npm test`.
`create_npm_package_file.sh` does just that, but takes input for the version number in the package file.
The release is triggered by pushing a new tag that starts with `v*`.
Came across esm.sh, a CDN that polyfills Node core packages with the Deno Node compatibility layer and converts packages to ESM.
Looking through yargs, it seems they take a simpler approach without converters like denoify. It seems like they are writing TypeScript using the appropriate `tsconfig.json` options to use ES modules with extensions. They use TS to build the native browser ESM version, Rollup to generate the CJS build from that, and for Deno they just have a `mod.ts` or something to import stuff from the source code. For Deno/Node-native stuff they use a pattern of injecting shims into the core.
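That shim-injection pattern can be illustrated with a tiny sketch (names here are illustrative, not yargs's actual internals): the library core depends only on an injected interface, and each runtime's entry module supplies its own implementation:

```javascript
// Illustrative shim-injection sketch: the core only knows an interface;
// each runtime entry point injects its own shims.
function createApp(shims) {
  return {
    greet: (name) => shims.stdout(`hello ${name}\n`),
  };
}

// Node entry point; a Deno entry would pass Deno-based shims instead
// (e.g. writing via Deno.stdout).
const nodeShims = {
  stdout: (text) => process.stdout.write(text),
};

createApp(nodeShims).greet("world"); // prints "hello world"
```

The nice property is that the core never touches a runtime global directly, so no converter or transform step is needed at all.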
Has anyone tried to do this with WASI?
The code to initialize the WASI context is slightly different between Node and Deno; otherwise it should be fine, as we're fairly compliant.
Let’s say that I have a library which is published on npm. If that library
what is the best way to add support for deno? Is there a good example repository which does this?
Additionally, if I have a library which depends only on modules which dual-support both Node and Deno, does this story change?
Related: https://github.com/denoland/deno/issues/2644