Closed lucacasonato closed 1 year ago
If it helps: as a module maintainer, a use case I've had for dynamic imports in NPM packages is to load modules only when needed (e.g. depending on options passed in by whatever is consuming the module) - i.e. to load things regarded as "optional peer dependencies" at runtime, such as drivers for a specific database. Not supporting dynamic imports causes problems for applications that use the relatively small number of modules that depend on this feature.
It turns out that lack of dynamic import support is also a problem for some other build tools and in certain runtime environments (e.g. serverless environments, particularly where some sort of custom bundling is done as part of a build or deployment step).
Some module users ended up reporting issues with the dependency on dynamic imports - so much so that I looked around for other options and found require_optional, a little-known package used by the mongodb driver. IIRC it comes with its own issues though, as it assumes package.json exists (and, I seem to recall, doesn't currently behave nicely if it doesn't - and on some serverless platforms, the bundling process meant one wasn't distributed with the code by default). So it provides an ugly solution of sorts, at least for module maintainers.
Outside of modules I'm not sure of a good use case that absolutely needs dynamic imports - as there are other ways to do conditional includes at build time (e.g. for database drivers or i18n files) - but as a library maintainer I don't think there is another way around the problem of having a range of optional peer dependencies, at least not short of publishing multiple modules (or at least multiple entrypoints in a module).
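A rough sketch of that optional-peer-dependency pattern might look like the following (the function name and error wording are illustrative, not taken from any real package; the specifier is only known at runtime, which is exactly what makes this hard for static bundling):

```javascript
// Hypothetical sketch: load an optional peer dependency (e.g. a database
// driver) only when the caller actually selects it.
async function loadOptionalDriver(specifier) {
  try {
    const mod = await import(specifier);
    // Prefer a default export if the driver provides one.
    return mod.default ?? mod;
  } catch (err) {
    throw new Error(
      `Optional peer dependency "${specifier}" could not be loaded - ` +
        `did you install it? (${err.message})`,
    );
  }
}
```

For trying this out without installing anything, a `data:` URL works as a stand-in specifier, e.g. `loadOptionalDriver("data:text/javascript,export default { kind: 'driver' }")`.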
Outside of modules, in the use cases I've had, converting them to standard imports would work (if that is possible?), but I'm not sure if hoisting them would have side effects in some packages, because I don't know offhand what the behavioural differences between `import` and `require` are (e.g. if dynamic `import` statements cache exactly like `require`, presumably doing that would be fine - unless they were doing something shady like looking for env vars that are not set until some point at runtime?).
In those worst cases - where a module contains a dynamic import and it is converted into a static import by a build process - including all the dynamically imported modules in the build is a possible workaround, although this can of course really bloat the build size if there are a lot of dependencies (to the point where the build exceeds the maximum allowed build size on some runtime platforms).
As a practical example, I actually had some users of a library I maintain do this and just bundle the mongodb lib with their application - even though they were not using a Mongo database - because their build process did not understand how to handle a dynamic import and treated it like a static import statement. Obviously not ideal, but it worked.
Outside of modules I'm not sure of a good use case that absolutely needs dynamic imports - as there are other ways to do conditional includes
I might have a use case, but I'm not entirely sure that there aren't alternatives. I'm currently developing a CLI application that can be extended with plugins: instead of using a JSON file that contains the names of the plugins to load, you can just `import` a plugin from a URL, or write your own plugin or debugging code inline with the config - very much like webpack and rollup do with their config files.
Besides that, dynamic imports are part of the ECMAScript spec (standardized in ES2020), and therefore IMHO they should be supported regardless of the possible use cases.
@lucacasonato We could support dynamic imports with Base64 modules
Yeah, I agree with @andreespirela - it may be better to just support base64 modules, so that in the example of @danielr1996 you could instead use file system APIs to read the file, convert it to base64, and then dynamically import it.
The problem with data URL modules at the moment is that we decode them and write them to the Deno cache, so it isn't as if we can just twiddle a couple of knobs and have them work. The thing with compile is that the JavaScript bundle is "all" the code needed, with no external references outside of that JavaScript module, which then allows us to load that single module into the runtime. All the code that does dependency analysis, and the ability to convert and cache resources, is stripped out of the "thin binary" that compile uses.
Since a dynamic import specifier can be assembled from arbitrary strings, it is difficult to perform static analysis on it. So we should think about how to solve this from another angle.
I think this could be resolved with https://github.com/tc39/proposal-import-assertions
// Pack the TypeScript file into the binary as a plain string
import code from "./foo.ts" assert { type: "plaintext" };
// Dynamically execute the code
await import(`data:text/javascript;base64,${btoa(code)}`);
EDIT: The only thing you may need to pay attention to is a memory leak: after repeated execution, the memory is not released.
I think this could be resolved with https://github.com/tc39/proposal-import-assertions
// Pack the TypeScript file into the binary as a plain string
import code from "./foo.ts" assert { type: "plaintext" };
// Dynamically execute the code
await import(`data:text/javascript;base64,${btoa(code)}`);
We already came to terms with the fact that dynamic imports aren't meant for reliable static analysis, ever since we dropped the best-effort behaviour that existed with tsc's bundler. IMO it's much better to leave it at that and never attempt any static resolution of dynamic imports. So in compiled executables, dynamic imports = runtime module loading.
@nayeemrmn Are import assertions already in V8? If so, are they coming in Deno 1.8? They're at stage 3 - that means they're being worked on for the V8 implementation, right?
@andreespirela Yeah, I believe import assertions are coming in Deno 1.8, because right now they are parsed but ignored.
Maybe this should be marked with the "compile" label?
I'd love to be able to use `denon` as a stand-alone utility, or another `deno`-like variant (with full `deno` capabilities), as a cross-platform task runner or specialized script runner (e.g. similar to `perl -ne ...`). Being able to produce such completely stand-alone executables would be an extremely attractive quality for devs.
A file-system router needs dynamic import. For example:
const fn = (
await import(`./pages/${path}/${method.toLowerCase()}.ts`).catch(() => {
return { default: fn404 };
})
).default;
return fn(reqBody, reqHeaders);
Any update on this? Our team needs dynamic import in a Deno binary.
Any update on this? Our team needs dynamic import in a Deno binary.
No update. What is your use case?
@bartlomieju
We are using `deno compile` to package a code-generator as a binary. The code-generator also allows for customization - users can add their own code to the base generated code.
The code-generator is designed to work over two passes.
The second pass dynamically imports the user code and further generates boilerplate code (SDK/API endpoints, documentation, etc.).
This is not currently possible without the dynamic import capability in the deno compile generated binary.
A second use-case is to use the binary as a domain-specific REPL, over the functionality of the generated+customized code. Basically, load the generated module, and allow for convenience commands over a REPL (auto-completion and more).
In both these cases, we'd be happy to take responsibility for correctness of the imported code - idea being to run type-checker/transpiler or similar on the imported code to ensure correctness before attempting the dynamic import. That is specifically not a concern for us.
A resolution would be much appreciated.
I built a sandbox module, and also encountered this problem.
I just wanted to add that we have a use-case too.
We have a CLI utility we're using that generates configuration files in TypeScript and then loads these files at runtime. It looks similar to https://github.com/varHarrie/simple_sandbox
The utility uses `deno compile` to create an executable. The error emitted is `TypeError: Module not found`.
Another use case for a configuration generator here: reimplementing jkcfg on top of Deno for built-in TypeScript type checking support.
To add a use case: I'm building a compiler in Deno that outputs JavaScript. I want to ship the toolchain as a standalone executable, but it needs to be able to run the JavaScript code that it generates. One example is running tests; another is that project configuration is defined in the language being compiled, so the compiler needs to import it from the JS file where the config lives. I would really like to avoid requiring the user to install a separate JavaScript runtime for such features.
In case it matters, I don't need type checking or any other Deno features to be performed on the code for this use case. I just need to be able to access the exports from my code.
Closing thought: I see that part of the problem is that the Deno binary has been pared down when doing a compile, so that the resulting bundles are smaller. This makes sense as a default, but maybe we could have a flag that retains the bare minimum functionality needed to enable dynamic imports? I for one would be totally happy to accept a thicker bundle for my use case.
I think this is essential; browsers already support dynamic imports.
I'm having an issue with `deno compile` and dynamic imports. My Clojure interpreter bebo, which runs on Deno, uses dynamic import.
Example:
foo.js:
import { runScript } from 'https://cdn.jsdelivr.net/npm/bebo@0.0.6/lib/bebo_core.js'
// import "https://deno.land/std@0.146.0/http/server.ts"
await runScript(Deno.args[0]);
foo.cljs:
(ns foo
(:require
["https://deno.land/std@0.146.0/http/server.ts" :as server]))
(def port 8080)
(defn handler [req]
(let [agent (-> req .-headers (.get "user-agent"))
body (str "Your user agent is: " (or agent
"Unknown"))]
(new js/Response body #js {:status 200})))
(server/serve handler #js {:port port})
Compile:
deno compile --allow-all -o /tmp/foo /tmp/foo.js /tmp/foo.cljs
Run:
$ /tmp/foo
error: Uncaught (in promise) TypeError: Module not found
await runScript(Deno.args[0]);
^
at async file:///tmp/foo.js:5:1
When I uncomment:
// import "https://deno.land/std@0.146.0/http/server.ts"
in `foo.js`, then the compiled bundle works.
So perhaps it would be useful to have the compile option support a list of imports that should be preserved, or maybe there is already such a feature?
Adding our use case: a CLI invoked as `graphgen inspect path/to/your/custom/setup.ts`
The problem with data URL modules at the moment is that we decode them and write them to the Deno cache, so it isn't as if we can just twiddle a couple of knobs and have them work. The thing with compile is that the JavaScript bundle is "all" the code needed, with no external references outside of that JavaScript module, which then allows us to load that single module into the runtime. All the code that does dependency analysis, and the ability to convert and cache resources, is stripped out of the "thin binary" that compile uses.
@kitsonk What is the advantage of having a "thin" binary?
It would be good if this gap was documented explicitly in these locations to avoid more people ending up needing to +1 this ticket...
For the use case of implementing plugins for a CLI, would it make sense to be able to exercise some control over the import map used in the dynamically imported module? If the plugin wanted to use its own import map, or you wanted to isolate it completely from the imports of the main executable, this would feel desirable. For that matter, allowing plugins some control over how they are executed via a configuration would be nice as well, e.g.:
const moduleEnv = createModuleEnv({
importMap: '/path/to/plugin/import-map.json',
configurationFile: '/path/to/plugin/deno.json',
allow: ['env', 'read'],
})
const module = moduleEnv(() => import('/path/to/plugin/mod.ts'));
For those coming here looking for a workaround, one option is to dynamically load plugins as CommonJS packages using `createRequire()`, which works quite well. It's not ideal to have plugins constrained to being npm packages, and I hope there is a better way to do it in the future, but at least for our use case, and for the time being, it's better than nothing.
And in theory at least, by using https://github.com/deno/dnt you might even be able to support authoring plugins "natively" in Deno, although it would require a hidden dependency on npm.
CLIs with JS-based config also need this feature.
I have the same use case of .js configs as well as runtime-loadable plugins.
Deno 1.31 added support for statically analyzable dynamic imports: https://deno.com/blog/v1.31#deno-compile-works-with-statically-analyzable-dynamic-imports
Deno 1.32 (releasing later this month) will add support for web workers (https://github.com/denoland/deno/commit/090169cfbc6699486765b729d532b5b837210b12) and the ability to specify non-analyzable modules to include in the binary (https://github.com/denoland/deno/commit/b64ec7926831896f4e43b685891111409de45e85)
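To illustrate the distinction the 1.31 change relies on, here is a sketch (using `data:` URLs so it is self-contained; the rule of thumb is that the specifier must be a literal the compiler can see at build time, while computed specifiers need the separate include mechanism):

```javascript
// Statically analyzable: the specifier is a string literal, so a compile
// step can discover the module ahead of time.
function importAnalyzable() {
  return import("data:text/javascript,export const how = 'static'");
}

// Not statically analyzable: the specifier is computed at runtime, so no
// compile step can know in advance which module will be requested.
function importComputed(value) {
  const src = "export const how = " + JSON.stringify(value);
  return import("data:text/javascript," + encodeURIComponent(src));
}
```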
Should a new ticket be spun up now? It doesn't sound like any of the above features support dynamic import of modules not available at compile time, which was asked for a few times in this thread.
@RichiCoder1 yes, I'd recommend a new targeted issue for that. It will also help it be more easily searchable.
done! thanks for the update
Thanks @dsherret!
The committed changes make some headway toward full dynamic import, but many of the use cases mentioned in the previous discussion are still not possible to implement.
See https://github.com/denoland/deno/issues/18327 (thanks to @alcuadrado) for continuation of the issue.
This is a tracking issue to investigate how we can support dynamic import in `deno compile`, and what the use cases would be.