microsoft / TypeScript

TypeScript is a superset of JavaScript that compiles to clean JavaScript output.
https://www.typescriptlang.org
Apache License 2.0

What’s confusing about modules? #51876

Closed andrewbranch closed 6 months ago

andrewbranch commented 1 year ago

After #51669 is merged, I plan to write documentation for it, and try to update/rewrite a bunch of our existing module-related documentation. While there are a lot of good examples in this issue tracker of specific questions and misconceptions about modules and TypeScript’s module-related options, I thought I’d ask explicitly what questions you have and what aspects of the module landscape or our configuration specifically are the most confusing.

GabenGar commented 1 year ago

Is there a difference between import moduleName from "module-path"; and import { default as moduleName } from "module-path";?

andrewbranch commented 1 year ago

(I don’t plan to answer all or even most questions here, but if the answer is quick and contains any context that may not make it into the docs, I might.)

@GabenGar no, except there accidentally was a difference in type resolution for a while, but it’s fixed now: https://github.com/microsoft/TypeScript/issues/49567 fixed by https://github.com/microsoft/TypeScript/pull/49814

treybrisbane commented 1 year ago

Any chance you could include a dedicated section or two on why it's not practical to have the compiler transform imports ending in .ts to ones ending in .js? 🙂

I've seen a few of the TypeScript team posting some pretty solid rationale in various GitHub comments over the years, but I can never remember all the reasons, and finding those comments again is quite difficult. It would be amazing to get one clear, objective, well-written explanation/rationale that could be easily linked to by anyone whenever this comes up.

IanVS commented 1 year ago

One thing that tripped me up was how .mts and .cts files interact with package.json exports fields. Especially in the case of dynamic import() (which always uses the import condition, even in .cts files).
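A small sketch of the behavior being described, with made-up package and file names (not taken from the thread):

```jsonc
// package.json of a hypothetical dependency "some-lib"
{
  "name": "some-lib",
  "exports": {
    ".": {
      "import": "./dist/index.mjs",
      "require": "./dist/index.cjs"
    }
  }
}
```

```ts
// consumer.cts — a CommonJS TypeScript file
import someLib = require("some-lib"); // resolves via the "require" condition

async function load() {
  // Dynamic import() always resolves via the "import" condition,
  // even from a .cts/CJS file, so this picks up ./dist/index.mjs instead.
  const esmCopy = await import("some-lib");
  return esmCopy;
}
```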

Andarist commented 1 year ago

What are the exact requirements for publishable packages? And how do they relate to package.json#type, package.json#exports, and the moduleResolution values? When do extensions in .d.ts files matter, and what should they be?

khmseu commented 1 year ago

An import path cannot end with a '.mts' extension. Consider importing './XXX.mjs' instead.

Why does this error message even exist? It seems insane to direct tsc to import a .mjs when we have a .mts with types and everything. What am I missing here? Is there a way to direct tsc to the real type info? That suggestion certainly doesn't sound like something I want to do.

unional commented 1 year ago

I'm working on a demo about many issues with the current module, moduleResolution, allowSyntheticDefaultImports, and esModuleInterop combinations.

It's still a work in progress. I can probably finish it by this weekend or next week.

Feel free to check it out or collaborate on this topic.

https://github.com/cyberuni/typescript-module-resolutions-demo

bobaaaaa commented 1 year ago

My team is building a non-fancy web-app (not a library/framework) in a monorepo setup. Are there any different configs needed for app vs. library authors?

I would love to see some kind of cli check -> "Is my setup ok?" like brew doctor or https://github.com/bluwy/publint

unional commented 1 year ago

Cross-linking the great post made by Ryan: https://github.com/microsoft/TypeScript/issues/49083#issuecomment-1435399267

There are still many things worth discussing and figuring out, for example:

But in general, it is a good read and you should check it out.

GabenGar commented 1 year ago

In regards to the second option, create a major version increase and ditch CJS with it. Supporting both will not make it easier for you or for consumers.

unional commented 1 year ago

In regards to the second option, create a major version increase and ditch CJS with it. Supporting both will not make it easier for you or for consumers.

In theory, yes.

But in practice, it's not that straightforward. It mainly comes down to environment and tooling.

For example, I have a project that wants to be consumed as ESM and, at the same time, be used as part of the global setup when running jest. Even today, with jest 29.4.2, using NODE_OPTIONS=--experimental-vm-modules, @repobuddy/jest, etc., the global setup script still looks for the CJS variant in the exports field.

The situation is even harder within corporations, where you want to write some tools and use them in the company. There are products written many years ago that simply can't migrate to an ESM-only toolchain, and probably never will.

Then what are you going to do? Supporting CJS only is not an option as you DO want to move your toolchain forward. You don't want to be part of the problem. So dual release or maintaining multiple versions is the only option.

Then, that comes down to cost management, as managing multiple versions requires additional time, effort, and, ultimately, moola.

GabenGar commented 1 year ago

For example, I have a project that wants to be consumed as ESM and, at the same time, be used as part of the global setup when running jest. Even today, with jest 29.4.2, using NODE_OPTIONS=--experimental-vm-modules, @repobuddy/jest, etc., the global setup script still looks for the CJS variant in the exports field.

Don't you find it comical that a testing framework stands in the way of a major version upgrade, even though the whole point of adhering to the TDD dogma is to reduce the pain of major refactors? It's time to switch to a testing framework which understands ESM (aka modern nodejs code), which means no magical bloated packages which transpile on the fly into who knows what and output green checkboxes. And yes, it does mean you can't run your unit tests without building them and the underlying code first.

The situation is even harder within corporations, where you want to write some tools and use them in the company.

This is kinda the company's problem. For that just transpile your code to cjs and don't worry. The only real enterprise way.

There are products written many years ago that simply can't migrate to an ESM-only toolchain, and probably never will.

Then it should lock down its dependencies and run on an outdated stack forever.

Then, that comes down to cost management, as managing multiple versions requires additional time, effort, and, ultimately, moola.

It's actually even worse than management and development. Dual packages are subject to the dual-package hazard, and that can introduce subtle bugs which are pretty hard to debug and resolve.

IanVS commented 1 year ago

Seems like this isn't really the place to argue over ESM-only vs CJS compat...

GabenGar commented 1 year ago

The whole reason for this thread's existence is ESM/CJS interop issues (otherwise there wouldn't be anything confusing about ES modules), aka ESM vs. CJS. "ESM-only major semver increase" is as valid advice for dealing with this as whatever side-stepping hack can be concocted. The former has the benefit of being explicit and following established development practices (it's literally a feature of semver), instead of burying the problem under a slew of configs/transpilers/bundlers/ad-hoc scripts, which WILL break every now and then with cryptic errors.

Case in point: while @andrewbranch 's answer to my question is technically correct, it is so only from a purely TypeScript standpoint. The underlying issue with errors related to default exports can be summed up as the result of a build pipeline producing output like this:

// (generated output) the intended default export ends up nested
// under a `default` key of the actual default export
async function coreFunction() {};

export default {
  default: coreFunction
}

The author/bundler/transpiler thinks the module's default export is coreFunction(), but for the NodeJS ESM resolver the default export is the object with the default key. That becomes a problem when you write import { default as moduleFunction } from "#module". TypeScript obviously could notice this problem and error/adapt, but the snippet above is generated after TypeScript has compiled the code. Hence this problem is only encountered when dealing with 3rd-party packages, because TypeScript consumes them and sees this.

It can only be fixed upstream (or by an ugly hack downstream on a per-import basis), but the fix would change the module's surface, so it's a... semver major increase for the upstream anyway (because people accustomed to CJS are iffy about doing require("#module").default, and thus transpilers obliged). Good luck convincing the maintainer that a few lines in type definition files warrant a semver major, though. How did I figure this out? I don't remember actually; it was randomly stumbling upon an article or a GitHub issue/discussion, maybe even on this repo.

This isn't even the actual dual-package hazard, merely a symptom of the problem of writing "compatible" code between incompatible module systems. The actual hazard would be a package A used as a CJS export dependency in B and as an ESM export dependency in C, while the user, looking at their type signatures and exports, assumes that they share the same context and uses them as such, and then gets bombarded by a bunch of seemingly unrelated errors.

These are the important takeaways:

So the answer to the question:

  • How to support both as a library? Maintaining multiple versions? Duplicate code?

Don't. Either do an ESM-only semver-major upgrade or transpile it all to CJS-only. ESM knows how to load CJS modules (it's only a minor syntax annoyance to use them in ES modules), and the ancient CJS code also knows how to load "modern" CJS modules. These are the only future-proof solutions. And it's a typical algorithmic optimization too, as you only have to deal with ESM or CJS problems, not (ESM + CJS) * interop_coefficient problems.
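For the interop claim above, a rough sketch of what loading a CJS dependency from an ES module looks like (the package name "legacy-lib" is made up):

```ts
// In an ES module, a hypothetical CJS package "legacy-lib"
// (module.exports = { greet }) can be consumed like this:
import legacyLib from "legacy-lib"; // default import = module.exports
legacyLib.greet();

// Named imports also work when Node can statically detect the CJS exports;
// otherwise, createRequire is the escape hatch:
import { createRequire } from "node:module";
const require = createRequire(import.meta.url);
const sameLib = require("legacy-lib"); // same object at runtime
```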

andrewbranch commented 1 year ago

Let’s keep this an uncluttered place for people to ask questions, please. Because GitHub issues don’t have a reasonable means of threading or replies, that means not answering other people’s questions, even if you have an answer or opinion. Thanks for understanding!

RichiCoder1 commented 1 year ago
  • How to support both as a library? Maintaining multiple versions? Duplicate code?
  • How to make the life of TypeScript library authors easier today. What can be done by TypeScript, community tools, library authors?

I wanted to emphasize these two. While being either fully CJS or fully ESM w/ extensions is straightforward, any package trying to bridge these two worlds falls into numerous quietly failing pitfalls.

Specifically for dual packaging lib authors, things that would be awesome:

I think the above would be tremendously helpful, as those seem to be where many authors on Twitter (and I myself) trip over all the steps surrounding dual packaging, Node16, and now bundler.

me4502 commented 1 year ago

One thing that would be nice to document is how to correctly set up modules where you're using a bundler that takes in ESM code, but also tooling like Jest that operates on CJS code.

As moduleResolution: node appears to not allow resolvePackageJsonExports etc. to be enabled in TypeScript 5.0, a setup like this prevents the codebase from using the new customisation functionality.

An ideal setup would allow use of package.json exports/imports fields, the app code to output ESM for the bundler (webpack, etc.), and Jest to receive CJS output (e.g., using a tsconfig.test.json file). From my testing there doesn't seem to be a "perfect" setup here, so documentation around the best way to solve something like this / best practices would IMO be ideal.
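For illustration, the kind of split configuration described above might look roughly like this. The file names and option values are just one possible arrangement, not a claim that it avoids the exports-resolution limitation mentioned:

```jsonc
// tsconfig.json — app code, checked the way the ESM-consuming bundler sees it
{
  "compilerOptions": {
    "module": "esnext",
    "moduleResolution": "bundler"
  }
}
```

```jsonc
// tsconfig.test.json — what the Jest-oriented tooling would consume instead
{
  "extends": "./tsconfig.json",
  "compilerOptions": {
    "module": "commonjs",
    "moduleResolution": "node"
  }
}
```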

beorn commented 1 year ago

I'm taking this to mean module and module resolution. If I had to guess, I think what most people want is:

  1. A great developer experience

    • seamless compatibility between bundlers (such as vite) and tsc / vscode
    • not being pushed to deal with idiosyncrasies such as adding .js extensions to typescript module imports (extensionless)
  2. A module & resolution system they can understand

    • A path towards clearer and more standardized module & resolution systems that's actually possible to learn and which by default (with little/no config) just works for the most common cases
    • Guidelines around resolution methods in monorepos: tsconfig references vs workspace packages vs path aliases — too many ways to skin a cat
    • Tooling to give great observability into module & resolution problems — configuration linting, help pinpoint blame with 3rd party modules vs your own configuration or own code
    • Updated documentation that gives a great overview
  3. Future-proofing
    • A path to an ESM-only future, and tooling that gives a gentle hand pushing in that direction (for your own code, and a "manifesto" of sorts for the FOSS community)
    • Prepare for a typescript-first/only world — for most people that work in typescript, having to even deal with dist/.js files is just a complication they'd rather do without. It seems the world is moving towards a typescript-first/only world, and it would be great to support and document how to do this better — e.g., how can packageA import from packageB through package.json exports/imports without dealing with .d.ts or .js files or configuration. This kind-of works in some situations, but it would be nice to see it improved and documented/promoted.

I'm sure thousands of man-years have been spent (in anger!) debugging JS/TS module and resolution issues. Most people have no clue what's going on, they just muddle through, turning on and off config flags until things break less. If you can make this more seamless you'd literally be saving thousands of lives — and bringing big smiles to all JS/TS devs 😀

A tangent: I have high hopes for the bundler mode, and with traceResolution there is some nice observability, but I was a bit disappointed to see that it seems to default to resolve in CJS mode with no apparent way to override?

connorjs commented 1 year ago

Echoing @beorn’s eloquent "path forward" and "future proofing" points in a more "Hey, I'm new" way.

  1. I come to JS/TS world hearing that JS rocks b/c you write the same thing on server and client/web
  2. But these are not the same when it comes to module resolution (I think, and other stuff)
    1. Tell me best practice ← This is my goal and what I came here to say: There should be strong recommendations 🤞🏻
    2. Set me up for success in the future
    3. Link to "in depth"* guides

*And maybe in depth guides come later, not now, or maybe just include a collection of resources.

Also, I think recommendations differ from "library" vs. "application" authoring, and I still find that odd X years later. If they are different, maybe this clarifies that too.


Awesome to see this issue! I was looking for bundler documentation on the TS website. Glad to see it's coming (soon).

Edit: I just re-read #50152 in depth, and it has a lot of great context. Unsure how I missed some of that info before. Pulling some of that into permanent documentation (or linking to it for context) would be great!

Andarist commented 1 year ago

this part from --traceResolution is pretty confusing:

```
======== Resolving module 'react/jsx-runtime' from '~/webstudio-designer/packages/authorization-token/src/index.server.ts'. ========
Explicitly specified module resolution kind: 'Bundler'.
Resolving in CJS mode with conditions 'import', 'types', 'source'.
```

How does it run in "CJS mode" and use an import condition at the same time?

andrewbranch commented 1 year ago

“CJS mode” just refers to the flavor of module resolution and doesn’t actually indicate anything about the kind of the importing or imported modules here. It means index files and extensionless shenanigans are supported. Do you have a suggestion for an alternative nomenclature?
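In other words, under that resolution flavor something like the following resolves (illustrative paths, not from the thread):

```ts
// "CJS-flavored" resolution: extensionless specifiers and directory
// index files are allowed, regardless of the module format involved.
import { helper } from "./utils";        // may resolve to ./utils.ts
import { feature } from "./lib/feature"; // or to ./lib/feature/index.ts
```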

Andarist commented 1 year ago

What I find more confusing about this is not the "CJS mode" but the mention of the import condition here.

andrewbranch commented 1 year ago

You wrote an import statement, so the import condition gets used. That’s how bundlers do it.

beorn commented 1 year ago

“CJS mode” just refers to the flavor of module resolution and doesn’t actually indicate anything about the kind of the importing or imported modules here. It means index files and extensionless shenanigans are supported. Do you have a suggestion for an alternative nomenclature?

At least I misunderstood what "resolving in CJS mode" meant — I thought it meant modules that were resolved would be assumed to be CJS. Is there another name that could be used to refer to the module resolution without involving ESM/CJS names (which I think will make most people think about the module format, not just resolution algorithm)?

jedwards1211 commented 1 year ago

One comment in a documentation example says that package.json "types" is a fallback for older versions of TypeScript, but it seems like TS is still resolving with "types" when "nodenext" resolution is used and the package doesn't have an export map?

Basically https://www.typescriptlang.org/docs/handbook/esm-node.html doesn't list all of the resolutions that "nodenext" will try for resolving type declarations so I'm not 100% sure how it behaves.

Andarist commented 1 year ago

One comment in a documentation example says that package.json "types" is a fallback for older versions of TypeScript, but it seems like TS is still resolving with "types" when "nodenext" resolution is used and the package doesn't have an export map?

If that weren't the case, then we probably wouldn't have been able to use those new moduleResolution modes for a long time.
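A minimal sketch of the case being discussed, with a hypothetical package: since there is no "exports" map, node16/nodenext resolution still falls back to the top-level fields.

```jsonc
// node_modules/old-style-lib/package.json — no "exports" map at all
{
  "name": "old-style-lib",
  "main": "./lib/index.js",
  "types": "./lib/index.d.ts"
}
```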

andrewbranch commented 1 year ago

One comment in a documentation example says that package.json "types" is a fallback for older versions of TypeScript

That comment is specific to the example it’s contained in. It means to say that because there are exports, only old versions of TS would ever look at the top-level types. I agree that wording is pretty confusing. Thanks 👍

Andarist commented 1 year ago

It's unclear to me how one should use package.json#imports with TypeScript. tsc doesn't rewrite import sources in the emitted code, but both source files and output files have to refer to the same "source", and yet they have to resolve to different locations in the package.

Perhaps this can be somehow solved by using either array targets, custom conditions or maybe by using the output paths in imports + declaration maps. It's far from clear how to actually achieve this though.

Andarist commented 1 year ago

Re package.json#imports... this can actually be solved with extra nested package.json files but traditionally TS didn't emit any additional files like this, and certainly not package.json files. This could easily conflict with other tools that might emit package.json for different reasons.

andrewbranch commented 1 year ago

by using the output paths in imports

This is the intended way. There is specific module resolution code to remap these references back to input paths, and it doesn’t depend on declaration maps.

Andarist commented 1 year ago

Do you have a link to said code or some test cases at hand? I'd love to dig more into this in the future, as we definitely want to support this in https://github.com/preconstruct/preconstruct

andrewbranch commented 1 year ago

https://github.com/microsoft/TypeScript/blob/94564cf0730c1e5a9b5c860190b69a672e6985d3/src/compiler/moduleNameResolver.ts#LL2680C25-L2680C25

Mahi commented 1 year ago

Having to write .js extensions in TypeScript imports.

import { TypeThatDoesNotExistInCompiledJavaScript } from './example.js'

How is this sane behaviour? I'm writing TypeScript and importing a TypeScript type from a JavaScript file that a) doesn't exist, and b) won't contain the type even once compiled?

What if I want to use a different compiler that doesn't generate .js extensions? The files can be called .py and node jsfile.py would still work. Or what if I want to use ts-node and never compile the TypeScript to JavaScript in the first place, how does this import make any sense when the file won't be available even at runtime?

This doesn't match external module imports either, should I also start writing import fastify from './node_modules/fastify/lib/index.js' instead?

The issue seems to be "TypeScript doesn't alter import paths", so maybe the solution could be "it should"?

andrewbranch commented 1 year ago

I marked your question as duplicate because I skimmed it and it read to me like a more aggressive and provocative version of https://github.com/microsoft/TypeScript/issues/51876#issuecomment-1349874978. Reading it more closely, I’m less confident about whether each part is a good-faith question or rhetoric intended to ridicule the status quo and vent frustration. It seems like I don’t have the ability to unmark it as a duplicate so I apologize if I misread the intent.

The general question of “why .js extensions for .ts files” and “what about ts-node” I have already addressed in my in-progress docs. There are a couple other things that are maybe worth clearing up so I’m going to break the form and answer inline.

I'm writing TypeScript and importing a TypeScript type from a JavaScript file that a) doesn't exist, and b) won't contain the type even once compiled?

When I added allowImportingTsExtensions, I also made it legal to write an import type declaration with a .d.ts extension even if that option isn’t set, since that declaration is guaranteed to be erased. I’m not sure why I didn’t extend that same rule to .ts extensions. The compiler option is supposed to indicate that .ts extensions are supported by the runtime, but an import type declaration is known to have no runtime code, so anything TS can understand should arguably be legal. I will think about changing this.
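A quick sketch of the distinction described above (module and type names are hypothetical):

```ts
// Legal even without allowImportingTsExtensions, because a type-only
// import is guaranteed to be erased from the emitted JavaScript:
import type { Options } from "./options.d.ts";

// A .ts extension, by contrast, currently still requires
// allowImportingTsExtensions, even for a type-only import:
import type { Config } from "./config.ts";
```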

What if I want to use a different compiler that doesn't generate .js extensions? The files can be called .py and node jsfile.py would still work.

Please do not name your JS files jsfile.py, but if you must, you must import it as "./jsfile.py", set allowArbitraryExtensions to indicate that importing a .py file into JS was not a mistake, and have a jsfile.d.py.ts type declaration for it if you want types. It seems like your question was assuming that TS would make you refer to jsfile.py as "./jsfile.js" which is not and has never been the case.
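Sketching that out, with everything hypothetical and following the naming scheme described:

```ts
// jsfile.d.py.ts — declaration file for ./jsfile.py
export declare function run(): void;
```

```ts
// consumer.ts — with "allowArbitraryExtensions": true in tsconfig.json
import { run } from "./jsfile.py";
run();
```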


How is this sane behaviour?

Starting your questions this way is what made me immediately assume ill intent. I may have read too much into it. But unlike other more discussion-oriented issues where I appreciate spirited debate, this is my questions thread and I am going to exercise my authority to rule it with an iron fist. Be warned, any question that feels closer to “this is dumb I hate it” than “why is it this way” may get minimized 😄

Mahi commented 1 year ago

Yes, that's fair enough. I have been fighting with this issue many times before, and surely there was some repressed frustration deep down. Also, I am not a native speaker, so sorry if I wrote something horribly when I didn't mean any offense towards you, but in the end these were genuine questions about the .js extension.

anything TS can understand should arguably be legal.

Yes that makes sense, thank you.

It seems like your question was assuming that TS would make you refer to jsfile.py as "./jsfile.js" which is not and has never been the case.

No, my question is: why does the TypeScript code I write need to be heavily coupled to the compiler I will use in the future? Importing a simple example.ts has to be done as import ... from './example.js', which binds my codebase to a compiler that only generates .js files with the same names as the original files. I can't use a compiler with name mangling or custom extensions (see .jsx, so not too far fetched).

In my mind this isn't how compiler should work, I should write import ... from 'example.ts' and essentially say "hey compiler, however and wherever you decide to compile this .ts file, I want to import the end result". I guess my question is, where is my logic going wrong and why isn't this the case with TypeScript?

andrewbranch commented 1 year ago

Got it, I misunderstood your question with the .py example. You’re correct that there are some fairly deep assumptions baked in here. The original assumption was that tsc is the only TypeScript compiler, and that was true not so long ago! Even as other transpilers came around, we’ve found that continuing the model of assuming runtime behavior based on what tsc would do, and then getting your other compiler and tsc to agree as closely as possible by modifying both of their options, still works pretty well. The place where this has really started to break down most significantly is with runtimes and bundlers that understand TS natively, so we introduced allowImportingTsExtensions that can be used with noEmit, and I think that solves your example use case. That tells the compiler that it’s not going to do emit, and another tool that understands how to resolve to and process .ts files directly is going to handle it.

not-my-profile commented 1 year ago

I am really confused about why you need both --module node16 for tsc as well as "type": "module" in package.json ... isn't there some way to get the same type checking behavior without having to create a package.json file? That would certainly be convenient for quickly debugging packages that provide incorrect types under node16 module resolution.

Edit: opened #54876

trusktr commented 9 months ago

Is there a difference between import moduleName from "module-path"; and import { default as moduleName } from "module-path";?

Off topic, but there's a difference between `export default foo` and `export { foo as default }`. The latter creates a live binding, while the former does not. [Solid playground without live binding](https://playground.solidjs.com/anonymous/3560e0e0-3eee-442b-8d0a-c523d5cafdd4) · [Solid playground with live binding](https://playground.solidjs.com/anonymous/afce7c49-944d-4cac-a771-7d324ff61449). Type-wise, there is no difference in TypeScript.
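A tiny illustration of that difference (hypothetical module):

```ts
// counter.ts
export let count = 0;
export function increment() { count++; }

// `export { count as default }` exports the *binding*, so importers
// observe later reassignments of `count` (a live binding).
export { count as default };

// `export default count;` would instead capture the current value (0)
// once at evaluation time, and importers would never see updates.
```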

trusktr commented 9 months ago

In my mind this isn't how compiler should work, I should write import ... from 'example.ts' and essentially say "hey compiler, however and wherever you decide to compile this .ts file, I want to import the end result". I guess my question is, where is my logic going wrong and why isn't this the case with TypeScript?

@Mahi The problem is there are various different ES Module and non-ESM systems, and TypeScript needs to be configurable for all of them:

TypeScript cannot simply build code a single way, because then it would work in only one, maybe two, of those environments.

When you write TypeScript code, what is going to be importing your TypeScript code? Is it gonna be a browser ES module that will natively import your code? Is it gonna be a Deno ES module? Etc?

How will TypeScript know the output format that your code needs to be in, for it to work properly in some target environment, maybe even a non-ESM environment?

The answer is that in reality, there are currently too many ESM and non-ESM consumers (build tools or runtimes or both) with varying rules. Bun recently threw a wrench into the mix by mixing CommonJS with ESM in the same file (❔, don't do that unless you want to write very non-portable code).

why does the TypeScript code I write need to be heavily coupled to the compiler I will use in the future?

TLDR: because the compiler or environment you choose may have certain rules that others don't.

My advice:

Avoid moduleResolution:bundler if you can, and try to output the most vanilla ES Modules that you can so that they work with the least friction in systems that use import maps like browsers and Deno, because that format is closest to the ESM spec.

If you do this, then it'll be the simplest to get up and running with any alternative tools in the future with the least amount of config fiddling or code changes.

Basically: when you write code as close as possible to what a browser can consume, only having to set up an import map for bare specifiers referring to library names, then your code will be as portable as possible.

The bundler option may help you write less-portable code for a specific case, but that's something to avoid.
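As a small illustration of the import-map approach mentioned above (the URL and package name are made up):

```jsonc
// Browser import map (<script type="importmap">): maps a bare specifier to
// a concrete URL, so compiled ESM output runs without a bundler rewriting
// specifiers.
{
  "imports": {
    "some-lib": "https://cdn.example.com/some-lib/index.js"
  }
}
```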

nlwillia commented 9 months ago

The new theory document is referenced from the tsconfig reference for paths, but the "There is a larger coverage of paths in the handbook" statement references a #path-mapping fragment that doesn't exist, and there doesn't appear to be any discussion of paths in the page. It's particularly confusing to try to understand paths in the context of the new bundler mode which seems to defer similar mapping configurability to the package.

andrewbranch commented 9 months ago

Thank you. You got linked there via a redirect from the old content. I need to update the link to point here: https://www.typescriptlang.org/docs/handbook/modules/reference.html#paths

akwodkiewicz commented 6 months ago
  1. Usage of package.json#imports. From the release notes I got the impression they are supported in the same way as package.json#exports, meaning you can create a mapping between a path and an object containing "types"/"default"/etc., so that Node will use the ./dist/foobar/*.js defined in "default", but TypeScript will use ./dist/foobar/*.d.ts defined in "types". ~But it's not the case, because package.json#imports only supports a string on the right side of the mapping (at least that's what I understood from the Node docs, however they are not too explicit here). So what's the expected setup for the "native" #imports usage in Node16? I apologize, but I did not understand the response here.~

    ~Is it that you need to emit the .js into the same directory as the source files? Or that you need to use path aliases to translate #foobar on the TypeScript layer (but if that was the case, then there's no particular "support" for imports, this should've been always possible).~

    EDIT: It turns out that I was wrong here, see next comments. Thank you, @Andarist, for clearing this up. But the conversation we had below only proves that this could've been explained better (e.g. point out that the feature is independent of path aliases and mention how to declare those imports properly using the "conditional" setup).

  2. Creating libraries, with the emphasis on internal libraries (in monorepos). What should the setup look like, assuming Yarn/pnpm workspaces (so @acme/app properly imports from @acme/package-a thanks to symlinks in node_modules, not path aliases nor relative imports)? Having this info would be even more beneficial to TypeScript itself to further lead people away from (ab)using path aliases (which I learned very recently were meant to be used with RequireJS; can't find a link to the comment on GH though). Because if we already advise people to properly use workspaces instead of abusing path aliases (source), then we could also add more info on how to declare the manifest for these internal packages properly. And to this day I have only seen monorepo guides on the Internet that set up path aliases to reference one package from another. Assuming that our best-practice TS monorepo setup does not use path aliases:

    1. Is it required for a monorepo to be using project references, or are those only to support build orchestration from tsc? Especially important with the popularity of the new tools that help with the orchestration (Nx, Turbo, Moon, etc.) -- if the tools are running compilation of the affected (modified) libraries in a topological order, then tsc -b will double the work.

      1. Is the answer different if we're using tsc only for typechecking?
    2. Is it possible for a monorepo to skip emitting declaration files for internal libraries and have package.json#exports#.#types point to the source .ts? Was this ever considered to be a proper setup? (I tried that, but skipLibCheck was not supported -- tsc continued typechecking the internal dependency with the parent's stricter compiler settings, which ended up in errors.) This would bring the DX closer to the one where people use path aliases; otherwise you have to regenerate declaration files for your internal dependencies before you can work on your project in the IDE.

    3. What is the best way of self-referencing the root directory of a project in imports (other than path aliases)? This relies on the answer to question 1. If not package.json#imports, then I believe the only way is just ../../relative/paths.

GabenGar commented 6 months ago
  1. package.json#imports is consumed by NodeJS at runtime, so it has to point to the output .js files in the end. Typescript resolves module paths by the rules in tsconfig.json#compilerOptions.paths instead. And yes, that means you have to almost duplicate path mappings in both files in order for typescript and nodejs to resolve module aliases without problems. In the best case scenario, aka you don't rely on syntax specifics of either #imports or paths and just map one module path to a single file, it amounts to prepending dist/ to your #imports paths. And at worst it might not be resolvable.

  2. i.a. "Typechecking" is just running tsc without output files. The compiler still has to statically analyze all module dependencies and therefore must be able to resolve all module paths.

  3. ii. This entirely depends on the monorepo structure. Turborepo specifically separates applications, which build output files and are not depended on, from packages, which declare their source files directly in package.json#exports and are used strictly as dependencies; therefore it does not rely on fancy typescript monorepo magic of partial builds. As far as package.json#imports in a monorepo setting is concerned, all other projects with a package.json are "external" and therefore do not belong in the #imports mapping.

Andarist commented 6 months ago

And yes, that means you have to almost duplicate path mappings in both files in order for typescript and nodejs to resolve module aliases without problems.

That's not completely true. You use output paths in package.json#imports, and as long as those are not different from what TS would emit, you can rely on TS to resolve the source files by mapping the patterns defined by package.json#imports back to their respective sources.

An example test case for this can be found here. Notice how package.json#imports refers to .js files in the dist directory, and yet TS is able to offer auto-completions etc. within the src directory. That happens by matching package.json#imports against tsconfig.json#compilerOptions.rootDir and tsconfig.json#compilerOptions.outDir (don't quote this as an exhaustive description of the algorithm 😉).
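A rough sketch of that layout, with illustrative file and package names (the exact algorithm lives in the linked test case and resolver code):

```jsonc
// package.json — "#utils/*" points at the compiled output in dist
{
  "name": "my-pkg",
  "type": "module",
  "imports": {
    "#utils/*": "./dist/utils/*.js"
  }
}
```

```jsonc
// tsconfig.json — rootDir/outDir are what TS matches against to map
// the dist targets back to their sources in src
{
  "compilerOptions": {
    "module": "node16",
    "rootDir": "./src",
    "outDir": "./dist",
    "declaration": true
  }
}
```

```ts
// src/index.ts — resolves to src/utils/math.ts while type-checking,
// and to dist/utils/math.js at runtime after compilation
import { add } from "#utils/math";
```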

akwodkiewicz commented 6 months ago

An example test case for this can be found here.

@Andarist, wait, so you're saying that it is possible to do the conditional import?

If that's true, then I have to try once more myself and cross out the paragraph in my previous message.

Turborepo specifically separates applications, which build output files and are not depended on, from packages, which declare their source files directly in package.json#exports and are used strictly as dependencies; therefore it does not rely on fancy typescript monorepo magic of partial builds.

Thank you for mentioning this, @GabenGar. Actually, I first saw the idea of using the source files as the type source in a blog post by Turbo. But they also declared main as *.ts files, which made no sense to me, as the .ts files are not understandable by Node. I assumed they were relying heavily on some bundler.

And like you said, they rely on applications being the "leaves of the topological tree", being the projects that actually do the building. This means that the compiler options in apps cannot be "stricter" than the compiler options for the libraries, because the library is ultimately compiled with the dependent applications' compiler options. This might seem like a tiny detail, but it matters when we want to gradually introduce some stricter compiler options to a monorepo, project by project, where each project/lib can be developed by a separate team. That was one of the pitfalls I discovered when using .ts files as package.json#types/package.json#exports['.*'].types.

Typescript resolves module paths by the rules in tsconfig.json#compilerOptions.paths instead. And yes, that means you have to almost duplicate path mappings in both files in order for typescript and nodejs to resolve module aliases without problems.

@GabenGar, that was the only way before it was announced that package.json#imports is now supported by TS.

Andarist commented 6 months ago

@Andarist , wait so you say that it is possible to do the conditional import?

Sure thing. Conditions are supported both in exports and imports.
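For example, a conditional entry in package.json#imports might look like this (paths are illustrative):

```jsonc
{
  "imports": {
    "#internal/*": {
      "types": "./dist/types/*.d.ts", // picked up by TypeScript
      "default": "./dist/*.js"        // used by Node at runtime
    }
  }
}
```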

akwodkiewicz commented 6 months ago

Supported since version 4.7? It's only the auto-completion that will be added in 5.4, right?

Andarist commented 6 months ago

Yes, the module resolution part of things was released in 4.7 (release note here). 5.4 just offers new auto-completions in this area.

GabenGar commented 6 months ago

@Andarist

An example test case for this can be found here.

The key word in the underlying PR is "roughly". Since paths and #imports use different ways to map several files to a path, it will only "just work" if the mapping is one-to-one, which can easily not be the case in situations like a bundled NodeJS package (and I bet the errors will be as confusing as always). So the point still stands: the #imports field is still a footgun for TypeScript codebases, with ugly workarounds.

@akwodkiewicz

I assumed they are relying heavily on some bundler.

While technically true, it's very unlikely you'll end up in a monorepo situation without all JS code going through TypeScript (which might or might not involve a bundler) and without a build step.

akwodkiewicz commented 6 months ago

So the point still stands: the #imports field is still a footgun for TypeScript codebases, with ugly workarounds.

It seems we happen to work on projects with entirely different DX -- hence the disagreement.

We usually seem to be stuck between choosing solutions that either require lots of configuration or those that are super magical and in case of a non-standard issue require reading their source code to understand what they are doing.

I'm searching for a minimal, but a non-magical approach, where we are as compliant with various tools as we can get. And I'd like the TS docs to help other people get there as well.

And I believe that #imports are a "native" alternative to the path aliases, at least for Node, because they just provide the "type layer" on top of stuff that is already supported by the runtime. So no bundler magic, no unnecessary configuration -- it's the sweet spot.

Let me go back to the original issue, which is about the docs themselves. The aforementioned path aliases, which 5 years ago were not explained well enough and so led people to (ab)use them as import syntax sugar, today have a single essential paragraph explaining that they should not be used to reference other packages in a monorepo. It's great that this particular piece of information is now in the docs, and it's great that Yarn/pnpm workspaces are mentioned as the alternative.

What I don't get though, is that the self-referencing path aliases seem to be endorsed in the next subsection of the same document. The docs could be more specific that this is probably a good idea only if you work with an additional build step other than just tsc. And here we could be mentioning #imports as the alternative, in the same way we are mentioning Yarn workspaces as the alternative for aliases to internal libs.

Moreover, I know that the information about paths not being emitted is 2 sections above, but since the documentation is not separated into "vanilla tsc + Node/browser" and "things that make sense if your bundler allows it", reading the "wildcard pattern" section alone can still mislead developers into using the path aliases without understanding their consequences.

EDIT: what's also confusing is the part where baseUrl stopped being necessary for path aliases to work. As a user of path aliases I obviously thought it was a good idea to remove the unnecessary "baseUrl: '.'" entry from my config. But today, knowing about the original idea behind the feature, I'm wondering if it was a good idea to simplify the usage of a feature that, at the same time, we try to warn people about (did it make sense to drop baseUrl in the context of usage with RequireJS? I assume not).