Without looking closely at your code: isn't this just the regular distribution behaviour for conditional types? https://www.typescriptlang.org/docs/handbook/2/conditional-types.html#distributive-conditional-types
This is working as intended; it's due to the distributive behavior of conditional types, as mentioned by @ritschwumm.
Indeed, I’d missed that part in the docs. I don’t see why it’s “typically the desired behavior” (an example of where it’s desired would be welcome), but it’s by design, so the issue can be closed. I’ve opened a pull request to fix the library where I encountered this (https://github.com/Liquid-JS/nxt-components/pull/4).
If you think about the case of a normal function that takes some input and does one thing if it's an X and another thing if it's a Y, then what you have there is a distributive application. A function can't tell whether it's working on something that's an X but could be a Y; that's what the nondistributive interpretation amounts to.
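A minimal sketch of that distributive default (the names Describe, "text", and "other" are invented for illustration):

type Describe<T> = T extends string ? "text" : "other";

// Each union member takes its own branch, exactly as if a runtime
// function had been called on an X and on a Y separately.
type D = Describe<string | number>; // "text" | "other"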
If I have a normal function that turns Xs into X′s and Ys into Y′s, and I call it over a series that may contain both Xs and Ys, then I get a series that may contain both X′s and Y′s (rather than either a series that contains only X′s or a series that contains only Y′s), don’t I?
That's exactly right, which is why conditional types do what they do by default. This code depends on distributivity, for example:
type Box<T> = { content: T };
type Unbox<T> = T extends Box<infer U> ? U : T;
declare function unBoxArray<T>(arr: T[]): Unbox<T>[];
declare const myStuff: Array<string | Box<number>>;
const output = unBoxArray(myStuff);
// ^?
// output: (string | number)[]
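Distributivity is doing the work here: Unbox<string | Box<number>> is evaluated member by member, so string falls through unchanged while Box<number> unwraps to number, giving (string | number)[]. Under a nondistributive reading, the whole union would fail the extends Box<infer U> check and come back as-is, still boxed.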
🔎 Search Terms
"conditional type", "array of union"
🕗 Version & Regression Information
⏯ Playground Link
https://www.typescriptlang.org/play/?ts=5.7.0-dev.20240920#code/C4TwDgpgBAsiAq4IB54D4oF4ryhAHsBAHYAmAzlAMoTDICGxIGA-DgNoC6UAXB5wG4AUENCQowLLARJkxAK4BbAEYQATgB8FK9VzQCoAekNQA5rUoEwjUhFISA9lG2q1XKBudLXXLqKQSAIxSABQu6lreupwAlHiEJBTUtAxMrFBhUZrhbrHufJk62VlcMVwGxmYW8dZkdo4ZOZFFpX62AMYANvRq0O0OxOSS9HzAgcL9g5LKo1L0wpVQSwB+QotLiOIA5IWuHl4teZxbUACWlMQOw+Tkp6bE9Mqd0MBOYtBbwFsAdCLvUPIpHBNigmkM1KdiKZ9EYTOZgJZ8LVbPZXgcfNxPODIaY-P95MFsLsItioXECEQyJQaHRGMwoGxiZpSaYytwCmDgBCyeVYVUETUbPU0UyNCy2UIOt1elBJkNZXwCRMBvLSIqpO0FiYllBVuscAEdiz9jk2Sdzs4rlB6Dc7g8ni83ob5D8gA
💻 Code
In the following example, t1 and u1 are manual expansions of the types t and u respectively: I simply copied the declaration of MyType<T> and replaced all appearances of T with the (properly parenthesized) actual type parameter. (Either the t/t1 example or the u/u1 example alone is enough to demonstrate what seems to be the same issue, but I included both of them in case the root causes are slightly different.)
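The reported code itself lives behind the playground link above and isn't reproduced here; the following is a hypothetical sketch of the pattern the report describes, with MyType, t, and t1 taken from the text and every concrete type invented:

// A conditional type whose result is an array type (invented body).
type MyType<T> = T extends null ? never[] : T[];

// Through the generic, T = string | number distributes, so t becomes
// a union of arrays: string[] | number[].
type t = MyType<string | number>;

// The manual expansion substitutes (string | number) for T textually;
// no distribution happens, so t1 is an array of the union.
type t1 = (string | number) extends null ? never[] : (string | number)[];

// The two are therefore not mutually assignable, which matches the
// reported compilation errors.
declare const expanded: t1;
const viaGeneric: t = expanded; // error: (string | number)[] is not assignable to string[] | number[]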
🙁 Actual behavior
Compilation errors.
🙂 Expected behavior
The code compiles: MyType<...> is also expanded to an array of unions, not to a union of arrays. As I wrote above, the {t,u}1 types are really just manual expansions of the type, so they should behave the same.
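For what it's worth, the usual way to get that array-of-unions expansion is to opt out of distribution by wrapping both sides of the extends check in a one-element tuple; whether this fits the real MyType is an assumption, since its declaration isn't shown here:

// Tuple-wrapping makes the check apply to the union as a whole
// instead of to each member, so no distribution occurs.
type MyTypeWhole<T> = [T] extends [null] ? never[] : T[];

type whole = MyTypeWhole<string | number>; // (string | number)[]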
Additional information about the issue
I tried to check existing (open and closed) issues, but there are so many that I may have missed a duplicate. Sorry in advance if that’s the case.