colinhacks / zod

TypeScript-first schema validation with static type inference
https://zod.dev
MIT License

Incorrect Type Inference for Non-Optional Field with z.unknown() #2966

Open dilame opened 1 year ago

dilame commented 1 year ago

I encountered an unexpected behavior in Zod's type inference system when using z.unknown(). The issue arises when defining an object schema with a field that should be required, but the inferred TypeScript type incorrectly marks it as optional.

import { z } from 'zod';

const test = z.object({
  shouldExist: z.unknown(),
});
type Test = z.infer<typeof test>;
// Expected: { shouldExist: unknown }
// Actual: { shouldExist?: unknown } (Incorrect, field should not be optional)

Expected Behavior: The expected behavior is for the type Test to be inferred as { shouldExist: unknown }, indicating that shouldExist is a required field of unknown type.

Actual Behavior: The actual inferred type is { shouldExist?: unknown }, incorrectly suggesting that the shouldExist field is optional.

JacobWeisenburger commented 1 year ago

In Zod, anything that can be undefined is considered optional. Since undefined extends unknown, Zod treats it as optional.

Either of these should fix the runtime behavior for you, but unfortunately neither will fix the types.

import { z } from 'zod'

const schema = z.object( {
    shouldExist1: z.unknown().refine( x => x !== undefined, 'Required' ),
    shouldExist2: z.custom<unknown>( x => x !== undefined, 'Required' ),
} )

const result = schema.safeParse( {} )
result.success
    ? console.log( result.data )
    : console.log( result.error.issues )
// [
//     {
//         code: "custom",
//         message: "Required",
//         path: [ "shouldExist1" ]
//     }, {
//         code: "custom",
//         message: "Required",
//         fatal: true,
//         path: [ "shouldExist2" ]
//     }
// ]
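
For context, here is a simplified, non-generic sketch of the mapped-type pattern behind this behavior. It illustrates the idea only; it is not Zod's actual source.

type Shape = { shouldExist: unknown };

// Keys whose value type accepts undefined are collected as optional.
// Since `undefined extends unknown` is true, shouldExist lands here.
type OptionalKeys = {
  [K in keyof Shape]: undefined extends Shape[K] ? K : never;
}[keyof Shape]; // "shouldExist"

type RequiredKeys = {
  [K in keyof Shape]: undefined extends Shape[K] ? never : K;
}[keyof Shape]; // never

// Optional keys get a question mark, required keys stay as-is.
type Result = Partial<Pick<Shape, OptionalKeys>> & Pick<Shape, RequiredKeys>;
// { shouldExist?: unknown }
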
dilame commented 12 months ago

I respectfully disagree with the classification of this issue as not-intuitive-behavior. The behavior where z.unknown() in a Zod object schema infers an optional field directly contradicts TypeScript's type system expectations. In TypeScript, { a: unknown } and { a?: unknown } represent fundamentally different concepts: the former is a required field, whereas the latter is an optional one. This distinction is not only crucial for type-checking and data integrity; it also significantly affects runtime behavior.

To illustrate, at runtime 'a' in {} and 'a' in { a: undefined } yield different results, underlining the importance of this distinction. The current Zod implementation seems to overlook this aspect, leading to potential confusion and bugs, especially in scenarios where strict type adherence and runtime behavior are critical.
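
A minimal snippet showing that runtime difference:

console.log('a' in {});                // false: the key is absent
console.log('a' in { a: undefined });  // true: the key is present, its value is undefined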

While I appreciate the suggested workarounds using z.unknown().refine or z.custom for runtime solutions, they do not address the core issue of TypeScript compile-time type inference. This discrepancy is more than just non-intuitive behavior; it represents a deviation from established TypeScript standards.

Considering the importance of consistency with TypeScript's type system and the practical runtime implications, I believe this issue should be recognized as a bug. Addressing it accordingly would align Zod more closely with TypeScript standards and provide a more predictable and reliable experience for developers.

JacobWeisenburger commented 12 months ago

Very well written argument. You have won me over. Thank you for being respectful too. It goes a long way.

schicks commented 8 months ago

Silly workaround for the type-level problem for now:

const Parser = z.object({
    key: z.unknown()
}).transform(({key}) => ({key}))

Because Zod then treats the resulting type as the type after passing through the transformer, it correctly picks up the TypeScript behavior of treating the key as non-optional, since it was explicitly provided. This has the opposite issue of the workarounds @JacobWeisenburger presented: I think it would allow through values that do not have the key. To get both, you would need to first refine the object to confirm that the key was present, then transform to get the type to admit that, as sketched below.
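
A sketch of that combination, using the same key name as above: the refine rejects inputs where the key is absent, and the transform makes the inferred type non-optional.

import { z } from 'zod';

const Parser = z
  .object({ key: z.unknown() })
  .refine((obj) => 'key' in obj, { message: 'Required', path: ['key'] })
  .transform(({ key }) => ({ key }));

type Parsed = z.infer<typeof Parser>; // { key: unknown }, required at the type level

Parser.safeParse({});                 // fails with "Required"
Parser.safeParse({ key: undefined }); // succeeds, matching { key: unknown } semantics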

shaharke commented 7 months ago

I would like to note that this also results in incorrect parsing behaviour. The following code will run without errors even though value is a required property:

import { z } from "zod";

const Success = z.object({ value: z.unknown() });
Success.parse({ foo: "bar" }); // parses successfully; no error is raised for the missing value key
cimchd commented 3 months ago

In case someone else still has this problem (like me today), I ended up defining the unknown field as optional:

import { z } from 'zod';

const test = z.object({
  field: z.unknown().optional(),
});
type Test = z.infer<typeof test>;
// Expected: { field?: unknown }
// Actual: { field?: unknown }

This resolved all the type inconsistencies I was seeing after using z.infer.

freddie-freeloader commented 3 months ago

@cimchd The whole point of this issue is that you don't want the field to be optional.

cimchd commented 3 months ago

@cimchd The whole point of this issue is that you don't want the field to be optional.

I know. I just wanted to leave this workaround here in case it helps someone else. I don't know if this will ever change in zod.

flex-hyuntae commented 2 months ago

I agree with this: in TypeScript, { a: unknown } and { a?: unknown } represent fundamentally different concepts.

flex-hyuntae commented 1 month ago

Same issue? https://github.com/colinhacks/zod/issues/1628#issuecomment-2388637028

dmeehan1968 commented 1 month ago

This also seems to trip up .required(). Docs: "the .required method makes all properties required."

import { z } from 'zod'

const schema = z.object({
  field: z.unknown()
}).required()
// Expected: { field: unknown }
// Actual: { field?: unknown }

In this case, .required() has not resulted in field being required.

I'm also with @dilame: just because unknown accepts undefined does not mean the field should be optional.

NB: z.any() also exhibits this undesirable behaviour.
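
To make the z.any() note concrete, a quick sketch (illustrative names only):

import { z } from 'zod';

const anySchema = z.object({ field: z.any() });
type AnyTest = z.infer<typeof anySchema>;
// Inferred as { field?: any } rather than { field: any },
// because undefined extends any, just as it extends unknown.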