Open DetachHead opened 1 week ago
Should `tuple` types be inferred as literals?

```py
a = 1,
a = 2,  # error
```
The first example already doesn't have any error, at least with the "basic" level of type checking.
```py
a = 3      # inferred as Literal[3]
a = a / 2  # no error, the type of 'a' has changed to float
```
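For contrast, a minimal sketch (standard `typing` only, not from the original thread) showing that an explicit annotation pins the declared type, so a reassignment like the one above would be flagged instead of silently widening:

```python
from typing import Literal

a: Literal[3] = 3  # declared type is Literal[3], not int

# With the explicit annotation, a checker would report an error here
# rather than changing the type of 'a' to float:
# a = a / 2  # error: "float" is not assignable to "Literal[3]"
```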
@prashantraina that's because it's currently special-cased to behave that way. This issue is looking to make that special-cased behaviour standardised, denotable, and applicable to any type.
currently, inference in all type checkers works via a bunch of special-cased heuristics that are unclear to the user.
the type system should be able to declare inference rules. for example, the type `int` could define its inferred type as `int | float` (as a way to fix #319). maybe something like this:
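(The proposal snippet itself is missing from the extracted thread. Purely as an illustration of the idea, here is one hypothetical shape it could take — the `__infer__` name is invented here and is not part of any type checker or PEP:)

```python
# Purely hypothetical sketch: a type declares what type a checker should
# infer for unannotated assignments of its values. "__infer__" and "Int"
# are invented names for illustration only.
class Int:
    # A checker implementing this rule would infer unannotated
    # Int-valued assignments as "Int | float", so later reassignment
    # to a float (as in the division example above) would be allowed
    # and visible in the inferred type.
    __infer__ = "Int | float"

x = Int()  # hypothetically inferred as: Int | float
```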