drewrip / rascal

Type inference of numeric literals is limited #12

Open drewrip opened 3 weeks ago

drewrip commented 3 weeks ago

I just finished adding the last few pieces to make the full suite of expected primitive numeric types work: i32, i64, u32, u64, f32, and f64. I chose to limit the degree to which the types of integer and floating point literals can be inferred.

For example:

let x: float32 = 0.4;
let y: float32 = x * 0.5;

won't pass type checking, because any floating point literal that doesn't carry an explicit f32 or f64 suffix is assumed to be a float64, so x * 0.5 is float32 * float64. This could be fixed by writing the expression as x * 0.5f32, which is more explicit but also more verbose. The same issue exists for the integer types.
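
So the suffixed version that does pass would look like:

let x: float32 = 0.4;
let y: float32 = x * 0.5f32;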

I think there could be an alternative where an unsuffixed literal like 0.5 takes on a generic FloatingPoint type during parsing, and then has its concrete type resolved when it is encountered during type checking.
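
Roughly what I'm imagining, as a sketch in Rust (Type, unify, etc. are made-up names for illustration, not the actual compiler internals):

#[derive(Debug, Clone, PartialEq)]
enum Type {
    F32,
    F64,
    FloatLit, // placeholder given to any unsuffixed float literal at parse time
}

// During type checking, a generic literal takes on whatever concrete
// float type the surrounding expression demands.
fn unify(expected: &Type, found: &Type) -> Result<Type, String> {
    match (expected, found) {
        (Type::FloatLit, t) | (t, Type::FloatLit) => Ok(t.clone()),
        (a, b) if a == b => Ok(a.clone()),
        (a, b) => Err(format!("type mismatch: {:?} vs {:?}", a, b)),
    }
}

// Anything still generic once checking finishes falls back to the current default.
fn resolve_default(t: Type) -> Type {
    if t == Type::FloatLit { Type::F64 } else { t }
}

With that, x * 0.5 checks as float32 * FloatLit and unification pins the literal to float32, while a bare let z = 0.5; would still default to float64.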

THE-COB commented 1 week ago

If there's a generic FloatingPoint, could there also be a generic IntegerType? That feels better than masking, because masking could get more complicated if we wanted to support even more sizes of types, and we'd have to support similar logic for ints regardless. I don't really know what I'm talking about, but those are my thoughts.

drewrip commented 1 week ago

Yeah, for sure. I think having an IntegerType makes a lot of sense too. In general, any primitive types that behave the same but come in varying widths (i.e. 8/16/32/64-bit signed integers) should probably start out as some more general type, with inference figuring out which concrete width is valid. I'm currently working on making the type system more robust, so this will probably be easier to implement once some of those features are in place.
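
Concretely, the placeholder idea could extend per family, something like this (again just a sketch with made-up names, not real code):

#[derive(Debug, Clone, PartialEq)]
enum Type {
    I8, I16, I32, I64,
    F32, F64,
    IntLit,   // any unsuffixed integer literal
    FloatLit, // any unsuffixed float literal
}

// A generic literal may only be pinned to a concrete type in its own family;
// an unpinned IntLit would presumably default to i64, mirroring the float64 rule.
fn pin(lit: &Type, concrete: &Type) -> Option<Type> {
    match (lit, concrete) {
        (Type::IntLit, Type::I8 | Type::I16 | Type::I32 | Type::I64)
        | (Type::FloatLit, Type::F32 | Type::F64) => Some(concrete.clone()),
        _ => None,
    }
}

The unification and defaulting logic would then be shared across both families.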

For whatever reason I was questioning whether you'd want this behavior at all, since you could then no longer know the type of a literal just by looking at it. But that's silly; inferring the type of a literal is much more useful in the end :)

THE-COB commented 1 week ago

Yeah, I feel like in the cases where you'd want to see the individual sizes, you'd probably be better off just deferring to the larger sizes anyway, to avoid potential overflow and the like.