Some more comments from @otrho on the above from a Slack discussion:
So if a macro doesn't have a return value, how is it 'called'? At the moment we expect a function to return a value as a part of either a let initialiser, which is a constraint in itself, or in a constraint expression. Is it enough to just have it in a constraint like
constraint my_macro(a, b, c);
Yeah, that'd work. And obviously macros which don't return anything can't be used in a let constraint since it wouldn't type check. Hrmm, needs some more thought.
I've been exploring this idea a bit more @mohammadfawaz.
The idea that functions don't have to be expressions, or at least don't have to return one, works pretty well if you just want to batch-apply some constraints to some parameters.
fn is_even_and_in_range(x: int, low: int, high: int) {
constraint x % 2 == 0;
constraint x >= low;
constraint x <= high;
}
But this implies you'd want to use it in the top level context, like a declaration:
let a: int;
is_even_and_in_range(a, 10, 20);
We don't allow function 'calls' to be at the top level; they aren't Decls. But I'm thinking they should be, so we can do the above?
And maybe calling them should have a different syntax to calling extern functions for state, e.g., like Rust's is_even_and_in_range!(a, 10, 20);
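As a point of comparison, here's a minimal Rust sketch (purely an analogue; the Yurt syntax above is still hypothetical) of a declarative macro that batch-applies checks without returning a value, invoked with the distinct ! syntax:

// Rust analogue: a macro invoked in statement position, expanding to several
// checks in place rather than returning a value.
macro_rules! is_even_and_in_range {
    ($x:expr, $low:expr, $high:expr) => {
        assert!($x % 2 == 0);
        assert!($x >= $low);
        assert!($x <= $high);
    };
}

fn main() {
    let a = 12;
    is_even_and_in_range!(a, 10, 20); // expands here; there is no 'call' and no return value
}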
Still working on this; I've actually been implementing function inlining. And the more I work on it, and the more tests I write as I go, the more I feel that we simply shouldn't have 'functions' and should truly only support macros.
In a declarative language with no control flow we're not making 'calls', as a call traditionally implies jumping from where you are now to some other code and then returning from it... all control flow. All our functions will just be inlined, which is an expansion (or beta-reduction, in the PLT sense).
Practically, if we make functions macros we can drop the return type altogether (no parsing of -> type), and maybe drop argument types too, and we can perform inlining/expansion before type checking. This will make the type checker a bit simpler, far less to check.
But, like other languages (e.g., Rust), if there is a type error originating from a macro expansion it can be hard to report. The type error is in code inside the macro (because it was applied to the wrong type(s) of arg(s)), but by the time we're type checking we'll be reporting the error at the 'call' site rather than in the macro. We can, however, keep this in mind while implementing expansion and do our best to report type errors at both the call site and within the macro.
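A small hedged Rust illustration of that failure mode (double! is made up for the example):

// The `+` lives inside the macro body, but a bad use only surfaces as a type
// error once the expansion is type checked at the call site.
macro_rules! double {
    ($x:expr) => { $x + $x };
}

fn main() {
    let n = double!(21);        // fine: expands to `21 + 21`
    println!("{n}");
    // let s = double!("hi");   // type error: `+` isn't defined for two `&str`s; the
    //                          // diagnostic has to be stitched together from the macro
    //                          // body and this call site, which is the reporting
    //                          // problem described above.
}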
But also, by having untyped macros instead of functions we can add a sense of generality. Poor man's generics.
E.g.,
macro array_in_bounds(nm, ty, mn, mx) {
let nm: ty[3];
constraint nm[0] >= mn && nm[0] <= mx;
constraint nm[1] >= mn && nm[1] <= mx;
constraint nm[2] >= mn && nm[2] <= mx;
}
array_in_bounds(ai, int, 0, 5);
array_in_bounds(ar, real, 0.0, 5.0);
This is a crap example, but it's basically suggesting a proper macro system like those in Rust or Lisp is probably worthwhile in a declarative language. We are essentially adding compile-time evaluation, and could have constructs like looping within the macro system, so e.g., take a look at basic_tests/golf.yrt. Instead of all those constraint decls it could be the following:
macro array_in_bounds(nm, ty, sz, mn, max) {
let nm: ty[sz];
for i 0 sz {
constraint nm[i] in mn..max;
}
}
array_in_bounds(h, int, 18, 3, 5);
Or more realistically you'd just declare h and use a for loop without a new macro. But the for loop itself is a macro and is obviously evaluated at compile time.
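As a rough Rust analogue of the "poor man's generics" angle (hedged: the loop below runs at runtime, whereas the Yurt for above would be unrolled during macro expansion), a single macro body serves any element type that supports the comparisons it uses:

macro_rules! array_in_bounds {
    ($arr:expr, $mn:expr, $mx:expr) => {
        for v in $arr.iter() {
            assert!(*v >= $mn && *v <= $mx);
        }
    };
}

fn main() {
    let ai = [1, 2, 3];
    let ar = [0.5, 2.5, 4.5];
    array_in_bounds!(ai, 0, 5);     // integer elements...
    array_in_bounds!(ar, 0.0, 5.0); // ...and real ones, with no generic bounds written anywhere
}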
While I like having macros as you explain above, I do like "function calls" as expressions. For example, the standard library may have a "function" called sum that takes an array and returns the sum of its elements (I guess we haven't talked about the semantics of array types). Do you imagine sum here to also be a macro? I guess my question is: can we still achieve the same functionality and the same ergonomics of regular function calls with the macro system above?
I think it makes sense to have sum(ary) as a macro which expands into ary[0] + ary[1] + .... It is fairly straightforward, and sum itself could be written in a macro language given we have a way to get the array size at compile time.
macro sum(ary) {
sum_helper(ary, 0)
}
macro sum_helper(ary, i) {
if i < len(ary) {
ary[i] + sum_helper(ary, i + 1)
} else {
0
}
}
So
let a: int[3] = [1, 2, 3];
let b = sum(a);
expands to
let a: int[3] = [1, 2, 3];
let b = a[0] + a[0 + 1] + a[0 + 1 + 1] + 0;
The if would have to be evaluated/expanded at compile time, otherwise we'd not hit the end of the recursion, which is a bit weird. Hrmm. But it's powerful.
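For what it's worth, here's a hedged Rust sketch of the same idea; macro_rules! ends the recursion by pattern-matching on what's left of the argument list, which plays the role of the compile-time if:

macro_rules! sum {
    // base case: a single remaining expression
    ($last:expr) => { $last };
    // recursive case: peel off the head and recurse on the tail at expansion time
    ($head:expr, $($tail:expr),+) => { $head + sum!($($tail),+) };
}

fn main() {
    let a = [1, 2, 3];
    let b = sum!(a[0], a[1], a[2]); // expands to a[0] + (a[1] + a[2])
    assert_eq!(b, 6);
    let r = sum!(1.5, 2.5);         // same macro, different element type
    assert_eq!(r, 4.0);
}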
And the beauty is sum isn't typed. So ary could be either numeric type, int or real, as long as + works for it.
If sum was a typed function we'd need sum_int_array and sum_real_array or something. OTOH, if sum is called with an array of strings then we'd get a type error about + not working, which would need work to not be cryptic.
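For contrast, a hedged Rust sketch of what the typed-function route costs: either one overload per element type (the sum_int_array/sum_real_array situation) or real generics with explicit bounds, which is machinery Yurt would otherwise have to grow:

use std::ops::Add;

// The bounds spell out everything the macro version gets "for free" by
// expanding before type checking.
fn sum_generic<T: Add<Output = T> + Default + Copy>(xs: &[T]) -> T {
    xs.iter().fold(T::default(), |acc, &x| acc + x)
}

fn main() {
    assert_eq!(sum_generic(&[1, 2, 3]), 6);
    assert_eq!(sum_generic(&[1.5, 2.5]), 4.0);
}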
The alternative, I think, is to have standard library 'functions' like sum be a part of the spec and just emitted in the final intent, and for solvers to know what to do with them. So then we have to spec it out properly for solvers so there's no ambiguity (OK, sum is simple, but if we have a whole library some will be more complex...) and start to think about backward compatibility/support/versioning.
The more I work on this and think about it, the more I'm convinced we should not have functions and we should only have macros.
And I think we should make block expressions more first-class, and use them in if/else and macros when they want to return an expression alongside some other decls.
Two main points -- our functions are already essentially statically typed macros, especially if/when we allow them to be called from anywhere; and macros would allow generics to a degree, enough I think to service the needs of Yurt devs.
Treating this issue as the "Spec" issue. @otrho already has a branch with the spec changes.