Open mikex-oss opened 2 months ago
Related: https://github.com/google/xls/issues/691
The notable additional feature mentioned there is the ability to join
on such a token array.
We've talked about this a bit, and even played around with a possible implementation, but I think the conclusion is that we are not likely to support a syntax like `update(send_toks, i, send(tok, chan[i], x))`, because tokens can't be aliased. With today's token model, I think there would need to be a map construct like:
```
let tokens = const map channel: chan<u32> in enumerate(channels) {
    send(join(), channel, data)
};
```
which would have an expansion like:
```
let token__0 = send(join(), channels__0, data);
let token__1 = send(join(), channels__1, data);
```

and so on, with no underlying arrays at all.
However, I think fundamentally the desire to put tokens in an array kind of pushes against their nature as more code labels (or IR node labels) than real variables. Either we should turn them into variables that can be aliased, or we should turn them into something like parameterizable code labels, as in:
```
unroll_for!(i, _): (u32, ()) in u32:0..N {
    send_of_x(i):: send(chan[i], x);
}()

unroll_for!(i, _): (u32, ()) in u32:0..N {
    after send_of_x(i):: let (data, _) = recv(chan[i]);
}()
```
Either way, because there is not a strong immediate need (and the above example just begs the question of why not use a unified loop), I think this will be deferred until we look at how implicit tokens and the like might be done.
I think an alternative approach would just be to make tokens valid subjects for array operations. As currently implemented, if XLS were to allow this, the result would be semantically equivalent to aliasing tokens wherever XLS's optimizations can see through the array operations at compile time, with a fallback to effectively joining all the tokens the expression could have aliased when the index is not resolvable at compile time.
Pros:
Cons:
We have played around with allowing lowering of `update(send_toks, i, send(tok, chan[i], x))` and `send_toks[i]`. Without the notion that `send_toks[i]` evaluates to a join of everything it might be, appropriate front-end guard rails are hard to install. Even with that idea, it feels to me like a broken metaphor that we have time to think about how to fix; i.e., this is far from being the biggest problem with tokens.
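To make that fallback concrete, here is a hypothetical DSLX sketch (token arrays do not type-check today; `send_toks`, `chan`, `x`, and `N` are illustrative names only):

```
// Hypothetical: build one token per channel element.
let send_toks = for (i, toks): (u32, token[N]) in u32:0..N {
    update(toks, i, send(tok, chan[i], x))
}(token[N]:[join(), ...]);

// Compile-time-constant index: optimizations can see through the array
// op, so this would behave like aliasing the 0th send's token exactly.
let t0 = send_toks[u32:0];

// Dynamic index j: not resolvable at compile time, so the fallback is
// effectively a join of every token the expression might alias:
//   send_toks[j]  ~>  join(send_toks[0], ..., send_toks[N-1])
// which serializes against all N sends rather than just the j-th one.
```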
What's hard to do? (limit 100 words)
Trying to take advantage of channel arrays now that #704 is fixed.
However, I think I'm blocked by the fact that I can't track the tokens for each element of the channel array separately. The test example that shipped with the feature only shows sequential token threading:
https://github.com/google/xls/blob/b117a11ef67534bba97f8562b024278fb9ba7ee5/xls/dslx/ir_convert/proc_config_ir_converter_test.cc#L305-L311
However, if I'd like each channel in the array to operate in parallel, I want to save all those tokens so that I can sequence downstream sends/receives based on the specific token element.
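Concretely, the shape I'm after is something like this (hypothetical DSLX, since token arrays are rejected today; `req`, `resp`, and `data` are made-up names):

```
// Fire off all N sends in parallel, remembering each send's token.
let toks = for (i, toks): (u32, token[N]) in u32:0..N {
    update(toks, i, send(tok, req[i], data[i]))
}(token[N]:[join(), ...]);

// Downstream, sequence each recv on its own channel's token only, so
// channel i's recv does not serialize behind the other N-1 sends.
let (tok_i, r) = recv(toks[i], resp[i]);
```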
I tried out token arrays and it first fails on:
I tried hacking `BuiltinTypeAnnotation` to define a signedness for tokens, but then it just trips over thinking that it is an `sN` or `uN` type.

Current best alternative workaround (limit 100 words)
I'm not aware of any workaround aside from keeping the sends/receives manually unrolled.
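For illustration, the manual unrolling looks roughly like this (a sketch with made-up channel names `req`/`resp`; only the first two elements shown):

```
// One named token per channel element keeps the pairs independent:
// the recv on resp[0] waits only on the send to req[0], and so on.
let tok0 = send(tok, req[u32:0], data[u32:0]);
let tok1 = send(tok, req[u32:1], data[u32:1]);
let (_, r0) = recv(tok0, resp[u32:0]);
let (_, r1) = recv(tok1, resp[u32:1]);
```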
Your view of the "best case XLS enhancement" (limit 100 words)
Would like to be able to do something like:
Could something like how we handle channel arrays be applied to tokens as well?