masak opened 3 years ago
APL can actually do "under"... ish. Its power operator ⍣ (Star Diaeresis) repeats an operation N times, and N can be ¯1, meaning the inverse.
It's gonna be really hard to show a legible example, because of how APL reads in general, but I'll still try:
```apl
(×∘2)⍣¯1⊢3   ⍝ The whole operation
(×∘2)        ⍝ A partially applied multiplication operator, in Raku this'd be *×2
⍣¯1          ⍝ Repeated ¯1 times, i.e. the inverse of this function
⊢3           ⍝ Applied to 3
```
This results in 1.5.
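To make the mechanics concrete in something runnable, here is a sketch of the power operator in Python. Note one big difference: APL can derive the inverse of many functions itself, whereas this sketch has to be handed one explicitly (the `inverse` parameter is my own addition, not part of APL).

```python
# A sketch of APL's power operator ⍣. Unlike APL, we can't derive inverses,
# so a negative repeat count requires an explicitly supplied inverse.
def power(f, n, inverse=None):
    """Apply f n times; a negative n applies the inverse -n times."""
    if n < 0:
        if inverse is None:
            raise ValueError("a negative repeat count needs an explicit inverse")
        f, n = inverse, -n
    def repeated(x):
        for _ in range(n):
            x = f(x)
        return x
    return repeated

double = lambda x: x * 2
halve = lambda x: x / 2

# (×∘2)⍣¯1⊢3  →  1.5
print(power(double, -1, inverse=halve)(3))  # → 1.5
```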
The way you could have "dual" would be to first apply the "forward" operation, then the whole block, then the inverse of the "forward" operation.
I'm gonna post a "dual" here. I can detail it, but it's gonna be hard to read no matter what.
Here, `⍺⍺` is "function to the left", `⍵⍵` is "function to the right", and `⍵` is "argument to the right".
```apl
f←{⍺⍺⍣¯1⊢⍵⍵⍺⍺⍵}
(×∘2)f(+∘2)⊢5
6
```
`(×∘2)` is the function to the left (`⍺⍺`), `(+∘2)` is the function to the right (`⍵⍵`), and `5` is the argument to the right (`⍵`).
So what happens is this:
```apl
(×∘2)⍣¯1⊢2+2×5
```
Which calculates ((2×5)+2)/2 = 6.
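In more conventional notation, `f` is the classic conjugation pattern: apply `g`, then the body, then `g`'s inverse. A sketch in Python (where, again, the inverse has to be supplied by hand):

```python
# A Python sketch of the APL dual f←{⍺⍺⍣¯1⊢⍵⍵⍺⍺⍵}: apply g ("forward"),
# then h (the body), then g's inverse ("backward"). Python can't derive
# inverses, so g_inverse is passed in explicitly.
def dual(g, g_inverse, h):
    return lambda x: g_inverse(h(g(x)))

double = lambda x: x * 2
halve = lambda x: x / 2
add2 = lambda x: x + 2

# (×∘2)f(+∘2)⊢5  →  6
print(dual(double, halve, add2)(5))  # → 6.0
```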
A bit long-winded for a narrow use case, but the sentiment is the same. Dyalog APL Extended, which is an extension to Dyalog APL, has this "under" function as the operator ⍢.
@vendethiel Thank you for the explanation. It's good to know about prior art. I agree APL is somewhat opaque, but (thanks to your explanations), I think I actually got it.
It feels to me the `under` macro is doing something more, semantically. The APL version passes values around, transforming them forwards and backwards fairly explicitly. The `under` macro implicitly injects/wraps `forward` and `backward` calls around values which cross into and out of (respectively) the lexical environment of the `under` block.

I don't recall seeing that technique before; the closest thing it reminds me of is our treatment of free variables in `quasi` blocks (which turn into "direct lookups" per #410). Maybe variable interpolations in Raku regexes could be considered a variant of this pattern, too. `under` is lexically changing the "meaning" of free variables; I'm a bit surprised myself at how OK I am with that. It feels slightly related to `exists` (#292) and `delete` (#290) in that it "changes how we evaluate something" (but via a syntax transformation).
So, I was checking out the examples, and came to the implementation of Nim addition:
It got me thinking about the ideal shape this code "wants" to have. And suddenly a feature jumped out at me that I guess I've been half-thinking of for years — doing some calculation "under a transform":
Briefly:

- `BitVectorTransform` is a class, with two static methods: `forward` and `backward`. Think of it as `BitVectorTransform<Outer, Inner>`, and the two methods as having signatures `forward(o: Outer): Inner` and `backward(i: Inner): Outer`.
- The `under` block gets code-generated to a regular block statement containing the original code, except variables declared outside this block are wrapped with `forward` or `backward` calls, depending on whether they are reads or writes, respectively.
- `zipPadLeft` is kind of a bad idea (but the problem is genuine, and I encountered something very similar in the samefringe musing).

The `BitVectorTransform` class looks like what you'd expect:

Technically, we should be able to skip the intermediate `result` variable, and just lean on the fact that a returned value also counts as "something passing out of the `under` block":

I'm just saying. It's starting to look attractive.
The idea of an `under` block must have been stewing in me for a while. I remember first being impressed by "doing X under a transform" when I learned that logarithms turn multiplication into addition, and exponentiation into multiplication. There are also things like Fourier and Laplace transforms.

Something about the shape of the solution also reminds me of Python's `with` statement. But `with` is concerned with establishing a dynamic context, trapping control flow going in and out; here we're concerned with establishing a transformation, um, bubble: trapping values going in and out.
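The logarithm example is small enough to show the whole "bubble" at value level (the helper names `under` and `mul_via_add` are mine, not from the proposal):

```python
import math

# The logarithm example in miniature: apply forward on the way in and
# backward on the way out, so that doing addition under log amounts to
# multiplication.
def under(forward, backward, body):
    return lambda *xs: backward(body(*map(forward, xs)))

mul_via_add = under(math.log, math.exp, lambda a, b: a + b)

print(mul_via_add(8.0, 4.0))  # ≈ 32.0
```

Unlike the proposed macro, this version only transforms explicit arguments and the return value; the macro's novelty is doing the same implicitly for every value crossing the block's lexical boundary.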