A roadmap for upcoming plans:

- [x] Implement StackPointer support, adding the StackPointer pass. After this change, I'll start removing a lot of the `@inline`s. These were added to let the compiler avoid heap-allocating temporaries, like mutable structs. With the stack pointer they become unnecessary, and we'll be able to rely on LLVM's and Julia's own inlining heuristics. (A sketch of the idea appears after this list.)
- [x] Transpose the adjoints / autodiff machinery. `C * a` is much faster than `a' * B`, where `C' = B`, `C` and `B` are column-major matrices, and `a` is a vector. We had been using the latter because that's what you find in the literature on reverse-mode diff. (See the equivalence sketch after this list.)
- [ ] Support broadcasting.
- [ ] Implement broadcast-reducers, which fuse with broadcasting and reduce. This is for distributions, so that `@. y ~ Normal(x * beta + alpha, sigma)` will not create any temporaries, and will compile down into a single pass that calculates the log density (and, optionally, the gradients). (A hand-fused sketch of the target follows this list.)
- [ ] Allow fusion of matrix multiplication, so that `y .~ Normal.(X ⋆ beta, sigma)` fuses into a single pass, like the above (but this time, `X` is a matrix). I figure `⋆` is a reasonable operator for lazy multiplication that fuses with broadcasts. Note that it is `⋆`, i.e. `\star[tab]`, rather than `*`. I haven't settled on the symbol yet, but picked this one because it should give you the exact same answer as `*` (for matrix multiplication), except be lazy. Here is the full list of possibilities (operators with the same precedence and associativity as `*`). Suggestions on symbols are welcome! If we choose something that looks less similar to `*` (like `×`), we could also support `@. y ~ Normal(X × beta, sigma)` as meaning multiplication. I believe `×` (`\times[tab]`) is used for cross products. (A toy lazy-`⋆` sketch also follows this list.)
- [ ] Add good tests with solid coverage for this library and all of its dependencies!
- [ ] Add documentation.
- [ ] Figure out a convenient extension API to make this library easier to hack on and extend.
- [ ] Make sure all models from StatisticalRethinking work / can be implemented in a reasonable fashion.
- [x] Support `getindex`.
- [ ] Support `setindex!`.
- [ ] Support control flow: `if`, `else`, `?`, `for`, and `while`.
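On the StackPointer item: here is a minimal sketch of the idea in plain Julia. The names (`StackBuffer`, `alloca!`, `axpy_demo`) are invented for illustration and are not this library's API; the point is only that temporaries draw their storage from one preallocated buffer, with an offset threaded through calls, rather than from the GC heap.

```julia
# Hypothetical stack-pointer sketch: temporaries live in one preallocated
# buffer; callees receive the current offset and return the bumped one.
struct StackBuffer
    data::Vector{Float64}
end

# "Allocate" n elements by taking a view and bumping the offset.
function alloca!(buf::StackBuffer, sp::Int, n::Int)
    v = view(buf.data, sp+1:sp+n)
    return sp + n, v
end

# A function needing a temporary receives the buffer and current offset.
function axpy_demo(buf, sp, a, x, y)
    sp, tmp = alloca!(buf, sp, length(x))  # temporary's storage comes from buf
    @. tmp = a * x + y
    return sp, tmp
end

buf = StackBuffer(zeros(64))
sp, r = axpy_demo(buf, 0, 2.0, [1.0, 2.0], [3.0, 4.0])
# r == [5.0, 8.0]; nested calls keep bumping `sp`, reusing the same buffer.
```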
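On the transposed adjoints: a small sketch of the equivalence being exploited, in plain Julia with no reference to this library's internals. The claim above is that the `C * a` form is friendlier to column-major layout; benchmarking it is left to the reader.

```julia
using LinearAlgebra

B = randn(1000, 1000)
a = randn(1000)
C = permutedims(B)   # store the transposed matrix once, so that C' == B

r1 = (a' * B)'       # the form usually written in the reverse-diff literature
r2 = C * a           # the transposed form: a plain matrix-vector product
@assert r1 ≈ r2      # same result; C * a walks C's columns contiguously
```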
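To make the broadcast-reducer goal concrete, here is a hand-written version of the single pass that `@. y ~ Normal(x * beta + alpha, sigma)` should compile down to. This is a sketch of the target, not generated code, and the function name is made up.

```julia
# One pass over x and y: no temporary arrays, just the accumulated log density.
function normal_logdensity(y, x, beta, alpha, sigma)
    s = 0.0
    @inbounds @simd for i in eachindex(y, x)
        mu = x[i] * beta + alpha
        z = (y[i] - mu) / sigma
        s -= 0.5 * z * z
    end
    return s - length(y) * (log(sigma) + 0.5 * log(2π))
end

y = randn(100); x = randn(100)
normal_logdensity(y, x, 0.5, 1.0, 2.0)
```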
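And a toy version of the lazy `⋆`: Julia already parses `⋆` as an infix operator with `*`'s precedence, so a package only has to give it a meaning. `LazyMul` and the reducer below are invented names for illustration; a real implementation would integrate with the broadcasting machinery and block its loops for column-major `X`.

```julia
struct LazyMul{M<:AbstractMatrix,V<:AbstractVector}
    X::M
    b::V
end
⋆(X::AbstractMatrix, b::AbstractVector) = LazyMul(X, b)  # record, don't compute

# Consume X ⋆ beta row by row, so X * beta is never materialized.
function normal_logdensity(y, l::LazyMul, sigma)
    s = 0.0
    @inbounds for i in eachindex(y)
        mu = 0.0
        for j in eachindex(l.b)      # dot product of row i of X with b
            mu += l.X[i, j] * l.b[j]
        end
        z = (y[i] - mu) / sigma
        s -= 0.5 * z * z
    end
    return s - length(y) * (log(sigma) + 0.5 * log(2π))
end

X = randn(100, 3); beta = randn(3)
y = X * beta .+ 0.1 .* randn(100)
normal_logdensity(y, X ⋆ beta, 0.1)
```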