leios opened 1 year ago
Note that I tried to partially fix the lengthy compilation times by minimizing the number of functions that need the `@generated` macro, but those functions are configured with keyword arguments, and the keyword arguments also need a loop, so we did not actually save any operations.
Might be time to put on some adult pants and "just do it" by creating my own `@generated` macro for Fable: https://github.com/JuliaLang/julia/blob/d49a3c74c97c9ca9ef711d644a3f2b1c02b38d63/base/expr.jl#L1074
Here is `@generated`:
```julia
macro generated(f)
    if isa(f, Expr) && (f.head === :function || is_short_function_def(f))
        body = f.args[2]
        lno = body.args[1]
        return Expr(:escape,
                    Expr(f.head, f.args[1],
                         Expr(:block,
                              lno,
                              Expr(:if, Expr(:generated),
                                   body,
                                   Expr(:block,
                                        Expr(:meta, :generated_only),
                                        Expr(:return, nothing))))))
    else
        error("invalid syntax; @generated must be used with a function definition")
    end
end
```
I believe we can just save all the exprs from user fums and splat them all into a big function at the end. I don't know if this will actually cut compile time, but it does...
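A minimal sketch of that idea (all names here are hypothetical, not Fable's actual API): collect each fum's `Expr` in a global list, then splice them all into a single function definition once at the end, so only one `@eval`/compile happens instead of one `@generated` function per fum.

```julia
# Hypothetical sketch: accumulate user exprs, then eval one big function.
const FUM_EXPRS = Expr[]

register_fum!(ex::Expr) = push!(FUM_EXPRS, ex)

function build_final(name::Symbol)
    body = Expr(:block, FUM_EXPRS...)
    # One eval at the end instead of one @generated function per fum.
    @eval function $name(x)
        $body
        return x
    end
end

register_fum!(:(x = x + 1))
register_fum!(:(x = x * 2))
build_final(:final_fum)
@assert final_fum(1) == 4
```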
So we have something like:

- `@fi x` to mark a variable as special (something like SymbolicUtils)
- `@fum fx` to create custom functions
- `@fo (fx, fx, fx)` to combine everything into a single operator
- `@fee (fo, fo, fo)` to combine everything into the final executable

This was the original vision for the code.
```julia
julia> ex = Expr(:escape, :(x = 5))
:($(Expr(:escape, :(x = 5))))

julia> macro call_ex()
           ex
       end
@call_ex (macro with 1 method)

julia> @call_ex()
5
```
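The `Expr(:escape, ...)` (equivalently `esc`) is what lets the returned expression touch the caller's variables instead of having them renamed by macro hygiene; a small sketch of the difference:

```julia
# Without esc, hygiene would gensym `x`; with esc, the caller's `x` is assigned.
macro set_x(val)
    return esc(:(x = $val))
end

x = 0
@set_x 5
@assert x == 5
```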
Lots of notes to toss from Slack:
- `@generated` in Julia. You can do it in C because ptrs are defined in modules that all know each other.
- `@noinline` for `@generated` functions

```julia
julia> struct VTable{T}
           funcs::T
       end

julia> @generated function (VT::VTable{T})(fidx, args...) where T
           N = length(T.parameters)
           quote
               Base.Cartesian.@nif $(N+1) d->fidx==d d->return VT.funcs[d](args...) d->error("fidx oob")
           end
       end

julia> VT = VTable(((x)->x+1, (x)->x+2))
VTable{Tuple{var"#2#4", var"#3#5"}}((var"#2#4"(), var"#3#5"()))

julia> VT(1, 2)
3

julia> VT(2, 2)
4

julia> VT(3, 2)
ERROR: fidx oob
Stacktrace:
 [1] error(s::String)
   @ Base ./error.jl:35
 [2] macro expansion
   @ ./REPL[3]:4 [inlined]
 [3] (::VTable{Tuple{var"#2#4", var"#3#5"}})(fidx::Int64, args::Int64)
   @ Main ./REPL[3]:1
 [4] top-level scope
   @ REPL[7]:1

julia> @code_typed VT(3, 2)
CodeInfo(
1 ─ %1 = (fidx === 1)::Bool
└──      goto #3 if not %1
2 ─ %3 = Core.getfield(args, 1)::Int64
│   %4 = Base.add_int(%3, 1)::Int64
└──      return %4
3 ─ %6 = (fidx === 2)::Bool
└──      goto #5 if not %6
4 ─ %8 = Core.getfield(args, 1)::Int64
│   %9 = Base.add_int(%8, 2)::Int64
└──      return %9
5 ─      invoke Main.error("fidx oob"::String)::Union{}
└──      unreachable
) => Int64
```
Right now, there is a mess of `@generated` functions in `run/fractal_flame.jl`. In principle, these can be removed by finding some way to iterate through known values with `ntuple`, like: https://github.com/CliMA/ClimateMachine.jl/blob/2e0b6b7d97719e410d12a8596c98d5db7f891dbf/src/Numerics/DGMethods/remainder.jl#L510. I couldn't quite figure out how to go from the tuple of ints to an `ntuple` of `Val`s.
The core issue is that LLVM cannot optimize on the tuple of functions because each function is a unique type, so it cannot be stored as an ntuple to begin with.
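One possible workaround (my sketch, not something tried in Fable): recurse on `Base.tail` of the function tuple so that each comparison is against the constant index 1. Each recursive call sees a narrower tuple type, so the compiler can unroll the whole dispatch for small tuples, much like the `@nif` version above, without `@generated` or an `ntuple` of `Val`s.

```julia
# Sketch of an alternative to the @generated VTable: peel functions off the
# front of the tuple until fidx reaches 1, erroring on the empty tuple.
@inline call_nth(funcs::Tuple, fidx::Int, args...) =
    fidx == 1 ? first(funcs)(args...) :
                call_nth(Base.tail(funcs), fidx - 1, args...)

@inline call_nth(::Tuple{}, fidx::Int, args...) = error("fidx oob")

funcs = (x -> x + 1, x -> x + 2)
@assert call_nth(funcs, 1, 2) == 3
@assert call_nth(funcs, 2, 2) == 4
```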
This discussion was introduced in #64