Closed odow closed 2 months ago
I think this is very natural. Here you could use `config` to convert certain types of activation functions to reduced space, e.g., `config = Dict(Flux.relu => MathOptAI.ReducedSpace(MathOptAI.ReLU()))`. This may cover a large portion of what people want from "hybrid" formulations, and they can do the rest manually if they want something more custom.
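A minimal sketch of how such a `config` lookup could behave when translating layers; the helper names `_activation_predictor` and `_default_predictor` are assumptions for illustration, not the library's API:

```julia
# Assumed sketch: map a Flux activation function to a MathOptAI formulation.
# A user-supplied entry such as
#     Flux.relu => MathOptAI.ReducedSpace(MathOptAI.ReLU())
# would make every relu layer use the reduced-space formulation, while
# activations not listed in `config` fall back to a default.
config = Dict(Flux.relu => MathOptAI.ReducedSpace(MathOptAI.ReLU()))

function _activation_predictor(activation, config::Dict)
    # Prefer the user's choice; otherwise use the default formulation.
    # `_default_predictor` is a hypothetical fallback, named here only
    # to show where the dispatch would happen.
    return get(config, activation, _default_predictor(activation))
end
```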
Yes, exactly.
One question is whether we want `MathOptAI.build_predictor(extension; kwargs...)` or to explicitly construct the built-in layers, like `MathOptAI.Pipeline(extension)`.
I'm leaning towards the latter, even though it makes the documentation a bit more verbose, because you'll need to know what each extension targets. It also means that we could have multiple ways of formulating the same extension.
Not sure I follow. What would `build_predictor` do?
Return an `AbstractPredictor` of the implementation's choice. The upside is that there would be one way to call things; the downside is that there is less control over what gets returned.
For example:
""" function MathOptAI.Pipeline( predictor::Flux.Chain; config::Dict = Dict{Any,Any}(), ) inner_predictor = MathOptAI.Pipeline(MathOptAI.AbstractPredictor[]) for layer in predictor.layers _add_predictor(inner_predictor, layer, config) end return inner_predictor end