lanl-ansi / MathOptAI.jl

Embed trained machine learning predictors in JuMP
https://lanl-ansi.github.io/MathOptAI.jl/

Add a way to return predictor from extensions #74

Closed: odow closed this issue 2 months ago

odow commented 2 months ago

For example:

"""
    MathOptAI.Pipeline(
        predictor::Flux.Chain;
        config::Dict = Dict{Any,Any}(),
    )

Convert a trained neural network from Flux.jl to a [`Pipeline`](@ref).

## Supported layers

 * `Flux.Dense`
 * `Flux.softmax`

## Supported activation functions

 * `Flux.relu`
 * `Flux.sigmoid`
 * `Flux.softplus`
 * `Flux.tanh`

## Keyword arguments

 * `config`: a dictionary that maps `Flux` activation functions to an
   [`AbstractPredictor`](@ref) to control how the activation functions are
   reformulated.

## Example

```jldoctest
julia> using Flux, MathOptAI

julia> chain = Flux.Chain(Flux.Dense(1 => 16, Flux.relu), Flux.Dense(16 => 1));

julia> MathOptAI.Pipeline(chain; config = Dict(Flux.relu => MathOptAI.ReLU()));
```
"""
function MathOptAI.Pipeline(
    predictor::Flux.Chain;
    config::Dict = Dict{Any,Any}(),
)
    inner_predictor = MathOptAI.Pipeline(MathOptAI.AbstractPredictor[])
    for layer in predictor.layers
        _add_predictor(inner_predictor, layer, config)
    end
    return inner_predictor
end

Robbybp commented 2 months ago

I think this is very natural. Here you could use `config` to convert certain types of activation functions to a reduced-space formulation, i.e., `config = Dict(Flux.relu => MathOptAI.ReducedSpace(MathOptAI.ReLU()))`. This may cover a large portion of what people want in terms of "hybrid" formulations, and they can do the rest manually if they want something more custom.
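A minimal sketch of that call, assuming the `Pipeline(::Flux.Chain; config)` constructor proposed above and that `MathOptAI.ReducedSpace` wraps another predictor (this is an illustration, not a confirmed API):

```julia
using Flux, MathOptAI

chain = Flux.Chain(Flux.Dense(1 => 16, Flux.relu), Flux.Dense(16 => 1))

# Hypothetical usage: reformulate every `Flux.relu` activation in reduced
# space, and leave all other layers at their default formulation.
predictor = MathOptAI.Pipeline(
    chain;
    config = Dict(Flux.relu => MathOptAI.ReducedSpace(MathOptAI.ReLU())),
)
```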

odow commented 2 months ago

Yes, exactly.

One question is whether we want MathOptAI.build_predictor(extension; kwargs...) or to explicitly construct the built-in layers like MathOptAI.Pipeline(extension).

I'm leaning towards the latter, even though it makes the documentation a bit more verbose, because you'll need to know what the extension is targeting. But it also means that we could have multiple ways of formulating the same extension.

Robbybp commented 2 months ago

Not sure I follow. What would build_predictor do?

odow commented 2 months ago

Return an `AbstractPredictor` of the implementation's choice. The upside is that there would be one way to call things. The downside is that there is less control over what gets returned.
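The trade-off between the two designs can be sketched with self-contained mock types (these are stand-ins, not the real MathOptAI or Flux APIs):

```julia
abstract type AbstractPredictor end

struct ReLU <: AbstractPredictor end

struct Pipeline <: AbstractPredictor
    layers::Vector{AbstractPredictor}
end

# Stand-in for an extension type like `Flux.Chain`.
struct MockChain
    n_layers::Int
end

# Option 1: a single generic entry point. The extension decides which
# predictor to return; callers never need to know the target type, but they
# also cannot pick a different formulation.
build_predictor(chain::MockChain; kwargs...) =
    Pipeline([ReLU() for _ in 1:chain.n_layers])

# Option 2: explicitly construct a built-in predictor from the extension
# type. More verbose to document, but the same `MockChain` could also have,
# say, a `GrayBox(::MockChain)` method, so one extension can support several
# formulations.
Pipeline(chain::MockChain; kwargs...) =
    Pipeline([ReLU() for _ in 1:chain.n_layers])

chain = MockChain(2)
p1 = build_predictor(chain)  # generic entry point
p2 = Pipeline(chain)         # explicit construction
```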