FluxML / MLJFlux.jl

Wrapping deep learning models from the package Flux.jl for use in the MLJ.jl toolbox
http://fluxml.ai/MLJFlux.jl/
MIT License

`MultitargetNeuralNetworkRegressor` has surprising `target_scitype` #246

Closed: dpaetzel closed this issue 8 months ago

dpaetzel commented 8 months ago

Hi! Thank you for creating and maintaining this project!

I'm rather new to Julia, so this may be a dumb question (sorry in that case).

I currently have some issues getting `MultitargetNeuralNetworkRegressor` to work properly with a custom layer and loss function. While debugging that, I noticed that even without my custom layer, the `target_scitype` does not conform to the one given in the README:

```julia
using Flux
using MLJ
using MLJFlux

chain = Chain(Dense(3 => 10, relu), Dense(10 => 3, relu))
builder = MLJFlux.@builder chain

model = MultitargetNeuralNetworkRegressor(;
    builder=builder,
    loss=Flux.mse,
    epochs=2,
    batch_size=32,
)

target_scitype(model)
```

yields `Union{Table{<:AbstractVector{<:Continuous}}, AbstractMatrix{Continuous}}`.
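As a sanity check (a sketch assuming only that MLJ re-exports `scitype`, `Table`, and `Continuous` from ScientificTypes, which it does by default), concrete targets of both shapes do match this union:

```julia
using MLJ  # re-exports scitype, Table, Continuous

# A matrix of Float64 values has scitype AbstractMatrix{Continuous},
# matching the second branch of the union:
y_mat = rand(10, 3)
@assert scitype(y_mat) <: AbstractMatrix{Continuous}

# A named tuple of Float64 column vectors is a Tables.jl table whose
# scitype is Table{AbstractVector{Continuous}}, matching the first branch:
y_tbl = (a = rand(10), b = rand(10), c = rand(10))
@assert scitype(y_tbl) <: Table{<:AbstractVector{<:Continuous}}
```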

I'm a bit confused about the nesting of `Table` and `AbstractVector` here; am I doing something wrong, or is my mental model off?

dpaetzel commented 8 months ago

(I'm referring to this line in the README, based on which I'd expect just a `<: Table(Continuous)`.)

ablaom commented 8 months ago

This is not a bug, but your question is a good one. I think it is answered here. You can ignore the convention `C`, which in MLJ is always the default one.
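Concretely (a sketch, assuming the `Table` constructor from ScientificTypesBase that MLJ re-exports): `Table(Continuous)` in the README is a type *constructor*, and the `UnionAll` type it builds is exactly the first branch printed by `target_scitype`, so the two notations describe the same set of tables.

```julia
using MLJ  # re-exports Table and Continuous from ScientificTypes

# The README's Table(Continuous) constructs the type
# Table{<:AbstractVector{<:Continuous}} seen in target_scitype(model):
@assert Table(Continuous) == Table{<:AbstractVector{<:Continuous}}
```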

dpaetzel commented 8 months ago

I see, thank you! The link clarified it for me. :slightly_smiling_face: