JuliaML / MLUtils.jl

Utilities and abstractions for Machine Learning tasks
MIT License

Accommodate `Base.stack` #119

Closed by ToucheSir 1 year ago

ToucheSir commented 2 years ago

https://github.com/JuliaLang/julia/pull/43334 recently landed, so using MLUtils now causes issues downstream in Flux (its exported stack clashes with the new Base.stack). Not sure which level of the stack is better to address this in.

mcabbott commented 2 years ago

Can this simply be deleted, with using Compat added for old Julia versions?

The Base one covers more cases, but with dims they seem to agree:

julia> mm = [rand(1:99, 2, 3) for _ in 1:4, _ in 1:5];

julia> Base.stack(mm) |> size
(2, 3, 4, 5)

julia> Flux.MLUtils.stack(mm) |> size
ERROR: UndefKeywordError: keyword argument dims not assigned

julia> Base.stack(mm; dims=2) |> size
(2, 20, 3)

julia> Flux.MLUtils.stack(mm; dims=2) |> size
(2, 20, 3)

Edit: maybe this is the problem:

julia> Flux.MLUtils.stack(mm; dims=5) |> size
(2, 3, 1, 1, 20)

julia> Base.stack(mm; dims=5) |> size
ERROR: ArgumentError: cannot stack slices ndims(x) = 2 along dims = 5

Base.stack disallows that in the name of type stability. (That restriction could perhaps be relaxed if someone finds a way, e.g. by encouraging constant propagation, or by accepting dims = Val(5) the way cat does.)

One path for now would be to define a new, much simpler MLUtils.stack, which is just reshape(Base.stack(x; dims), ...) so that other dims are still allowed, e.g. something like the sketch below. That ought not to be breaking.
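
A rough sketch of what I mean (the name stack_anydims and the reshape details are only illustrative, not a proposal for the MLUtils API; it assumes Base.stack from Julia 1.9 or Compat):

# Hypothetical sketch: keep the old behaviour for dims > ndims(slice) + 1
# by stacking with Base.stack and padding singleton dimensions before the batch dimension.
function stack_anydims(xs; dims::Int)
    nd = ndims(first(xs))
    dims <= nd + 1 && return Base.stack(xs; dims)   # Base.stack handles these cases directly
    y = Base.stack(xs; dims = nd + 1)               # slice dims first, batch dimension last
    pad = ntuple(_ -> 1, dims - nd - 1)             # singleton dims between slices and batch
    return reshape(y, (size(y)[1:nd]..., pad..., size(y, nd + 1)))
end

stack_anydims(mm; dims = 5) |> size   # (2, 3, 1, 1, 20), same shape as MLUtils.stack above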

mcabbott commented 1 year ago

This package currently exports stack. That can't remain once Julia 1.9 exports stack from Base. Is it breaking to remove it?

Or is exporting Compat.stack a close enough substitute? I see now that, in addition to the method above with keyword dims=2, you can write Flux.MLUtils.stack(mm, 2) with the same meaning. That conflicts with Compat.stack(f, xs).
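
For comparison, the Compat/Base form takes a function as its positional argument (shown only to make the clash concrete):

julia> Base.stack(x -> x .^ 2, [1:2, 3:4]) |> size   # first positional argument is a function, not dims
(2, 2)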

mcabbott commented 1 year ago

BTW, Base.stack also matches this method of batch:

julia> batch([1:2, 3:4, 5:6])
2×3 Matrix{Int64}:
 1  3  5
 2  4  6

which is itself a little strange. Wouldn't it be more consistent for that to match the other methods of batch, which invert the nesting of the containers, like so?

julia> batch([(1,2), (3,4), (5,6)])
([1, 3, 5], [2, 4, 6])

julia> batch([Dict(:a=>1,:b=>2), Dict(:a=>3,:b=>4), Dict(:a=>5, :b=>6)])
Dict{Symbol, Vector{Int64}} with 2 entries:
  :a => [1, 3, 5]
  :b => [2, 4, 6]

So perhaps that's another candidate for related breaking changes. And maybe this shouldn't be called batch either? It's similar to this: https://github.com/JuliaData/SplitApplyCombine.jl#inverta
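
For reference, SplitApplyCombine.invert is the container-swapping operation I mean (a sketch assuming the behaviour described in that README):

using SplitApplyCombine

invert([(1, 2), (3, 4), (5, 6)])   # expected ([1, 3, 5], [2, 4, 6]), like the Tuple method of batch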

CarloLucibello commented 1 year ago

So perhaps that's another candidate for related breaking changes.

In some sense batch is the inverse of getobs, and its behavior is consistent with that:

using MLUtils   # batch, getobs, numobs (and unbatch) all come from MLUtils

function roundtrip(xs)
  y = batch(xs)
  xs2 = [getobs(y, i) for i in 1:numobs(y)]    # equivalent to unbatch(y)
  @assert xs == xs2
end

roundtrip([1:2, 3:4, 5:6])
roundtrip([(1,2), (3,4), (5,6)])
roundtrip([Dict(:a=>1,:b=>2), Dict(:a=>3,:b=>4), Dict(:a=>5, :b=>6)])

So we should keep things as they are.

mcabbott commented 1 year ago

Ok. If the intention is to be this inverse, can this be clearly documented somewhere? All I can see is

https://juliaml.github.io/MLUtils.jl/dev/api/#MLUtils.batch

I also think that what exactly getobs does should be explained somewhere prominent. Its docstring seems very close to circular,

"Note that idx can be any type as long as data has defined getobs for that type"

or else it explains what you hope for when someone extends it,

"observation(s) should be in the form intended to be passed as-is to some learning algorithm"

without first saying clearly what it actually does on Base types like arrays, tuples, etc.
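
For example, something as simple as this, stating up front how it behaves on plain arrays and tuples (my summary of the current behaviour, not an official docstring):

using MLUtils

X = reshape(collect(1:6), 2, 3)    # observations are stored along the last dimension
getobs(X, 2)                       # [3, 4], the second column
numobs(X)                          # 3

# On tuples (and named tuples / dicts of arrays) it indexes each component in sync:
getobs((X, [10, 20, 30]), 2)       # ([3, 4], 20)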

mcabbott commented 1 year ago

If making another breaking change soon, then why isn't unstack just eachslice? Could that be deprecated in 3 and removed in 4?

Or why would you call that rather than unbatch? I guess that's another place where it seems confusing to have so many nearby functions with different names.
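
To make the comparison concrete, this is roughly how I understand the current functions to relate (the expected results are my reading, not tested output):

using MLUtils

x = rand(2, 3, 4)
unstack(x; dims = 3) == collect(eachslice(x; dims = 3))   # expected true: the same slices
unbatch(x) == unstack(x; dims = ndims(x))                 # expected true: unbatch splits the last dimension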