FluxML / Flux.jl

Relax! Flux is the ML library that doesn't make you tensor
https://fluxml.ai/

Conv with circular padding #1917

Open cossio opened 2 years ago

cossio commented 2 years ago

PyTorch convolution layers admit different modes of padding (https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html#torch.nn.Conv1d): circular (i.e. periodic padding), reflect, replicate and zeros.

As far as I understand, Flux (and NNlib.conv) only supports zero padding.

It'd be nice to have the other forms of padding too.

ToucheSir commented 2 years ago

See https://github.com/FluxML/NNlib.jl/blob/master/src/padding.jl. Step 1 is to add the new padding routine there.

cossio commented 2 years ago

What's the API for using, e.g., pad_reflect as defined there from a Conv layer?

ToucheSir commented 2 years ago

There is none; figuring out how to integrate the two is an open design challenge that nobody has stepped up to tackle yet. In the meantime, you can pad before the conv layer as a workaround.
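
For concreteness, here is a minimal sketch of that workaround, assuming NNlib's exported padding helpers (pad_reflect lives in the padding.jl file linked above; swap in pad_circular for periodic padding if your NNlib version provides it). Input layout is (width, channels, batch):

using Flux, NNlib

conv = Conv((3,), 1 => 4; pad = 0)    # no built-in zero padding
x = rand(Float32, 8, 1, 2)            # width = 8, 1 channel, batch of 2
xp = NNlib.pad_reflect(x, (1, 1))     # pads the first (spatial) dim by 1 on each side
y = conv(xp)                          # size(y) == (8, 4, 2)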

terasakisatoshi commented 2 years ago

I actually implemented circular padding functions in a private project. Since it is now publicly available, I'm happy to share my code. See:

https://github.com/AtelierArith/GomalizingFlow.jl/blob/main/src/utils.jl
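
For reference, the core idea in one spatial dimension is just wrapping slices around the spatial axis. A minimal sketch (the linked GomalizingFlow.jl code handles the general case; circular_pad1 is an illustrative name):

# Periodic (circular) padding along the first dimension of a (width, channels, batch) array.
function circular_pad1(x::AbstractArray{T,3}, p::Int) where T
    cat(x[end-p+1:end, :, :], x, x[1:p, :, :]; dims = 1)
end

circular_pad1(reshape(Float32.(1:6), 6, 1, 1), 2)   # spatial values: 5 6 1 2 3 4 5 6 1 2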

DhairyaLGandhi commented 2 years ago

A simple way forward is to allow the pad kwarg to accept functions. On top of that, define lazy versions of the padding functions that accept the "config" (dims etc.). Then we simply run the padding function to get whatever padding we want.

Conv(..., pad = pad_constant(dims = (1, 2)))
# where
pad_constant(; kw...) = x -> pad_constant(x; kw...)
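
To make that concrete, here is a hypothetical sketch of how the layer could consume a callable pad (this is not existing Flux behaviour; run_pad is an illustrative name):

run_pad(pad::Function, x) = pad(x)   # lazy padding closure: just call it on the input
run_pad(pad, x) = x                  # integer/tuple pads keep the current zero-padding path
# The conv forward pass would then call run_pad(m.pad, x) before the underlying
# NNlib.conv, and pass pad = 0 to conv whenever m.pad is a Function.
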
DhairyaLGandhi commented 2 years ago

But yeah, the cleanest thing to do is to add the padding as a layer before the conv.
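
For example, a minimal sketch of such a padding layer (a hypothetical struct, not part of Flux), wrapping NNlib.pad_circular, assuming your NNlib version provides it; any of the pad_* helpers fits the same slot:

using Flux, NNlib

struct CircularPad
    pad::NTuple{2,Int}
end
(p::CircularPad)(x) = NNlib.pad_circular(x, p.pad)   # periodic padding on the first spatial dim

model = Chain(CircularPad((1, 1)), Conv((3,), 1 => 4; pad = 0))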