FluxML / DataAugmentation.jl

Flexible data augmentation library for machine and deep learning
https://fluxml.ai/DataAugmentation.jl/dev/
MIT License

integration with Flux #56

Open CarloLucibello opened 2 years ago

CarloLucibello commented 2 years ago

It would be helpful to add to the documentation an example of integration of DataAugmentation.jl in a pure Flux pipeline, e.g. https://github.com/FluxML/model-zoo/blob/master/vision/vgg_cifar10/vgg_cifar10.jl

An alternative is to modify the model zoo example by adding data augmentation, which is quite standard on CIFAR10. I could do that if you can provide some pointers.

lorenzoh commented 2 years ago

The most flexible way to drop DataAugmentation.jl into any workflow is to write a function that augments a single image and use that function in the rest of the workflow. For example:

using DataAugmentation

function augmentimage(img, sz; augmentations = DataAugmentation.Identity())
    # Compose a random resized crop with any additional augmentations
    tfm = RandomResizeCrop(sz) |> augmentations
    # Apply the transform and unwrap the augmented image data
    return apply(tfm, Image(img)) |> itemdata
end

How to integrate this with the rest of the workflow depends on what other tools you're using.
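A minimal sketch of one such integration, assuming MLUtils.jl is used for batching; `images` and `labels` are placeholders for your dataset, and `augmentimage` is the function defined above:

```julia
# Hedged sketch: lazily augment each observation with MLUtils.jl before
# batching. `images` and `labels` are placeholders for your dataset.
using MLUtils: DataLoader, mapobs

# mapobs applies the augmentation per observation, each time a batch is drawn
augmented = mapobs(img -> augmentimage(img, (32, 32)), images)
loader = DataLoader((augmented, labels); batchsize = 64, shuffle = true)

for (xs, ys) in loader
    # xs is a batch of augmented images; feed it to the Flux model here
end
```

Because `mapobs` is lazy, a fresh random crop is drawn every epoch rather than baked into the dataset once.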

Hope this helps, let me know if you need any other pointers.

adrhill commented 7 months ago

As far as I can tell, outputs of DataAugmentation.jl are in Images.jl's HWC (height, width, color-channels) format, whereas Flux generally expects WHCN, e.g. in the Conv docstring:

Image data should be stored in WHCN order (width, height, channels, batch).

The pretrained models in Metalhead.jl also require inputs in WHCN format.

It would therefore be nice to have Transformations that:

  1. permute the height and width dimensions
  2. add a batch dimension
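Until such transformations exist, both steps can be sketched in plain Julia, assuming the augmented output has already been converted to a numeric H×W×C array (the helper name `hwc_to_whcn` is made up for illustration):

```julia
# Sketch: convert a single H×W×C image array into Flux's W×H×C×N layout.
function hwc_to_whcn(img::AbstractArray{T,3}) where T
    whc = permutedims(img, (2, 1, 3))   # 1. swap height and width -> W×H×C
    return reshape(whc, size(whc)..., 1) # 2. append a singleton batch dim -> W×H×C×1
end

img = rand(Float32, 32, 48, 3)  # H=32, W=48, C=3
x = hwc_to_whcn(img)
size(x)                         # (48, 32, 3, 1)
```

For real batches the singleton batch dimension would be filled by stacking several images, e.g. with `cat(xs...; dims = 4)`.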