FluxML / DataAugmentation.jl

Flexible data augmentation library for machine and deep learning
https://fluxml.ai/DataAugmentation.jl/dev/
MIT License

Support for GPU-accelerated affine transformations #48

Open lorenzoh opened 3 years ago

lorenzoh commented 3 years ago

Affine transformations on the CPU are already pretty fast, but GPU-accelerated transformations are useful in CPU-constrained environments like Google Colab, where a GPU is available but only two mediocre CPU cores, so the data pipeline can become a bottleneck.

Some open questions:

Can someone who has experience with image transformations on the GPU chime in?

@jsamaroo

ToucheSir commented 2 years ago

Think you meant to tag @jpsamaroo? Also, https://github.com/JuliaImages/ImageTransformations.jl/pull/156 looks promising here!

lorenzoh commented 2 years ago

Think you meant to tag @jpsamaroo?

Yup 😅

jpsamaroo commented 2 years ago

Writing image transformations with KernelAbstractions.jl is probably the best approach, since you get CPU+CUDA+AMDGPU from just a single kernel (although you might want to write the CPU kernels differently for better performance). That sort of code should probably go in ImageTransformations.jl in some shape or form.
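For reference, a minimal, backend-agnostic sketch of what such a kernel could look like (the names `affine_warp!`/`affine_warp` are hypothetical, not part of DataAugmentation.jl or ImageTransformations.jl; this assumes the KernelAbstractions ≥ 0.9 launch API, an inverse affine map given as a static 2×2 matrix `A` plus offset `b`, and nearest-neighbour sampling for brevity):

```julia
using KernelAbstractions
using StaticArrays

# Inverse warp: for each output pixel, sample the source image at A*p + b.
@kernel function affine_warp!(dst, @Const(src), A, b)
    I = @index(Global, Cartesian)
    p = A * SVector(Float32(I[1]), Float32(I[2])) + b
    i = round(Int, p[1])
    j = round(Int, p[2])
    if checkbounds(Bool, src, i, j)
        @inbounds dst[I] = src[i, j]
    end
end

# Launch on whatever backend `src` lives on (CPU array, CuArray, ROCArray, ...).
function affine_warp(src, A, b)
    dst = similar(src)
    fill!(dst, zero(eltype(src)))
    backend = get_backend(src)
    affine_warp!(backend)(dst, src, A, b; ndrange = size(dst))
    KernelAbstractions.synchronize(backend)
    return dst
end
```

Usage would then be the same on CPU and GPU, e.g. `affine_warp(img, @SMatrix(Float32[cos(0.3) -sin(0.3); sin(0.3) cos(0.3)]), @SVector(Float32[0, 0]))` with `img` either a plain `Array` or a `CuArray`. A real implementation would add interpolation and boundary handling, and, as noted above, might keep a separately tuned CPU path.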