gdalle closed this issue 1 month ago.
Thanks for the experience.
I'm still traveling but I hope to look into what you did afterwards, no promises though. See you likely at the next JuliaCon (global).
Alain Marcotte
Above all, respect: for oneself, for others, for the environment
On Wed, Jul 17, 2024 at 10:14 AM, Guillaume Dalle @.***> wrote:
Closed #346 https://github.com/gdalle/DifferentiationInterface.jl/issues/346 as completed via #352 https://github.com/gdalle/DifferentiationInterface.jl/pull/352.
Goal
High-level: Support structures beyond arrays and numbers in DifferentiationInterface.
Low-level: Write tests for taking gradients of neural networks with Flux and Lux.
Steps
- In `DifferentiationInterfaceTest/src/scenarios`, create a file called `flux.jl` containing a `GradientScenario` involving a very simple neural network built with Flux.jl, for instance the one in this tutorial. Differentiate `layer(input)` with respect to `layer`. !!! In other words, for your `GradientScenario`, you will have `f(layer) = layer(fixed_input)` as the function (it only applies the layer to a fixed input).
- In `DifferentiationInterface/test/Single/Zygote`, create a file called `flux.jl` and test your scenario with `DifferentiationInterfaceTest.test_differentiation`. Take inspiration from the other test files.
- Since `layer` is not an array, the returned type will not be an array either: the gradient will be some form of Flux layer as well (I think), so you probably want to compute the ground truth with Zygote at first to see how it is structured.

If you need help
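As a sketch of the ground-truth step above, here is one way to inspect the gradient structure with Zygote. The single `Dense` layer, the `fixed_input` values, and the `sum` reduction (needed so the differentiated function returns a scalar) are illustrative choices, not prescribed by the issue:

```julia
using Flux, Zygote

# A very simple network: a single Dense layer, as in the Flux tutorials
layer = Dense(2 => 1)
fixed_input = Float32[1.0, 2.0]

# f differentiates with respect to the *layer*, not the input;
# sum reduces the output to a scalar so a gradient is well-defined
f(l) = sum(l(fixed_input))

grads = Zygote.gradient(f, layer)
# grads[1] is not an array: it is a NamedTuple mirroring the layer's fields,
# something like (weight = ..., bias = ..., σ = nothing)
```

Printing `grads[1]` shows how Zygote represents structured gradients, which is exactly the information needed to build the reference value for the scenario.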
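A hypothetical sketch of the testing step. The `GradientScenario` keyword names used below (`x`, `y`, `grad`, `nb_args`) are assumptions about the DifferentiationInterfaceTest constructor and should be checked against its documentation and the existing test files:

```julia
using DifferentiationInterface, DifferentiationInterfaceTest
using Flux, Zygote

layer = Dense(2 => 1)
fixed_input = Float32[1.0, 2.0]
f(l) = sum(l(fixed_input))

# Ground truth computed with Zygote, as the issue suggests
true_grad = Zygote.gradient(f, layer)[1]

# Hypothetical constructor call: the actual keyword names may differ
scen = GradientScenario(f; x=layer, y=f(layer), grad=true_grad, nb_args=1)

# Run the standard test suite for this scenario with the Zygote backend
test_differentiation(AutoZygote(), [scen])
```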