00:00 Welcome!
00:31 Quick overview of Flux.jl
01:07 Quick glimpse of Flux.jl internals
01:30 Summer of Code with Flux.jl
01:57 Reading ONNX files with ONNX.jl
02:39 Image recognition models
02:57 Speech recognition model with CUDA
03:35 Demo of speech recognition
04:54 Reinforcement learning and AlphaGo.jl
05:10 Exporting Flux.jl to the browser with FluxJS.jl
07:03 Cheers for the Summer of Code participants
07:17 Plans for the future: ML for us is a compiler problem
07:41 Our main compiler problem: automatic differentiation (AD)
08:32 AD normally needs expression trees
09:18 Can we do better? Our answer: Zygote.jl
11:12 Taking derivatives and Julia IR (Intermediate Representation)
13:31 Benchmarking more complex examples
14:43 Speed, but at what cost?
16:20 Defining custom gradients
17:54 Convenient error messages
18:38 We have fully dynamic AD
19:35 In Julia we can just hack the compiler with different tricks when we need to
20:03 Demo of simple derivative
21:19 Question: what is that arrow?
21:55 Q&A: can you differentiate a scalar-to-scalar (number to number) function?
22:09 Comment: removing the stack in some cases
23:14 Q&A: what are the Zygote.jl limitations right now?
24:20 Q&A: what is the relation of Zygote.jl and Cassette.jl?
25:15 Q&A: can we use Zygote.jl to differentiate a function from one parameter to many parameters?
26:12 Announcement of Flux.jl at the hackathon
Flux: The Elegant Machine Learning Library | Mike Innes
Flux: The Elegant Machine Learning Library, https://www.youtube.com/watch?v=R81pmvTP_Ik
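The "Demo of simple derivative" chapter (20:03) can be sketched roughly like this; a minimal illustration assuming Zygote.jl's exported `gradient` function, not the exact code shown in the talk:

```julia
# Hedged sketch of a simple Zygote.jl derivative demo.
using Zygote

# An ordinary Julia function; Zygote differentiates it by
# transforming Julia IR directly, with no expression-tree tracing.
f(x) = 3x^2 + 2x + 1

# `gradient` returns a tuple of derivatives, one per argument.
df(x) = gradient(f, x)[1]   # analytically: 6x + 2

df(2.0)   # → 14.0
```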