SciML / DiffEqFlux.jl

Pre-built implicit layer architectures with O(1) backprop, GPUs, and stiff+non-stiff DE solvers, demonstrating scientific machine learning (SciML) and physics-informed machine learning methods
https://docs.sciml.ai/DiffEqFlux/stable
MIT License

LV-Univ example code doesn't match intro text #262

Closed: metanoid closed this issue 4 years ago

metanoid commented 4 years ago

On the documentation page LV-Univ.md, the introductory text states:

Here's an example of doing this with both reverse-mode autodifferentiation and with adjoints.

However, in the code that follows, neither reverse-mode autodifferentiation nor adjoints is explicitly mentioned again.

The text makes it sound like the same example will be run twice with these two different methods, which would be valuable tutorial material. It seems the code and the text are out of sync.

ChrisRackauckas commented 4 years ago

Thanks! Yes, that sentence was a leftover from an early version of the example. We used to use this tutorial to explain how to choose different differentiation methods and make them all work on this kind of problem, but that choice is now much more automatic and there are other examples covering sensealg, so this one was refocused on its core. The sentence has been removed.
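For context on what "choosing different differentiation methods" means here, the sketch below shows the general sensealg pattern referred to in the reply. It is not the LV-Univ example itself; the Lotka-Volterra setup, the loss, and the specific InterpolatingAdjoint / ReverseDiffAdjoint choices are illustrative assumptions, and the package names reflect the current SciMLSensitivity-based ecosystem.

using OrdinaryDiffEq, SciMLSensitivity, Zygote

# A plain Lotka-Volterra ODE used only as a stand-in for the tutorial's model.
function lotka_volterra!(du, u, p, t)
    du[1] = p[1] * u[1] - p[2] * u[1] * u[2]
    du[2] = -p[3] * u[2] + p[4] * u[1] * u[2]
end

u0 = [1.0, 1.0]
p = [1.5, 1.0, 3.0, 1.0]
prob = ODEProblem(lotka_volterra!, u0, (0.0, 10.0), p)

# Sum-of-squares loss over the trajectory; the sensealg keyword controls how
# the gradient through `solve` is computed.
function loss(p, sensealg)
    sol = solve(prob, Tsit5(); p = p, saveat = 0.1, sensealg = sensealg)
    sum(abs2, Array(sol))
end

# Continuous adjoint sensitivities vs. discrete reverse-mode AD through the
# solver steps: both give a gradient with respect to p, with different
# speed/memory trade-offs. If sensealg is omitted, a default is picked
# automatically, which is the "a lot more automatic" behavior mentioned above.
g_adjoint = Zygote.gradient(q -> loss(q, InterpolatingAdjoint()), p)[1]
g_reverse = Zygote.gradient(q -> loss(q, ReverseDiffAdjoint()), p)[1]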