locuslab / torchdeq

Modern Fixed Point Systems using PyTorch
MIT License

What is the backward pass in DEQ? Why do we use a fixed-point solver here? #5

Closed: BraveDrXuTF closed this issue 6 months ago

BraveDrXuTF commented 6 months ago

Among the parameters accepted by the get_deq function, I see the backward-pass options b_solver and b_max_iter. What are they used for? As far as I know, the backward pass refers to backpropagation, which does not need a fixed-point solver...

BurgerAndreas commented 6 months ago

Have a look at the tutorial Colab notebook. It explains the different ways to compute gradients quite nicely.
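In short: a DEQ layer's output z* satisfies z* = f(z*, x), so by the implicit function theorem dL/dx = u ∂f/∂x, where u solves u = dL/dz* + u ∂f/∂z. That equation for u is itself a fixed point, and iterating it is what b_solver does, for up to b_max_iter steps. Here is a minimal sketch of the idea with plain autograd; the toy map `f` and all names here are hypothetical illustrations, not torchdeq's API:

```python
import torch

def f(z, x):
    # Toy contractive map standing in for a DEQ layer.
    # |df/dz| <= 0.5 < 1, so both fixed-point iterations converge.
    return torch.tanh(0.5 * z + x)

def solve_forward(x, z0, max_iter=50):
    # Forward pass: naive fixed-point iteration z <- f(z, x).
    z = z0
    for _ in range(max_iter):
        z = f(z, x)
    return z

x = torch.randn(4, requires_grad=True)
z_star = solve_forward(x, torch.zeros(4))

# Re-attach z* to the graph for a single step of f, so autograd
# can form vector-Jacobian products with df/dz and df/dx.
z_star = z_star.detach().requires_grad_()
f_val = f(z_star, x)

# Backward pass as its own fixed-point iteration:
#   u <- grad_out + u @ (df/dz)
# (this is the role of b_solver / b_max_iter in torchdeq).
grad_out = torch.ones(4)  # dL/dz* for the loss L = z*.sum()
u = grad_out.clone()
for _ in range(50):
    vjp = torch.autograd.grad(f_val, z_star, u, retain_graph=True)[0]
    u = grad_out + vjp

# Final gradient: dL/dx = u @ (df/dx).
grad_x = torch.autograd.grad(f_val, x, u, retain_graph=True)[0]
```

The payoff is that memory cost is constant in the number of forward solver iterations: instead of backpropagating through the whole unrolled solve, you only differentiate through a single application of f and run a second, cheap fixed-point iteration for the gradient.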

BraveDrXuTF commented 6 months ago

Thank you, I think I got it.