DiffEqML / torchdyn

A PyTorch library entirely dedicated to neural differential equations, implicit models and related numerical methods
https://torchdyn.org
Apache License 2.0

What is the intuition behind setting parameters in s_span? #64

Closed · mrpositron closed this issue 3 years ago

mrpositron commented 3 years ago

Additional Description

I have been using Neural ODEs recently for a particular deep learning task. In this task, I have to do 17 function evaluations, i.e. s_span = torch.linspace(0, x, 17). However, I found that the choice of x in s_span affects training: when x is large, the model learns faster. Could you explain the intuition behind setting x in s_span?
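
For concreteness, this is roughly the comparison I am making (a sketch; only the endpoint x differs, the number of evaluation points stays at 17):

```python
import torch

# Both spans contain 17 evaluation points, so a fixed-step solver performs
# the same number of function evaluations; only the integration length
# (and therefore the step size) differs.
s_span_short = torch.linspace(0, 1, 17)  # x = 1 -> step size 1/16 = 0.0625
s_span_long  = torch.linspace(0, 5, 17)  # x = 5 -> step size 5/16 = 0.3125

print(s_span_short[1] - s_span_short[0])  # tensor(0.0625)
print(s_span_long[1] - s_span_long[0])    # tensor(0.3125)
```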

Example:

[training curve plot]

In the plot above, blue is the run with x = 1 and brown is the run with x = 5.

Thanks!

mrpositron commented 3 years ago

We perform the same number of function evaluations. Thus, I don't get how x affects the training.

massastrello commented 3 years ago

@MrPositron Although it is not possible to answer your question concisely, I would say that the phenomenon you are experiencing is related to the following model choices (which you did not specify):

  1. Vector field type: if your vector field explicitly depends on depth/time (i.e. f = f(s, z)), then different choices of s_span during training will likely lead to very different vector fields and thus different training dynamics (see the sketch after this list).
  2. ODE solver and backprop sensitivity: with a fixed-step solver, the same number of evaluation points spread over a longer interval also means a larger step size, which affects both the forward solution and the gradients.
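
To illustrate point 1, here is a minimal sketch (not torchdyn-specific; the module and dimensions are made up) of a depth-dependent vector field f(s, z). Rescaling s_span changes the region of depth values the field is trained on:

```python
import torch
import torch.nn as nn

# Illustrative depth-dependent vector field f(s, z): the depth variable s
# is concatenated to the state z before the network is applied.
class DepthDependentField(nn.Module):
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim + 1, hidden), nn.Tanh(),
            nn.Linear(hidden, dim),
        )

    def forward(self, s, z):
        s_col = s * torch.ones(z.shape[0], 1)      # broadcast depth to the batch
        return self.net(torch.cat([z, s_col], dim=-1))

f = DepthDependentField(dim=8)
z = torch.randn(4, 8)
# With s_span ending at 1 the field is only ever queried for s in [0, 1];
# with s_span ending at 5 it is queried over [0, 5], so training shapes the
# network over a different region of (s, z) space.
out_short = f(torch.tensor(1.0), z)
out_long  = f(torch.tensor(5.0), z)
```
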
mrpositron commented 3 years ago

Thanks for your reply!

I am sorry, I did not specify my model choice. My model consists of a GRU encoder and a NODE decoder (i.e. RNN-NODE). Here is the chunk of my code that defines the Neural ODE:

self.ode_solve = NeuralDE(self.func, sensitivity = 'autograd', solver = 'rk4', s_span = self.span)

During the forward pass I take the values of the trajectory returned by self.ode_solve (trajectory = self.ode_solve.trajectory(hidden, self.span)) and pass them to a fully connected layer.
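
Roughly, the pieces fit together like this (a sketch only; the dimensions, the self.func architecture, and the output layer are placeholders, and the import path assumes the torchdyn version used at the time):

```python
import torch
import torch.nn as nn
from torchdyn.models import NeuralDE  # import path for the torchdyn version in this thread

class RNNNode(nn.Module):
    def __init__(self, input_dim, hidden_dim, output_dim, span):
        super().__init__()
        self.span = span  # e.g. torch.linspace(0, x, 17)
        self.encoder = nn.GRU(input_dim, hidden_dim, batch_first=True)
        # Placeholder autonomous vector field; the real one may differ.
        self.func = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim), nn.Tanh(),
            nn.Linear(hidden_dim, hidden_dim),
        )
        self.ode_solve = NeuralDE(self.func, sensitivity='autograd',
                                  solver='rk4', s_span=self.span)
        self.fc = nn.Linear(hidden_dim, output_dim)

    def forward(self, x):
        _, h = self.encoder(x)                 # h: (1, batch, hidden_dim)
        hidden = h.squeeze(0)
        # Trajectory over the points of s_span: (len(span), batch, hidden_dim)
        trajectory = self.ode_solve.trajectory(hidden, self.span)
        return self.fc(trajectory)
```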