hoon0528 opened 3 years ago
Hi, I may have the same problem. I am looking for a way to predict new, unseen trajectory steps, but it seems that something is missing.
Briefly: given x1, x2, ..., xT with estimated edges, I would like to predict xT+1, ..., xT+N.
As in Eq. 12, x(t+1) follows a normal distribution with mean mu(t+1). Did you decide not to draw the value of x(t+1) from this distribution? Since the most probable value is mu(t+1), did you simply set x(t+1) = mu(t+1)? In that case Eq. 12 can be skipped.
Concerning prediction on unseen trajectories, do we have to call the MLPDecoder repeatedly to iteratively predict each x(t+k) from the previous prediction? From my point of view, that seems to be the only solution. The idea here is to reproduce the non-transparent lines in Figure 9.
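For what it's worth, the iterative scheme I have in mind looks roughly like the sketch below: call the decoder once per step and feed each predicted mean back in as the next input, i.e. set x(t+1) = mu(t+1). The `decoder_step` function, the weight matrix `W`, and the toy shapes are placeholders of mine, not the actual MLPDecoder from the repo:

```python
import numpy as np

def decoder_step(x_t, edges, W):
    """Hypothetical one-step decoder: returns mu(t+1) given x(t).
    Stands in for a single MLPDecoder update; the linear message
    passing here is a placeholder, not the paper's architecture."""
    msgs = edges @ x_t          # aggregate messages along estimated edges
    return x_t + msgs @ W       # residual node update, Eq. 10/11 style

def rollout(x_T, edges, W, n_steps):
    """Autoregressively predict x(T+1), ..., x(T+N) by reusing each
    predicted mean as the next input (no sampling from Eq. 12)."""
    preds = []
    x = x_T
    for _ in range(n_steps):
        x = decoder_step(x, edges, W)   # mu(t+1) becomes the next x(t)
        preds.append(x)
    return np.stack(preds)

# toy example: 3 nodes, 2 features per node, fixed edge matrix
x_T = np.zeros((3, 2))
edges = np.eye(3)
W = np.zeros((2, 2))
preds = rollout(x_T, edges, W, n_steps=5)
print(preds.shape)  # (5, 3, 2): one predicted state per future step
```

This is only meant to illustrate the control flow; in the actual code the per-step prediction would come from the trained decoder's forward pass.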
Thanks a lot for helping.
Hi @djoad001, what did you end up doing? I also don't see how to predict iteratively over multiple timesteps when I don't have access to the ground truth (and thus cannot do teacher forcing every N timesteps). So I guess the solution is to modify the code to account for being in test mode and use the last mu (or draw a sample from the distribution) to predict the next step, iterating until we get the desired number of timesteps?
Thanks for your help.
Below is the code snippet of MLPDecoder. I think the prediction ends with Eq. 11 in the paper; I can't find the code for Eq. 12. Am I missing something in this code?
Thanks in advance.
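In case it helps clarify the question: drawing x(t+1) from Eq. 12 would just mean sampling from N(mu(t+1), sigma^2 I), as sketched below. As far as I can tell, the released code never performs this sampling; the fixed variance only appears in the Gaussian NLL loss, and the predictions simply reuse mu(t+1). The function name and shapes here are mine, for illustration only:

```python
import numpy as np

def sample_eq12(mu_next, sigma, rng):
    """What drawing x(t+1) from Eq. 12 would look like:
    x(t+1) ~ N(mu(t+1), sigma^2 I) with a fixed scalar variance.
    Hypothetical helper; this step does not appear in the repo."""
    eps = rng.standard_normal(mu_next.shape)
    return mu_next + sigma * eps

rng = np.random.default_rng(0)
mu = np.zeros((3, 2))                         # toy mu(t+1): 3 nodes, 2 features
x_next = sample_eq12(mu, sigma=0.0, rng=rng)  # sigma = 0 reduces to x(t+1) = mu(t+1)
print(np.allclose(x_next, mu))  # True
```

So setting x(t+1) = mu(t+1), as discussed above, is equivalent to taking the mean (or the sigma -> 0 limit) of Eq. 12 rather than sampling from it.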