cchallu / n-hits


I can't see the model's details in the code #11

Open signalworker123 opened 1 year ago

signalworker123 commented 1 year ago

I want to see the model's details in the code, but I found that PyTorch Lightning can't be debugged in PyCharm; it just runs. How can I see how the training data flows through the model? That would help me understand the model better. Thank you.

kdgutier commented 1 year ago

Hi @signalworker123,

N-HiTS is based on deep stacked MLPs with doubly residual connections for the forecast and backcast signals. The main contributions to the architecture are the input pooling and the use of hierarchical interpolation for the multi-step prediction strategy. See the figure below:
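To make the doubly residual idea concrete, here is a minimal sketch in PyTorch (not the repo's code; `Block` and `DoublyResidualStack` are hypothetical names): each block emits a backcast that is subtracted from the running residual input, and a forecast that is added to the running forecast sum.

```python
import torch
import torch.nn as nn

class Block(nn.Module):
    """One MLP block emitting a backcast and a forecast (illustrative only)."""
    def __init__(self, input_size, horizon, hidden=32):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(input_size, hidden), nn.ReLU())
        self.backcast_head = nn.Linear(hidden, input_size)
        self.forecast_head = nn.Linear(hidden, horizon)

    def forward(self, x):
        h = self.mlp(x)
        return self.backcast_head(h), self.forecast_head(h)

class DoublyResidualStack(nn.Module):
    """Chain of blocks with doubly residual connections."""
    def __init__(self, input_size, horizon, n_blocks=3):
        super().__init__()
        self.blocks = nn.ModuleList(
            Block(input_size, horizon) for _ in range(n_blocks))

    def forward(self, x):
        residual, forecast = x, 0.0
        for block in self.blocks:
            backcast, block_forecast = block(residual)
            residual = residual - backcast        # backcast residual on the input
            forecast = forecast + block_forecast  # residual sum of forecasts
        return forecast

model = DoublyResidualStack(input_size=24, horizon=6)
y = model(torch.randn(4, 24))  # batch of 4 input windows of length 24 -> (4, 6)
```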

[Screenshot: N-HiTS architecture diagram]

kdgutier commented 1 year ago

Regarding data flows, you might want to try: `pip install torchinfo`

How is PyCharm (the IDE) related to visualizing the data flows of a PyTorch model? Would you be able to share the bug?
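Besides `torchinfo`'s `summary`, a plain-PyTorch way to watch tensors flow through a model without a debugger is to register forward hooks. A minimal sketch (the `nn.Sequential` model here is a hypothetical stand-in; the hook mechanics are the same for any `nn.Module`, including N-HiTS):

```python
import torch
import torch.nn as nn

# Hypothetical stand-in model for illustration.
model = nn.Sequential(
    nn.Linear(8, 16),
    nn.ReLU(),
    nn.Linear(16, 4),
)

def log_shapes(module, inputs, output):
    # Print each submodule's input/output shapes as data flows through.
    print(f"{module.__class__.__name__}: "
          f"in={tuple(inputs[0].shape)} out={tuple(output.shape)}")

handles = [m.register_forward_hook(log_shapes) for m in model]

x = torch.randn(2, 8)  # batch of 2, input size 8
y = model(x)           # hooks fire during the forward pass

for h in handles:
    h.remove()         # clean up the hooks afterwards
```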

signalworker123 commented 1 year ago

Thank you very much! I'm just a little confused about hierarchical interpolation, and I want to see its details in the code. But when I use the PyCharm debugger in `experiment/utils.model_fit_predict`, that is:

```python
trainer = pl.Trainer(max_epochs=mc['max_epochs'],
                     max_steps=mc['max_steps'],
                     check_val_every_n_epoch=mc['eval_freq'],
                     progress_bar_refresh_rate=1,
                     gpus=gpus,
                     callbacks=callbacks,
                     checkpoint_callback=False,
                     logger=False)
trainer.fit(model, train_loader, val_loader)
```

I can't step into `trainer.fit`, so I can't see how the training data flows through the model. Actually, there is no bug in my code; I just want to get an intuitive feeling for hierarchical interpolation.

kdgutier commented 1 year ago

The interpolation code is performed here, where we take the thetas and recover the forecast size. Hierarchical interpolation is achieved by reducing the expressivity of the theta coefficients.
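A minimal plain-Python sketch of the idea (not the repo's code): each stack predicts only a few theta coefficients and interpolates them up to the full forecast horizon, so stacks with fewer coefficients are forced to model smoother, lower-frequency structure.

```python
def interpolate(thetas, horizon):
    """Linearly upsample `thetas` to `horizon` evenly spaced points."""
    n = len(thetas)
    if n == 1:
        return [thetas[0]] * horizon
    out = []
    for t in range(horizon):
        # Position of timestamp t in theta-coordinates.
        pos = t * (n - 1) / (horizon - 1)
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        frac = pos - lo
        out.append((1 - frac) * thetas[lo] + frac * thetas[hi])
    return out

horizon = 8
coarse = interpolate([0.0, 1.0], horizon)          # 2 coefficients: very smooth
fine = interpolate([0.0, 1.0, 0.0, 1.0], horizon)  # 4 coefficients: more detail
```

With only two thetas the forecast is a straight ramp over the whole horizon; with four it can wiggle, which is the expressivity knob the comment above refers to.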

This plot captures the intuition of hierarchical interpolation and the induced frequency specialization of the layers (and the time locality beyond the Fourier transform):

[Screenshot: hierarchical interpolation and layer frequency specialization plot]

If you are interested in the theory behind it, we found very cool connections between the hierarchical interpolation mechanism and wavelet transforms. They are available in the neural basis expansion theorem.

kdgutier commented 1 year ago

By the way @signalworker123,

We are now focusing our efforts on the NeuralForecast library, where we already host NHITS along with other cool SoTA algorithms.

signalworker123 commented 1 year ago

Thank you again. I think I should learn more about PyTorch Lightning and read your paper more carefully. I tried Informer before, but it didn't work well. I only noticed the library you mentioned while chatting with you, thank you. I will try the demo in the library.

kdgutier commented 1 year ago

Here is the NeuralForecast NHITS example and documentation.