-
Hi,
I just want to quickly ask about MiDaS depth prediction. The original MiDaS approach seems to predict disparity (inverse depth) rather than Euclidean depth: https://github.com/isl-org/MiD…
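For context, converting a disparity-style output back to depth is just a pointwise inversion, but MiDaS-style predictions are only defined up to an unknown scale and shift, so an alignment step is needed first. A minimal sketch (the `scale`/`shift` parameters here are hypothetical placeholders for whatever alignment you estimate):

```python
import numpy as np

def disparity_to_depth(disparity, scale=1.0, shift=0.0, eps=1e-6):
    # MiDaS-style outputs are relative inverse depth (disparity),
    # defined only up to an unknown scale and shift; align first,
    # then invert to obtain depth. eps guards against division by zero.
    aligned = scale * disparity + shift
    return 1.0 / np.clip(aligned, eps, None)

d = np.array([0.5, 1.0, 2.0])
print(disparity_to_depth(d))  # larger disparity -> smaller depth
```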
— b0ku1, updated 2 years ago
-
Hello,
I'm encountering some issues when running iTransformer on the PEMS08 dataset. Here are the details:
- Dataset: PEMS08
- Input sequence length: 96
- Prediction length: 12
- Configuratio…
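A minimal reproduction command for the setting above, assuming the official iTransformer scripts (the flag names follow the repository's `run.py` convention; the `model_id`, paths, and file names here are illustrative and may need adjusting to your checkout):

```shell
# Forecasting on PEMS08 with input length 96 and prediction length 12.
python run.py \
  --is_training 1 \
  --model_id PEMS08_96_12 \
  --model iTransformer \
  --data PEMS \
  --root_path ./dataset/PEMS/ \
  --data_path PEMS08.npz \
  --seq_len 96 \
  --pred_len 12
```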
-
Create a new branch (`inverse-prediction`) and implement the following functions:
* `ipredict` generic and methods (default, opticut): currently called `calibrate`, but `ipredict` should better refle…
-
With questions like #29163 and with the private loss functions #15123 (almost everywhere) in place, I would like to discuss making the inverse link function public.
Models like LogisticRegression …
-
In the MiDaS loss, why are the `prediction` and `target` inverted before computing the `reg_loss`? I couldn't find a corresponding explanation in the MiDaS paper.
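One common reading is that the regression term is meant to be computed in disparity (inverse-depth) space, matching the network's disparity-like output, so metric depth targets are inverted before the loss. A hedged, simplified sketch of that idea (the function name, masking, and reduction here are illustrative, not the repository's code):

```python
import torch

def inverse_depth_reg_loss(prediction, target, mask, eps=1e-6):
    # Invert both depth maps so the error is measured in
    # disparity space rather than metric depth; eps avoids
    # division by zero for invalid pixels.
    inv_pred = 1.0 / prediction.clamp(min=eps)
    inv_tgt = 1.0 / target.clamp(min=eps)
    diff = (inv_pred - inv_tgt) * mask
    return diff.abs().sum() / mask.sum().clamp(min=1)
```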
``` python
class MidasLoss(nn.Mo…
-
Train a separate model specifically for the task of inverse modeling, where the goal is to infer the previous state and rule from a given state or sequence of states. This model would essentially lear…
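The idea above could be sketched as a small PyTorch module with two heads, one reconstructing the previous state and one classifying the rule (all dimensions, names, and the rule-set size here are made up for illustration):

```python
import torch
import torch.nn as nn

class InverseModel(nn.Module):
    # Illustrative sketch: maps an observed state vector to
    # (a) a reconstruction of the previous state and
    # (b) logits over a hypothetical discrete set of rules.
    def __init__(self, state_dim=32, n_rules=8, hidden=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(state_dim, hidden), nn.ReLU())
        self.prev_state_head = nn.Linear(hidden, state_dim)
        self.rule_head = nn.Linear(hidden, n_rules)

    def forward(self, state):
        h = self.encoder(state)
        return self.prev_state_head(h), self.rule_head(h)

model = InverseModel()
prev_state, rule_logits = model(torch.randn(4, 32))
```

Training would then combine a reconstruction loss on `prev_state` with a cross-entropy loss on `rule_logits`.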
-
- PyTorch-Forecasting version: 0.10.2
- PyTorch version: 1.11.0
- Python version: 3.9.12
- Operating System: Windows 7
### Expected behavior
I executed code like in the tutorial [Demand fore…
-
Thanks for sharing! I have a question about the parameters: what does the "Initial" parameter refer to in this image?
![321282217-9a992852-f7e6-4ca1-b9fc-a0a993c7fe81](https://github.com/user-attachment…
-
In some cases, transforming the target drastically improves prediction for regression problems (see: https://scikit-learn.org/stable/auto_examples/compose/plot_transformed_target.html).
Can I…
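For reference, scikit-learn already packages this pattern as `TransformedTargetRegressor`, which fits on the transformed target and inverts the transform automatically at predict time (the synthetic data below is only for illustration):

```python
import numpy as np
from sklearn.compose import TransformedTargetRegressor
from sklearn.linear_model import Ridge

rng = np.random.RandomState(0)
X = rng.uniform(0, 10, size=(200, 1))
# Exponential relationship: linear after a log transform.
y = np.exp(0.3 * X.ravel() + rng.normal(scale=0.1, size=200))

# Fit Ridge on log(y); predictions come back on the original scale.
model = TransformedTargetRegressor(
    regressor=Ridge(), func=np.log, inverse_func=np.exp)
model.fit(X, y)
print(model.score(X, y))
```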
-
Pretty good model! There are two models available (512-96 and 1024-96). The predict_len is set to 96; can we change it?