lucidrains / iTransformer

Unofficial implementation of iTransformer - SOTA Time Series Forecasting using Attention networks, out of Tsinghua / Ant group
MIT License
413 stars · 35 forks

Multiclass prediction #28

Open TomKoester opened 2 months ago

TomKoester commented 2 months ago

Hey, is it possible to do multi-class prediction with this?

My data has the following format: Timestamp, Feature 1, ..., Feature X, Target 1, ..., Target X

Can you give me a short example of how the iTransformer knows which columns are the target features?

Thanks
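
For reference, a minimal sketch of what that layout looks like at the model boundary, following the constructor arguments in this repo's README (the column counts and hyperparameters below are made-up example values, and parameter names may differ in the current version): the timestamp is dropped, and every remaining column, feature or target alike, becomes just another variate that gets forecast.

```python
import torch
from iTransformer import iTransformer

# hypothetical table: timestamp + 6 feature columns + 2 target columns;
# the timestamp is dropped and all 8 remaining columns become variates
num_features, num_targets = 6, 2
num_variates = num_features + num_targets

model = iTransformer(
    num_variates = num_variates,
    lookback_len = 96,            # history window fed to the model
    dim = 256,                    # model width
    depth = 6,
    heads = 8,
    dim_head = 64,
    pred_length = (12, 24)        # one or several forecast horizons
)

x = torch.randn(2, 96, num_variates)   # (batch, lookback_len, variates)
preds = model(x)                       # dict: horizon -> (batch, horizon, variates)
print(preds[24].shape)                 # torch.Size([2, 24, 8]) -- every column is forecast
```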

lucidrains commented 2 months ago

@TomKoester does it even work for single-class prediction? Happy to build it if you can show me your results

TomKoester commented 2 months ago

Here is my output: [screenshot]. I've used an SNGP layer with a threshold to model the certainty for my different classes. I just tried it with generated dummy data and fed it into the model:

[screenshot]
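
The SNGP layer itself isn't reproduced here, but a much simpler stand-in for the thresholding idea (an independent sigmoid per class, with a class predicted when its probability clears a threshold) could look like the sketch below; it deliberately omits the spectral normalization and GP output layer that make SNGP what it is, and all names and sizes are illustrative.

```python
import torch
import torch.nn as nn

# toy multi-label head on top of a forecast tensor; NOT an SNGP implementation
num_variates, pred_len, num_classes = 8, 24, 5
threshold = 0.7

head = nn.Sequential(
    nn.Flatten(),                               # (B, pred_len * num_variates)
    nn.Linear(pred_len * num_variates, 128),
    nn.ReLU(),
    nn.Linear(128, num_classes),
)

forecast = torch.randn(2, pred_len, num_variates)   # e.g. an iTransformer forecast
probs = torch.sigmoid(head(forecast))               # independent per-class probabilities
labels = (probs > threshold).int()                  # multi-label decision via threshold
print(probs.shape, labels.shape)                    # torch.Size([2, 5]) for both
```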

TomKoester commented 2 months ago

I forked my repo from the original thuml repo. Is it even possible to use the training classes for multilabel classification? Or do you think I'm better off implementing this myself?

TomKoester commented 2 months ago

I've read the paper again. Did I understand it right that the iTransformer doesn't care which columns are inputs and which are targets, since it predicts ALL features?
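
If that reading is right, selecting the targets is just a matter of slicing the output; a tiny sketch with made-up shapes, assuming the target columns sit at the end of the table:

```python
import torch

# preds_24: a forecast for every variate, e.g. from the sketch further up
preds_24 = torch.randn(2, 24, 8)          # (batch, horizon, num_variates)
num_targets = 2                           # assume the last 2 columns are the targets

target_forecast = preds_24[..., -num_targets:]   # the model itself never distinguishes them
print(target_forecast.shape)              # torch.Size([2, 24, 2])
```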

TomKoester commented 2 months ago

Hey @lucidrains, it's me again :) I think I can modify the training process by adjusting the f_dim parameter in exp_long_term_forecasting.py: [screenshot]

What are your thoughts on that? It would ignore all predicted input features during training and evaluate only on the f_dim target columns.
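
For anyone landing here later: the stock thuml training loop (from memory, so the exact lines may differ in a given fork) sets f_dim = -1 for features='MS' and 0 for 'M', then slices both outputs and batch_y with it before computing the loss. A standalone sketch of the generalization suggested above, keeping the last num_targets columns instead of only the last one (slice_targets and num_targets are hypothetical names, not part of the repo):

```python
import torch

def slice_targets(outputs: torch.Tensor, batch_y: torch.Tensor,
                  pred_len: int, num_targets: int):
    """Mimics the f_dim slicing in exp_long_term_forecasting.py, but keeps the
    last `num_targets` columns instead of only the last one ('MS' behaviour)."""
    f_dim = -num_targets                      # stock code uses -1 for 'MS', 0 for 'M'
    outputs = outputs[:, -pred_len:, f_dim:]
    batch_y = batch_y[:, -pred_len:, f_dim:]
    return outputs, batch_y

# toy check with made-up shapes: 8 variates, the last 2 of which are targets
outputs = torch.randn(4, 24, 8)
batch_y = torch.randn(4, 24, 8)
out, tgt = slice_targets(outputs, batch_y, pred_len=24, num_targets=2)
print(out.shape, tgt.shape)                   # torch.Size([4, 24, 2]) for both

criterion = torch.nn.MSELoss()
loss = criterion(out, tgt)                    # loss is computed only on the target columns
```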

satyrmipt commented 1 week ago

@TomKoester, there are no separate "features" and "targets" here. The inputs are individual time series, and the outputs are forecasts for each one of them.
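
A toy illustration of that point (not this repo's actual code, just the idea from the paper): each variate's whole history becomes one token, attention mixes information across variates, and every variate gets its own forecast back.

```python
import torch
import torch.nn as nn

B, L, N, D, H = 2, 96, 8, 64, 24         # batch, lookback, variates, model dim, horizon
x = torch.randn(B, L, N)

embed = nn.Linear(L, D)                   # embed each variate's full history into one token
attn = nn.MultiheadAttention(D, num_heads=4, batch_first=True)
head = nn.Linear(D, H)                    # per-variate forecast head

tokens = embed(x.transpose(1, 2))         # (B, N, D): one token per variate
mixed, _ = attn(tokens, tokens, tokens)   # attention runs ACROSS the variate axis
forecast = head(mixed).transpose(1, 2)    # (B, H, N): a prediction for every variate
print(forecast.shape)                     # torch.Size([2, 24, 8])
```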