KimMeen / Time-LLM

[ICLR 2024] Official implementation of " 🦙 Time-LLM: Time Series Forecasting by Reprogramming Large Language Models"
https://arxiv.org/abs/2310.01728
Apache License 2.0

More questions about issue 104 #113

Closed stjack01 closed 2 months ago

stjack01 commented 3 months ago

This question continues the discussion from issue #104 about rescaling the results.

@kwuking, could you clarify what you meant by "if you want to inverse it back, you only need to reshape the data..."? Do you mean calling `inverse_transform` on the scaler in dataloader.py? That is, reshape `outputs` and `batch_y` and then call `inverse_transform`? Could you explain in more detail?

```python
def vali(args, accelerator, model, vali_data, vali_loader, criterion, mae_metric):
    ......
    outputs = outputs[:, -args.pred_len:, f_dim:]
    batch_y = batch_y[:, -args.pred_len:, f_dim:].to(accelerator.device)

    pred = outputs.detach()
    true = batch_y.detach()

    loss = criterion(pred, true)
    mae_loss = mae_metric(pred, true)
```
kwuking commented 3 months ago

Yes, your understanding is correct. If you want to restore the scaled data to its original scale, you only need to reshape the output so that its shape matches the shape the data had when it was originally scaled, and then call `inverse_transform` to recover the original scale. Thanks again for your interest in our work.
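A minimal sketch of the reshape-then-inverse step described above, assuming the scaler is a scikit-learn `StandardScaler` fit on 2-D `(num_timesteps, num_features)` data, as in the repo's dataloader. All shapes and the random data here are illustrative, not taken from the actual training setup:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

# Illustrative setup: the scaler was fit on 2-D data (time, features),
# while model outputs are batched 3-D arrays (batch, pred_len, features).
num_features = 7
scaler = StandardScaler()
scaler.fit(np.random.rand(100, num_features))  # stand-in for the raw training data

batch, pred_len = 4, 24
outputs = np.random.randn(batch, pred_len, num_features)  # scaled predictions

# Reshape to the 2-D layout the scaler was fit on, invert, then reshape back.
flat = outputs.reshape(-1, num_features)            # (batch * pred_len, features)
restored = scaler.inverse_transform(flat)           # back to the original scale
restored = restored.reshape(batch, pred_len, num_features)
```

The same reshape/inverse/reshape round trip applies to `batch_y` if you want the ground truth on the original scale as well.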