yuqinie98 / PatchTST

An official implementation of PatchTST: "A Time Series is Worth 64 Words: Long-term Forecasting with Transformers." (ICLR 2023) https://arxiv.org/abs/2211.14730
Apache License 2.0

run patchtst_finetune.py error #18

Closed: shushan2017 closed this issue 1 year ago

shushan2017 commented 1 year ago

```
args: Namespace(is_finetune=0, is_linear_probe=0, dset_finetune='exchange', context_points=512, target_points=96, batch_size=64, num_workers=0, scaler='standard', features='M', patch_len=12, stride=12, revin=1, n_layers=3, n_heads=16, d_model=128, d_ff=256, dropout=0.2, head_dropout=0.2, n_epochs_finetune=20, lr=0.0001, pretrained_model='./p_model.pth', finetuned_model_id=1, model_type='based_model')
weight_path= saved_models/exchange/masked_patchtst/based_model/exchange_patchtst_finetuned_cw512_tw96_patch12_stride12_epochs-finetune20_model1
number of patches: 42
number of model params 920672
Traceback (most recent call last):
  File "D:\pyyj\PatchTST-main\PatchTST_self_supervised\patchtst_finetune.py", line 235, in <module>
    out = test_func(weight_path)
  File "D:\pyyj\PatchTST-main\PatchTST_self_supervised\patchtst_finetune.py", line 200, in test_func
    out = learn.test(dls.test, weight_path=weight_path+'.pth', scores=[mse,mae])  # out: a list of [pred, targ, score]
  File "D:\pyyj\PatchTST-main\PatchTST_self_supervised\src\learner.py", line 258, in test
    if weight_path is not None: self.load(weight_path)
  File "D:\pyyj\PatchTST-main\PatchTST_self_supervised\src\learner.py", line 387, in load
    load_model(fname, self.model, self.opt, with_opt, device=device, strict=strict)
  File "D:\pyyj\PatchTST-main\PatchTST_self_supervised\src\learner.py", line 429, in load_model
    state = torch.load(path, map_location=device)
  File "C:\anaconda3\lib\site-packages\torch\serialization.py", line 771, in load
    with _open_file_like(f, 'rb') as opened_file:
  File "C:\anaconda3\lib\site-packages\torch\serialization.py", line 270, in _open_file_like
    return _open_file(name_or_buffer, mode)
  File "C:\anaconda3\lib\site-packages\torch\serialization.py", line 251, in __init__
    super(_open_file, self).__init__(open(name, mode))
FileNotFoundError: [Errno 2] No such file or directory: 'saved_models/exchange/masked_patchtst/based_model/exchange_patchtst_finetuned_cw512_tw96_patch12_stride12_epochs-finetune20_model1.pth'
```
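The failure is a plain missing-file error: `learn.test` passes `weight_path + '.pth'` down to `torch.load`, which cannot open a checkpoint that was never written. A pre-flight check makes the cause obvious instead of surfacing it deep inside `serialization.py`; a minimal sketch (the helper name is hypothetical, not part of the repo):

```python
import os

def resolve_checkpoint(weight_path):
    """Hypothetical helper: fail early with an actionable message
    instead of letting torch.load raise inside serialization.py."""
    fname = weight_path + '.pth'  # learner.test appends '.pth' the same way
    if not os.path.isfile(fname):
        raise FileNotFoundError(
            f"Checkpoint '{fname}' does not exist. Run the pretraining and "
            "finetuning stages first so the weights are saved to this path."
        )
    return fname
```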

ikvision commented 1 year ago

patchtst_finetune should be run after patchtst_pretrain, according to https://github.com/yuqinie98/PatchTST/blob/main/README.md#self-supervised-learning-1. Where did you save the pre-trained model?

namctin commented 1 year ago

Yes, we need to run the pretraining first to save the weights, and then use the trained weights for finetuning. Thanks @ikvision for explaining that.
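The two-stage order the maintainers describe can be sketched as follows. Flags are deliberately omitted because they vary by experiment; this is a hedged sketch of the workflow, not the repo's documented commands — check each script's argparse defaults and the README section linked above.

```shell
# Run from PatchTST_self_supervised/.
# 1) Pretrain: writes the pretrained weights under saved_models/.
python patchtst_pretrain.py

# 2) Finetune and test: loads the weights saved in step 1; the finetuned
#    checkpoint is then available for the test stage to load.
python patchtst_finetune.py
```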