gzerveas / mvts_transformer

Multivariate Time Series Transformer, public version
MIT License

How training samples are used for unsupervised pretraining and fine-tuning the model #61

Open hungryGeek16 opened 8 months ago

hungryGeek16 commented 8 months ago

@gzerveas These are the commands I am using for unsupervised pretraining and then fine-tuning:

```bash
# Pretraining
python src/main.py --output_dir experiments --comment "pretraining through imputation" --name FaceDetection_pretrained --records_file Imputation_records.xls --data_dir dataset --data_class tsra --pattern TRAIN --val_ratio 0.2 --epochs 700 --lr 0.001 --optimizer RAdam --batch_size 128 --pos_encoding learnable --d_model 128 --dim_feedforward 256 --num_heads 8 --num_layers 3
```
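To make the question concrete, my understanding of how each training sample is used during pretraining: the denoising objective masks random segments of every input series and trains the transformer to reconstruct the masked values, with `--pattern TRAIN` plus `--val_ratio 0.2` holding out 20% of the training samples for validation. Below is a minimal sketch of the geometric masking scheme described in the paper; the function and parameter names are my own, and the 0.15 / 3 defaults are the paper's reported values, not necessarily this run's settings:

```python
import numpy as np

def geometric_mask(seq_len, n_features, masking_ratio=0.15, mean_mask_length=3, rng=None):
    """Sketch of geometric masking for imputation pretraining: for each
    feature independently, alternate between masked and unmasked segments,
    with segment lengths drawn from geometric distributions chosen so that
    on average `masking_ratio` of all values end up hidden."""
    rng = rng or np.random.default_rng()
    mask = np.ones((seq_len, n_features), dtype=bool)
    p_end_masked = 1.0 / mean_mask_length  # prob. a masked segment ends at each step
    # chosen so the expected masked fraction equals masking_ratio
    p_end_unmasked = p_end_masked * masking_ratio / (1.0 - masking_ratio)
    for f in range(n_features):
        masked = rng.random() < masking_ratio  # random initial state per feature
        for t in range(seq_len):
            mask[t, f] = not masked
            p = p_end_masked if masked else p_end_unmasked
            if rng.random() < p:
                masked = not masked
    return mask

# Each sample x of shape (seq_len, n_features) would then be used as:
#   input  = x * mask              (masked values zeroed out)
#   target = x where mask is False (loss computed only on masked positions)
```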

```bash
# Fine-tuning
python src/main.py --output_dir experiments --comment "finetune for classification" --name finetuned --records_file Classification_records.xls --data_dir dataset --data_class tsra --load_model experiments/FaceDetection_pretrained_2023-11-06_19-06-24_MVw/checkpoints/model_last.pth --pattern TRAIN --val_pattern TEST --batch_size 128 --epochs 100 --pos_encoding learnable --d_model 128 --dim_feedforward 256 --num_heads 8 --num_layers 3 --task classification --change_output --key_metric accuracy
```
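And for fine-tuning, my understanding of what `--load_model` combined with `--change_output` does: the pretrained encoder weights are restored from `model_last.pth`, but the pretraining output layer is discarded so a freshly initialized classification head can be trained on the labeled data. An illustrative sketch follows; the checkpoint key and the `output_layer` name prefix are assumptions on my part, not necessarily the repo's exact layout:

```python
import torch

def load_pretrained_for_classification(model, checkpoint_path):
    """Illustrative sketch: copy pretrained weights into `model`, skipping
    the output layer so the new classification head (already present in
    `model`) keeps its fresh initialization."""
    checkpoint = torch.load(checkpoint_path, map_location='cpu')
    pretrained = checkpoint['state_dict']  # assumed checkpoint layout
    # keep only weights whose names do not belong to the old output layer
    filtered = {k: v for k, v in pretrained.items()
                if not k.startswith('output_layer')}
    missing, unexpected = model.load_state_dict(filtered, strict=False)
    # `missing` should list only the new head's parameters
    return model
```

If I read the flags correctly, `--val_pattern TEST` then evaluates this head on the TEST files each epoch, and `--key_metric accuracy` selects the best checkpoint by validation accuracy.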