DC-research / TEMPO

The official code for "TEMPO: Prompt-based Generative Pre-trained Transformer for Time Series Forecasting (ICLR 2024)". TEMPO is one of the first open-source time series foundation models for forecasting (v1.0).

Hyperparameter settings? #1

Closed · ztb-35 closed this issue 3 months ago

ztb-35 commented 4 months ago

Dear authors:

Thanks for your great work and the simple but efficient code. However, I couldn't find the hyperparameter settings in your code or paper. Could you share the list of hyperparameter settings for GPT-2? I am really interested in your work and would like to reproduce it.

Thanks!

idevede commented 3 months ago

Dear ztb-35,

Thank you for your kind words and interest in our work! 😁

The hyperparameter settings are located in the `scripts` folder. FYI, we are currently working on updating the demo for more applications, which should be available soon.
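
For anyone landing on this issue later, below is a minimal sketch of the kind of hyperparameter configuration such run scripts typically pass to a GPT-2-based forecaster. Every argument name and value here is an illustrative assumption, not taken from the TEMPO repository; the authoritative settings are the shell scripts in the `scripts` folder.

```python
# Illustrative only: hypothetical hyperparameters for a GPT-2-based time series
# forecaster in the spirit of TEMPO. None of these names or values are taken
# from the TEMPO repository; the real settings live in its `scripts` folder.
from argparse import Namespace

example_config = Namespace(
    model="TEMPO",        # model name (assumed flag)
    gpt_layers=6,         # GPT-2 blocks kept from the pretrained backbone (assumed)
    seq_len=512,          # input window length (assumed)
    pred_len=96,          # forecasting horizon (assumed)
    patch_size=16,        # patch length used to tokenize the series (assumed)
    stride=8,             # stride between patches (assumed)
    learning_rate=1e-4,   # optimizer learning rate (placeholder)
    batch_size=256,       # training batch size (placeholder)
    train_epochs=10,      # number of training epochs (placeholder)
)

if __name__ == "__main__":
    # Log the configuration the way a run script might before training.
    for key, value in vars(example_config).items():
        print(f"{key:>15}: {value}")
```

In the repository, values like these are normally passed as command-line flags from a shell script in `scripts` to the training entry point, so treat the block above as a reading aid rather than a reproduction recipe.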