maxjcohen / transformer

Implementation of Transformer model (originally from Attention is All You Need) applied to Time Series.
https://timeseriestransformer.readthedocs.io/en/latest/
GNU General Public License v3.0

How to set Positional encoding #59

Closed flydephone closed 1 year ago

flydephone commented 1 year ago

I have a time-series dataset with day-scale resolution. When I set pe = 'original' # Positional encoding, I get the error TypeError: generate_original_PE() got an unexpected keyword argument 'period'. So I can only use the regular PE, but I don't know how to set the pe_period. It seems that pe_period = 24 is meant for datasets with hour-scale resolution.
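
For reference, a minimal sketch of calling the two positional-encoding generators from tst/utils.py directly; generate_regular_PE and its period keyword are assumed from the pe = 'regular' option and the error message above, so treat this as illustrative rather than the repository's exact API:

```python
from tst.utils import generate_original_PE, generate_regular_PE

length = 365  # e.g. one year of day-scale observations
d_model = 64  # model dimension

# Original sinusoidal encoding from "Attention is All You Need": it has no
# notion of a period, so it must not receive a period argument.
pe_original = generate_original_PE(length, d_model)

# Regular (periodic) encoding: the period is counted in time steps, e.g.
# 24 for hourly data with a daily cycle, or 7 for day-scale data with a
# weekly cycle (the value 7 here is an assumption for illustration).
pe_regular = generate_regular_PE(length, d_model, period=7)

print(pe_original.shape, pe_regular.shape)  # expected: (length, d_model) each
```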

maxjcohen commented 1 year ago

Hi, thanks for reporting this issue. It should be fixed with commit 997b08e18fc06452a505d14c39f33294b4d9e65e.

flydephone commented 1 year ago

Thanks. By the way, I still want to know how to set the pe_period when I use the regular PE.

maxjcohen commented 1 year ago

I haven't gotten around to coding this setting; feel free to send a PR if you've got the time to write it yourself. The function is defined here: https://github.com/maxjcohen/transformer/blob/997b08e18fc06452a505d14c39f33294b4d9e65e/tst/utils.py#L7-L29
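
In case it helps whoever picks this up, here is a rough sketch of the kind of wiring such a PR could add: expose a pe_period option and forward it only to the generator that accepts a period. Everything except generate_original_PE (and its lack of a period argument) is an assumption, not the repository's actual code:

```python
from typing import Optional

import torch
from tst.utils import generate_original_PE, generate_regular_PE

# Hypothetical helper, not the repository's actual code: map the pe option to
# its generator and forward pe_period only to the period-aware one.
_PE_GENERATORS = {
    'original': generate_original_PE,
    'regular': generate_regular_PE,
}

def build_positional_encoding(pe: str, length: int, d_model: int,
                              pe_period: Optional[int] = None) -> torch.Tensor:
    generate_pe = _PE_GENERATORS[pe]
    # generate_original_PE takes no period, so only forward it for 'regular';
    # this also avoids the TypeError reported above.
    pe_kwargs = {'period': pe_period} if pe == 'regular' and pe_period else {}
    return generate_pe(length, d_model, **pe_kwargs)

# Day-scale data with a weekly cycle: the period is counted in time steps.
pe = build_positional_encoding('regular', length=365, d_model=64, pe_period=7)
```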

flydephone commented 1 year ago

OK, I will figure it out.