keras-team / keras-io

Keras documentation, hosted live at keras.io

Question about lack of positional encoding in timeseries_classification_transformer.py #1894

Open Atousa-Kalantari opened 1 month ago

Atousa-Kalantari commented 1 month ago

Issue Type

Bug

Source

source

Keras Version

Keras 2.14

Custom Code

Yes

OS Platform and Distribution

No response

Python version

No response

GPU model and memory

No response

Current Behavior?

I noticed that positional encoding is not used in the timeseries_classification_transformer.py example. Since self-attention is permutation-equivariant, the model has no built-in notion of temporal order without it, and sequence order is usually important in time series data. Why was this omitted, and does it hurt the model's effectiveness for time series classification? I'd appreciate any insights on this design choice. Thank you.
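For context, here is a minimal sketch of what adding the standard sinusoidal positional encoding from "Attention Is All You Need" to this example might look like. The helper name `positional_encoding` and the insertion point in `build_model` are my own, not part of the example; also note the example's FordA input has a feature dimension of 1, so in practice one might first project the series to a wider model dimension before adding the encoding.

```python
import numpy as np
import tensorflow as tf


def positional_encoding(seq_len, d_model):
    # Sinusoidal positional encoding from "Attention Is All You Need":
    # even feature indices get sine, odd indices get cosine.
    positions = np.arange(seq_len)[:, np.newaxis]    # shape (seq_len, 1)
    dims = np.arange(d_model)[np.newaxis, :]         # shape (1, d_model)
    angle_rates = 1.0 / np.power(
        10000.0, (2 * (dims // 2)) / np.float32(d_model)
    )
    angles = positions * angle_rates                 # shape (seq_len, d_model)
    angles[:, 0::2] = np.sin(angles[:, 0::2])
    angles[:, 1::2] = np.cos(angles[:, 1::2])
    # Add a batch axis so the encoding broadcasts over the batch dimension.
    return tf.cast(angles[np.newaxis, ...], dtype=tf.float32)


# Hypothetical change inside the example's build_model, before the
# transformer blocks:
#
#   inputs = keras.Input(shape=input_shape)   # input_shape = (seq_len, n_features)
#   x = inputs + positional_encoding(input_shape[0], input_shape[1])
#   for _ in range(num_transformer_blocks):
#       x = transformer_encoder(x, head_size, num_heads, ff_dim, dropout)
```

An alternative would be a learned position embedding (a `layers.Embedding` indexed by `tf.range(seq_len)` and added to the inputs), similar to the `TokenAndPositionEmbedding` used in the keras.io text classification transformer example.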

Standalone code to reproduce the issue or tutorial link

https://github.com/keras-team/keras-io/blob/master/examples/timeseries/timeseries_classification_transformer.py

Relevant log output

No response

sachinprasadhs commented 1 month ago

Tagging the author of the example for more info: @ntakouris, could you please take a look at this?

arun-nemani commented 1 month ago

I've been asking the same question!

Atousa-Kalantari commented 5 days ago

Hi,

I'm still waiting for a response to my question about the lack of positional encoding in the timeseries_classification_transformer.py example. Can someone clarify why it was omitted and what impact this has on model performance?

Thanks!