Open · Atousa-Kalantari opened 1 month ago
Tagging the author of the example for more info: @ntakouris, could you please take a look at this?
I've been asking the same question!
Hi,
I'm still waiting for a response to my question about the lack of positional encoding in the timeseries_classification_transformer.py example. Can someone clarify why it was omitted, and whether it affects model performance?
Thanks!
Issue Type
Bug
Source
source
Keras Version
Keras 2.14
Custom Code
Yes
OS Platform and Distribution
No response
Python version
No response
GPU model and memory
No response
Current Behavior?
I noticed that positional encoding is not used in the timeseries_classification_transformer.py example. Given the importance of sequence order in time series data, why was this omitted? Does this impact the model's effectiveness for time series classification? I'd appreciate any insights on this design choice. Thank you.
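For context, the positional encoding the question refers to is usually the sinusoidal scheme from the original Transformer paper, added to the inputs before the attention blocks so the model can distinguish time steps. A minimal, dependency-free sketch of that scheme (function name and pure-Python layout are illustrative, not taken from the Keras example):

```python
import math

def sinusoidal_positional_encoding(seq_len, d_model):
    """Build a seq_len x d_model matrix of sinusoidal position encodings.

    Even-indexed columns use sine, odd-indexed columns use cosine, with
    wavelengths that grow geometrically from 2*pi to 10000*2*pi.
    """
    pe = [[0.0] * d_model for _ in range(seq_len)]
    for pos in range(seq_len):
        for i in range(0, d_model, 2):
            angle = pos / (10000 ** (i / d_model))
            pe[pos][i] = math.sin(angle)
            if i + 1 < d_model:
                pe[pos][i + 1] = math.cos(angle)
    return pe

# In a Keras model this matrix would typically be added to the inputs
# before the first transformer block, broadcast over the batch axis:
#   x = inputs + pe
```

Without such an encoding, self-attention is permutation-invariant over the time axis, which is why the omission matters for ordered data like time series.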
Standalone code to reproduce the issue or tutorial link
Relevant log output
No response