About Code release for "Anomaly Transformer: Time Series Anomaly Detection with Association Discrepancy" (ICLR 2022 Spotlight), https://openreview.net/forum?id=LzQQ89U1qm_
MIT License
Optimize nested loop efficiency in the Anomaly Attention class #63
Refactored the nested loop structure in the Anomaly Attention class that computes the pairwise distance matrix between time indexes. Replaced the original dual for-loops, whose cost scales quadratically with window size, with a more efficient vectorized implementation.
Performance Improvement:
Original execution time for a window size of 250: ~50 seconds.
New execution time for the same window size: ~0.01 seconds.
I was experimenting with different window sizes to see their effects on my dataset. However, even on a high-performance workstation, the setup started to take too much time when I increased the window size. This new approach takes far less time to produce the same matrix.
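The refactor described above can be sketched as follows. This is an illustrative example, not the PR's actual diff: it uses NumPy broadcasting to build the |i - j| distance matrix in one vectorized operation, the same idiom that applies to the repo's PyTorch tensors (e.g. via `torch.arange` and broadcasting). The function names `distances_loop` and `distances_vectorized` are hypothetical.

```python
import numpy as np

def distances_loop(window_size):
    # Original approach: dual Python for-loops, O(window_size^2)
    # interpreter-level iterations.
    d = np.zeros((window_size, window_size))
    for i in range(window_size):
        for j in range(window_size):
            d[i, j] = abs(i - j)
    return d

def distances_vectorized(window_size):
    # Refactored approach: broadcasting computes |i - j| for all
    # index pairs in a single vectorized operation.
    idx = np.arange(window_size)
    return np.abs(idx[:, None] - idx[None, :]).astype(float)

# Both produce the same matrix; the vectorized form avoids the
# per-element Python overhead that dominated at large window sizes.
assert np.array_equal(distances_loop(250), distances_vectorized(250))
```

The speedup comes from moving the double loop out of the Python interpreter and into a single array-level operation, which is why the gain grows with window size.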