ML4ITS / mtad-gat-pytorch

PyTorch implementation of MTAD-GAT (Multivariate Time-Series Anomaly Detection via Graph Attention Networks) by Zhao et al. (2020, https://arxiv.org/abs/2009.02040).
MIT License

about gat_layer #30

Open adverbial03 opened 1 year ago

adverbial03 commented 1 year ago

Thank you for your excellent work. I don't understand something about the GAT layer. Your graph attention is implemented with the function make_attention_input, but it seems that it just copies and concatenates x (the node features v) in various ways, and I can't see how this part implements graph attention. Could you explain it in detail? Also, if I want to build a graph that is not fully connected (each node has a fixed number of edges), is that possible?
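For context, below is a minimal sketch (not the repository's actual make_attention_input code) of how a GAT layer typically builds its attention input: every pair of node feature vectors is concatenated as [v_i || v_j], which is why the implementation looks like "copying and splicing" x. The names make_pairwise_input, SimpleGATLayer, and adj_mask, and the assumed input shape (batch, n_nodes, feat_dim), are illustrative assumptions; the adjacency mask shows one common way to restrict attention to a non-fully-connected graph.

```python
import torch
import torch.nn as nn

# Assumed shape: x is (batch, n_nodes, feat_dim). Illustrative sketch only,
# not the repo's exact implementation.
def make_pairwise_input(x):
    b, n, d = x.shape
    # Repeat each node n times in place: v_1, v_1, ..., v_2, v_2, ...
    left = x.repeat_interleave(n, dim=1)      # (b, n*n, d)
    # Tile the whole node list n times: v_1, v_2, ..., v_n, v_1, v_2, ...
    right = x.repeat(1, n, 1)                 # (b, n*n, d)
    # Entry (i, j) now holds the concatenation [v_i || v_j]
    pairs = torch.cat([left, right], dim=-1)  # (b, n*n, 2*d)
    return pairs.view(b, n, n, 2 * d)

class SimpleGATLayer(nn.Module):
    def __init__(self, feat_dim):
        super().__init__()
        # Single attention vector a, applied to each concatenated pair
        self.a = nn.Linear(2 * feat_dim, 1, bias=False)
        self.leaky_relu = nn.LeakyReLU(0.2)

    def forward(self, x, adj_mask=None):
        # e[b, i, j] = LeakyReLU(a^T [v_i || v_j])
        e = self.leaky_relu(self.a(make_pairwise_input(x))).squeeze(-1)  # (b, n, n)
        # For a graph that is not fully connected, mask missing edges
        # so they get zero attention weight after the softmax.
        if adj_mask is not None:
            e = e.masked_fill(adj_mask == 0, float("-inf"))
        alpha = torch.softmax(e, dim=-1)       # attention weights, (b, n, n)
        # Each node's output is the attention-weighted sum of node features.
        return torch.matmul(alpha, x)          # (b, n, feat_dim)
```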