danielegrattarola / keras-gat

Keras implementation of graph attention networks (GAT) by Veličković et al. (2017; https://arxiv.org/abs/1710.10903)
MIT License

How do I solve this problem??? Hoping someone can help!! #33

Open cwl1999 opened 3 years ago

cwl1999 commented 3 years ago

```
keep_dims is deprecated, use keepdims instead
Traceback (most recent call last):
  File "D:/Gra_stu/Pratice/keras-gat-master/examples/gat.py", line 36, in <module>
    H = GraphAttention(8, attn_heads=8, attn_heads_reduction='concat', dropout_rate=0.6, activation='elu', kernel_regularizer=l2(5e-4), attn_kernel_regularizer=l2(5e-4))([H] + G)
  File "D:\Downloads\Anaconda\envs\keras-gcn\lib\site-packages\keras\engine\topology.py", line 603, in __call__
    output = self.call(inputs, **kwargs)
  File "D:\Gra_stu\Pratice\keras-gat-master\keras_gat\graph_attention_layer.py", line 119, in call
    mask = -10e9 * (1.0 - A)
TypeError: unsupported operand type(s) for -: 'float' and 'list'
```

This is the corresponding line in graph_attention_layer.py: `mask = -10e9 * (1.0 - A)`
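
For reference, the TypeError can be reproduced outside Keras: subtracting an array/tensor from a float works elementwise, while subtracting a Python list is undefined. A minimal sketch (NumPy stands in for the Keras backend here):

```python
import numpy as np

A = np.eye(3, dtype=np.float32)   # adjacency as a tensor-like array
mask = -10e9 * (1.0 - A)          # elementwise: works as intended

A_as_list = [A, A]                # adjacency accidentally passed as a list
mask = -10e9 * (1.0 - A_as_list)  # raises TypeError: unsupported operand
                                  # type(s) for -: 'float' and 'list'
```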

danielegrattarola commented 3 years ago

Not too sure what's going on here, but you should check the inputs to your layers. A should be a tensor-like object, not a list.

Cheers
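
For anyone landing here, a minimal sketch of how the layer expects to be called, based on examples/gat.py in this repo; `N`, `F`, and the hyperparameter values are placeholders:

```python
from keras.layers import Input
from keras.regularizers import l2
from keras_gat import GraphAttention

N = 2708  # number of nodes (placeholder, e.g. Cora)
F = 1433  # number of node features (placeholder, e.g. Cora)

X_in = Input(shape=(F,))  # node feature matrix, one row per node
A_in = Input(shape=(N,))  # adjacency matrix, one row per node

# The layer takes a list of exactly two tensors: [features, adjacency].
# If the second element is itself a Python list (e.g. the input list is
# nested one level too deep), A arrives inside call() as a list and
# triggers the TypeError shown in the traceback above.
H = GraphAttention(8,
                   attn_heads=8,
                   attn_heads_reduction='concat',
                   dropout_rate=0.6,
                   activation='elu',
                   kernel_regularizer=l2(5e-4),
                   attn_kernel_regularizer=l2(5e-4))([X_in, A_in])
```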