pyg-team / pytorch_geometric

Graph Neural Network Library for PyTorch
https://pyg.org
MIT License

`return_attention_weights` when set to False returns attention weights in GATv2Conv #9319

Open kar655 opened 1 month ago

kar655 commented 1 month ago

🐛 Describe the bug

When setting the optional argument `return_attention_weights` in `GATv2Conv`'s `forward` to `False`, it still returns the attention weights. I believe it doesn't check the parameter's value, only whether it is not `None`.


        if isinstance(return_attention_weights, bool):  # True for both True and False
            if isinstance(edge_index, Tensor):
                if is_torch_sparse_tensor(edge_index):
                    # TODO TorchScript requires to return a tuple
                    adj = set_sparse_value(edge_index, alpha)
                    return out, (adj, alpha)
                else:
                    return out, (edge_index, alpha)
            elif isinstance(edge_index, SparseTensor):
                return out, edge_index.set_value(alpha, layout='coo')
        else:
            return out
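The root cause is visible with a quick check in plain Python: `isinstance` tests the type only, so `False` takes the same branch as `True`:

```python
# isinstance() checks the type, not the value, so the branch above
# fires for both True and False; only None falls through to `return out`.
print(isinstance(None, bool))   # False -> plain `out` is returned
print(isinstance(True, bool))   # True  -> (out, attention weights)
print(isinstance(False, bool))  # True  -> also (out, attention weights)!
```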

My guess is to change `return_attention_weights: Optional[bool] = None` to just `return_attention_weights: bool = False` and adjust the `if` to check the value instead of the type.
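As a toy sketch of that value-based check (hypothetical names, not the real `GATv2Conv` code; the maintainer reply below explains why TorchScript currently rules this out):

```python
from typing import Tuple, Union

def forward_proposed(out: float, alpha: float,
                     return_attention_weights: bool = False) -> Union[float, Tuple[float, float]]:
    # Checks the value, not just the type: only an explicit True
    # returns the attention weights.
    if return_attention_weights:
        return out, alpha
    return out

print(forward_proposed(1.0, 0.5))         # 1.0
print(forward_proposed(1.0, 0.5, False))  # 1.0
print(forward_proposed(1.0, 0.5, True))   # (1.0, 0.5)
```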

By running `rg "isinstance(.*, bool)" -C 5` in the repo, it looks like other files might have a similar issue:

Let me know if this is a bug or if I'm missing something. Thanks

Versions

I'm using torch-geometric 2.4.0, but it still occurs in the latest version of the repo.

rusty1s commented 1 month ago

Yes, this is a known hack required by TorchScript. We cannot yet change the return type based on the *value* of a boolean argument, so we opted to condition the return type on whether `return_attention_weights` is `None` or a `bool` (any `bool`, including `False`).
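The TorchScript-friendly pattern can be sketched with a toy function (hypothetical names, not the actual `GATv2Conv` implementation): the argument's type, rather than its value, selects the return type.

```python
from typing import Optional, Tuple, Union

def forward_hack(out: float, alpha: float,
                 return_attention_weights: Optional[bool] = None) -> Union[float, Tuple[float, float]]:
    # TorchScript can branch the return type on the argument's *type*
    # (None vs bool), but not on a bool's runtime value.
    if isinstance(return_attention_weights, bool):
        return out, alpha  # reached for True AND False
    return out

print(forward_hack(2.0, 0.3))         # 2.0
print(forward_hack(2.0, 0.3, True))   # (2.0, 0.3)
print(forward_hack(2.0, 0.3, False))  # (2.0, 0.3) -- the behavior reported above
```

In practice this means: omit the argument to get only the output, and pass any `bool` to also get the weights.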