CarloLucibello / GraphNeuralNetworks.jl

Graph Neural Networks in Julia
https://carlolucibello.github.io/GraphNeuralNetworks.jl/dev/
MIT License

Dropout inside GATConv layer #258

Closed · afternone closed this issue 2 weeks ago

afternone commented 1 year ago

In the (Optional) Exercises, dropout inside the GATConv layer is mentioned, but I did not find a keyword parameter for dropout. How can I set dropout inside the GATConv layer?

CarloLucibello commented 1 year ago

Actually, that is a missing feature. We should add a dropout option to GATConv and GATv2Conv; PyG has it.
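
For reference, a sketch of what such an option might look like, modeled on PyG's `GATConv(..., dropout=0.6)`. The `dropout` keyword below is hypothetical, not an existing GraphNeuralNetworks.jl option:

```julia
# Hypothetical usage once the feature lands; the `dropout` keyword is an
# assumption modeled on PyG's GATConv, not a current option in this package.
using GraphNeuralNetworks

l = GATConv(16 => 8, heads=4, dropout=0.6)  # hypothetical `dropout` keyword
```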

5hv5hvnk commented 1 year ago

@CarloLucibello Can adding something like this after line 366 here do the job?

```julia
if l.dropout > 0 && Flux.istraining()
    x = Flux.dropout(x, l.dropout)  # drop node features with probability l.dropout
end
```
CarloLucibello commented 1 year ago

Not really. Dropout is performed on the attention coefficients, I think. It corresponds to masking the attention; see https://github.com/FluxML/NNlib.jl/blob/master/src/attention.jl
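
To make the distinction concrete, here is a minimal dense (all-pairs, single-head) sketch of where attention dropout would go. The function and variable names are illustrative, not GATConv internals; it assumes `NNlib.dropout(x, p)` is available:

```julia
# Minimal sketch, assuming dense all-pairs attention and a single head;
# GATConv itself operates on the sparse edge set of the graph.
using Flux               # re-exports leakyrelu and softmax from NNlib
using NNlib: dropout

function gat_attention_sketch(x, W, a_src, a_dst; p=0.6, training=true)
    h = W * x                                               # (out, n) projected node features
    logits = leakyrelu.(a_src' * h .+ (a_dst' * h)', 0.2)   # (n, n) pairwise attention scores
    α = softmax(logits; dims=2)                             # attention coefficients, rows sum to 1
    if training && p > 0
        α = dropout(α, p)                                   # mask attention coefficients, not features
    end
    return h * α'                                           # column i aggregates Σ_j α[i,j] * h[:, j]
end
```

Masking `α` rather than `x` randomly drops individual neighbor messages during training, which is what PyG's `dropout` argument does; dropping on the features instead would perturb every message from a node identically.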