Closed AarSeBail closed 10 months ago
Thanks for taking this on; it seems mostly fine. I added a few suggestions.
Would it be worth exposing these functions?
I don't think they are worth exposing at the moment
I was expanding the test case (I'm glad I did, since I believe I uncovered an issue with `cat_features`), and I came across something that seems off. The syntax `g.ndata[:nodetype][:x] = value` seems broken, unless I'm mistaken. I think the issue is that in datastore.jl, `Base.setindex!` is defined as

```julia
Base.setindex!(ds::DataStore, s::Symbol, x) = setproperty!(ds, s, x)
```

I think the argument order is incorrect, and it should be

```julia
Base.setindex!(ds::DataStore, x, s::Symbol) = setproperty!(ds, s, x)
```

Is this intentional and I'm just using it incorrectly?
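To illustrate why the argument order matters: in Julia, `ds[s] = x` lowers to `setindex!(ds, x, s)`, with the assigned value *before* the key. Here is a minimal sketch using a toy stand-in type (`ToyStore` is my own illustration, not the library's `DataStore`):

```julia
# Simplified stand-in for a property-backed store, just to show
# Julia's setindex! calling convention (not GNNGraphs' actual type).
struct ToyStore
    data::Dict{Symbol, Any}
end
Base.setproperty!(ds::ToyStore, s::Symbol, x) = (getfield(ds, :data)[s] = x)
Base.getindex(ds::ToyStore, s::Symbol) = getfield(ds, :data)[s]

# `ds[s] = x` lowers to `setindex!(ds, x, s)`: value first, then key.
# A definition with (ds, s::Symbol, x) would never be hit by `ds[:x] = v`.
Base.setindex!(ds::ToyStore, x, s::Symbol) = setproperty!(ds, s, x)

ds = ToyStore(Dict{Symbol, Any}())
ds[:x] = [1, 2, 3]   # now dispatches to the method above
```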
I believe that everything here is in working order now. I modified `cat_features(xs::AbstractVector{<:Dict})` to be less strict, and the current tests still pass. I am wondering whether it would be worthwhile to update the other `cat_features` implementations that might pose problems for heterogeneous graphs, such as `cat_features(x1::Dict{Symbol, T}, x2::Dict{Symbol, T})`, in particular the ones which throw `@error "cannot concatenate feature data with different keys"`. If so, I would not be opposed to drafting a PR.
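For illustration, here is one hypothetical shape a "less strict" concatenation over dicts with differing key sets could take. The name `cat_features_relaxed` and the union-of-keys policy are my own sketch, not necessarily what the PR implements:

```julia
# Hypothetical sketch: instead of erroring when key sets differ,
# concatenate each key over only the dicts that contain it.
function cat_features_relaxed(xs::AbstractVector{<:Dict{Symbol}})
    all_keys = union((keys(x) for x in xs)...)
    Dict(k => reduce(vcat, (x[k] for x in xs if haskey(x, k)))
         for k in all_keys)
end

d1 = Dict(:a => [1, 2], :b => [3])
d2 = Dict(:a => [4])            # no :b key; a strict version would error
cat_features_relaxed([d1, d2])  # Dict(:a => [1, 2, 4], :b => [3])
```

For heterogeneous graphs this matters because different graphs in a batch may carry features for different node or edge types, so requiring identical key sets is too restrictive.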
I kept the previous implementation of `Base.setindex!(ds::DataStore, s::Symbol, x)`, but added an implementation of `Base.setindex!(ds::DataStore, x, s::Symbol)` to make the syntax work as I believe it was intended. I also finalized the new test case.
I can't believe that `setindex!` bug was not covered by tests! Can you add an explicit one in test/GNNGraphs/datastore.jl?
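One possible shape for the requested regression test, assuming the file uses the standard `Test` framework; the `DataStore` constructor call shown here is an assumption about the library's API and should be adapted to the existing tests in that file:

```julia
using Test
# using GraphNeuralNetworks   # assumed import providing DataStore

@testset "setindex! on DataStore" begin
    # Constructor signature assumed for illustration: n observations plus data.
    ds = DataStore(3, (x = rand(2, 3),))
    ds[:y] = rand(4, 3)          # exercises setindex!(ds, x, s::Symbol)
    @test ds[:y] === ds.y        # indexing and property access agree
    @test size(ds[:y]) == (4, 3)
end
```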
> I am wondering whether it would be worthwhile updating the other `cat_features` implementations which might pose problems for heterogeneous graphs, such as `cat_features(x1::Dict{Symbol, T}, x2::Dict{Symbol, T})`. In particular the ones which throw `@error "cannot concatenate feature data with different keys"`. If so, I would not be opposed to drafting a PR.
It could be worth doing.
I have added the test that you requested and committed the suggestions. I'm not sure why it did not pass the test on CI earlier, since it was passing locally. Assuming that it is working now, I think this is near completion.
I removed the aforementioned type constraints by generalizing the `Flux.batch` implementation. First and foremost, the implementation may not be correct despite passing the tests, so that needs to be checked.
It's not the cleanest patch, so it definitely needs some revisions at the very least. In particular, instead of `edge_index` I use a nullable version, and the same for `get_edge_weight`. Would it be worth exposing these functions? Also, I am sure that there are inefficiencies in the patch.
The test case is incomplete, so it needs to be fleshed out correctly.
Resolves https://github.com/CarloLucibello/GraphNeuralNetworks.jl/issues/341