Open Chen-Cai-OSU opened 3 years ago
Hi! I did not test them one by one. I took the results from "Jason Hartford, Devon Graham, Kevin Leyton-Brown, and Siamak Ravanbakhsh. Deep models of interactions across sets. In International Conference on Machine Learning, pp. 1909–1918, 2018.".
Hello,
Thank you for the quick response. I am interested in using your method, but my concern is that my data is extremely sparse: the density is usually around 0.01%–0.03% (there are very few common items that two users co-rate). I am worried that the subgraph topology will be fairly simple due to the extreme sparsity.
I saw in your paper that there is indeed an encouraging result in Section 5.3 regarding the very sparse case (so my understanding is that for the 0.001 case, the resulting density for ML-1M is 0.001 * 0.0447 = 0.00447%). Do you have any suggestions? Thank you!
Another question I had: the subgraph for each (u, v) is not necessarily connected, right? How do you handle it when a subgraph has two connected components? Thank you!
For the first question, yes, the resulting density is 0.00447%. IGMC shows higher robustness to sparsity than transductive matrix factorization methods.
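For readers following the density arithmetic, a minimal sketch of the calculation discussed above (the 4.47% figure is ML-1M's overall rating density; the 0.001 keep ratio is the sparsification setting from the question):

```python
# Density arithmetic from the thread: keeping 0.1% of ML-1M's observed
# ratings, where ML-1M's full density is ~4.47% of all (user, item) pairs.
ml_1m_density = 0.0447   # fraction of (user, item) pairs with an observed rating
keep_ratio = 0.001       # sparsification: keep 0.1% of the observed ratings

resulting_density = keep_ratio * ml_1m_density
print(f"{resulting_density * 100:.5f}%")  # 0.00447%
```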
For the second one, yes, the subgraph around (u, v) need not be connected. That is strong evidence that u might not have an interest in v. Thus, I did not handle this case specially; I just let the GNN learn from the disconnected subgraph.
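To make the disconnected case concrete, here is a hedged sketch (not IGMC's actual code; graph construction and names are illustrative) that extracts the 1-hop enclosing subgraph around a target (user, item) pair from a bipartite rating graph, removes the target edge as IGMC does before prediction, and checks connectivity with a BFS:

```python
from collections import deque

def is_enclosing_subgraph_connected(adj, u, v):
    """adj maps each node to the set of its neighbors in the rating graph."""
    # Nodes within one hop of either endpoint of the target pair.
    nodes = {u, v} | adj[u] | adj[v]
    # Induced adjacency on those nodes, with the target edge (u, v) removed.
    sub = {n: (adj[n] & nodes) - ({v} if n == u else {u} if n == v else set())
           for n in nodes}
    # BFS from u; the subgraph is connected iff every node is reached.
    seen, queue = {u}, deque([u])
    while queue:
        for nbr in sub[queue.popleft()]:
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return seen == nodes

# Toy bipartite graph: u0 rated i0; u2 rated i2; u0 and i2 share no raters,
# so the enclosing subgraph around (u0, i2) splits into two components.
adj = {"u0": {"i0"}, "u1": {"i0", "i1"}, "u2": {"i2"},
       "i0": {"u0", "u1"}, "i1": {"u1"}, "i2": {"u2"}}
print(is_enclosing_subgraph_connected(adj, "u0", "i2"))  # False
```

As the answer above notes, a disconnected subgraph is itself an informative signal (u and v share no short rating paths), so the GNN is simply given the subgraph as-is rather than having the case filtered out.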
Hello,
Thank you very much for the nice paper and code. I was wondering, when you compare IGMC with other methods, how do you replicate the results from previous work? Is there a good library to try them all easily, or did you have to test them one by one? Thank you!