Closed — 0x10C closed this issue 1 year ago
Hi, the \Omega(f) term is defined in traditional hypergraph learning. In our HGNN, we remove \Omega(f) from the loss function; it is instead transformed into the hypergraph convolution.
More details of the hypergraph neural network can be found in https://github.com/iMoonLab/DeepHypergraph/ .
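To make the reply concrete: the HGNN paper's convolution is X' = σ(Dv^{-1/2} H W De^{-1} Hᵀ Dv^{-1/2} X Θ), where H is the incidence matrix, W the hyperedge weights, and Dv, De the vertex and hyperedge degree matrices. The smoothing that \Omega(f) enforces is baked into this propagation, so the training objective keeps only the empirical (supervised) loss. A minimal NumPy sketch of one such layer; the incidence matrix, weights, and feature sizes below are toy values for illustration, not from the repository:

```python
import numpy as np

rng = np.random.default_rng(0)

def hgnn_conv(X, H, w, Theta):
    """One HGNN convolution step; the structure-aware propagation here
    plays the role of the explicit Omega(f) regularizer."""
    Dv = H @ w                    # vertex degrees d(v) = sum_e w(e) h(v, e)
    De = H.sum(axis=0)            # hyperedge degrees delta(e) = sum_v h(v, e)
    Dv_inv_sqrt = np.diag(Dv ** -0.5)
    # A = Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}
    A = Dv_inv_sqrt @ H @ np.diag(w) @ np.diag(1.0 / De) @ H.T @ Dv_inv_sqrt
    return np.maximum(A @ X @ Theta, 0.0)   # ReLU activation

# Toy example: 4 vertices, 2 hyperedges (values made up for illustration).
H = np.array([[1., 0.], [1., 1.], [0., 1.], [1., 0.]])  # incidence matrix
w = np.array([1.0, 1.0])                                 # hyperedge weights
X = rng.standard_normal((4, 3))                          # node features
Theta = rng.standard_normal((3, 2))                      # learnable weights
Xp = hgnn_conv(X, H, w, Theta)                           # output, shape (4, 2)
```

A cross-entropy loss on the labeled nodes' outputs is then the only term in the objective, which matches what you saw in the implementation.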
Got it, thanks!
As written in the paper, \Omega(f) is a regularizer on the hypergraph. I reviewed your implementation code and found only the empirical loss in the objective function. Is it necessary to add this regularizer to the loss function? I'd appreciate it if you could answer ASAP, thanks!