iMoonLab / HGNN

Hypergraph Neural Networks (AAAI 2019)
MIT License

As regards the regularizer on hypergraph #17

Closed. 0x10C closed this issue 1 year ago.

0x10C commented 1 year ago

As written in the paper, \Omega(f) is a regularizer on the hypergraph. I reviewed your implementation code and only found the empirical loss in your objective function. Is it necessary to add this regularizer to the loss function? I'd appreciate it if you could answer ASAP, thanks!

yifanfeng97 commented 1 year ago

Hi, the \Omega(f) term is defined in traditional hypergraph learning. In our HGNN, we remove \Omega(f) from the loss function; it is instead transformed into the hypergraph convolution.
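For readers landing here: below is a minimal sketch (the class name `HGNNConvSketch` is hypothetical, not necessarily the repository's exact layer) of how the precomputed propagation matrix G = D_v^{-1/2} H W D_e^{-1} H^T D_v^{-1/2} takes the place of the explicit Laplacian regularizer \Omega(f) = f^T (I - G) f, so that training only needs the empirical (e.g. cross-entropy) loss.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HGNNConvSketch(nn.Module):
    """Sketch of a hypergraph convolution layer.

    Assumes G is the precomputed propagation matrix
    G = D_v^{-1/2} H W D_e^{-1} H^T D_v^{-1/2},
    i.e. the operator that absorbs the explicit regularizer
    \Omega(f) = f^T (I - G) f from traditional hypergraph learning.
    """

    def __init__(self, in_ft: int, out_ft: int):
        super().__init__()
        self.theta = nn.Linear(in_ft, out_ft)

    def forward(self, x: torch.Tensor, G: torch.Tensor) -> torch.Tensor:
        # X^{(l+1)} = sigma(G X^{(l)} Theta^{(l)}): smoothing over the
        # hypergraph happens inside the layer, so the loss is purely empirical.
        return F.relu(G.matmul(self.theta(x)))
```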

yifanfeng97 commented 1 year ago

More details on hypergraph neural networks can be found at https://github.com/iMoonLab/DeepHypergraph/ .

0x10C commented 1 year ago

Got it. thx