JunqiAug / DwHGCN


Is this the official implementation of "Dynamic weighted hypergraph convolutional network for brain functional connectome analysis"? #1

Open barulalithb opened 1 year ago

barulalithb commented 1 year ago

Dear Junqi,

Is this code the official implementation of the paper "Dynamic weighted hypergraph convolutional network for brain functional connectome analysis"? Could you please confirm? It seems a little suspicious, since there is no hyperedge implementation in the codebase.

Thanks Lalith

JunqiAug commented 1 year ago

Thank you for raising the concern. In our framework, we optimize the hypergraph by updating the hyperedge weights instead of reconstructing the hyperedges. Hyperedges with higher predictive power are assigned larger weights, while the weights of hyperedges with lower predictive power are shrunk, regulated by the manifold loss.
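As a minimal sketch of this idea (not the code in this repository; `WeightedHypergraphConv`, `H`, and the exact form of the loss are illustrative assumptions), a hypergraph convolution can carry one trainable weight per hyperedge and expose a Laplacian-based manifold loss:

    import torch
    import torch.nn as nn

    class WeightedHypergraphConv(nn.Module):
        """Hypergraph convolution with one learnable weight per hyperedge (illustrative sketch)."""

        def __init__(self, in_dim, out_dim, n_edges):
            super().__init__()
            self.theta = nn.Linear(in_dim, out_dim)
            # one trainable weight per hyperedge, updated by back-propagation
            self.edge_weight = nn.Parameter(torch.ones(n_edges))

        def normalized_adjacency(self, H):
            # A = D_v^{-1/2} H W D_e^{-1} H^T D_v^{-1/2}, with W = diag(edge weights)
            w = torch.relu(self.edge_weight)                   # keep weights non-negative
            dv = (H @ w).clamp(min=1e-6)                       # weighted vertex degrees
            de = H.sum(dim=0).clamp(min=1e-6)                  # hyperedge degrees
            Dv_inv_sqrt = torch.diag(dv.rsqrt())
            return Dv_inv_sqrt @ H @ torch.diag(w / de) @ H.t() @ Dv_inv_sqrt

        def forward(self, X, H):
            A = self.normalized_adjacency(H)
            out = torch.relu(A @ self.theta(X))
            # manifold loss tr(X^T L X) with L = I - A: features should vary little
            # across hyperedges that carry large weights
            L = torch.eye(H.size(0), device=X.device) - A
            manifold_loss = torch.trace(X.t() @ L @ X)
            return out, manifold_loss

In such a setup the manifold loss would be added to the classification loss, so weights on uninformative hyperedges shrink during training.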

The construction of the hypergraph is not the innovative part of our work, so I did not include how to construct the initial hypergraph here. There are multiple approaches to initializing the hypergraph.
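For example, one common initialization (an assumption here, not necessarily the construction used in the paper) builds one hyperedge per brain region from that region and its k nearest neighbours in the connectivity feature space:

    import torch

    def knn_hypergraph(features, k=5):
        """features: (n_nodes, feat_dim); returns an incidence matrix H of shape (n_nodes, n_nodes)."""
        dist = torch.cdist(features, features)            # pairwise Euclidean distances
        knn = dist.topk(k + 1, largest=False).indices     # each node plus its k nearest neighbours
        H = torch.zeros(features.size(0), features.size(0))
        for e, members in enumerate(knn):
            H[members, e] = 1.0                           # column e = hyperedge centred on node e
        return H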

mehular0ra commented 1 year ago

Dear Junqi,

Thanks for clarifying. Could you direct me to the specific section of the code where hyperedge weights are assigned? I'm trying to understand the initialization and weighting process better for reproducibility purposes.

Thanks Mehul

JunqiAug commented 1 year ago
    # learnable weights applied to the hypergraph Laplacian (one entry per hyperedge),
    # defined inside __init__ and registered as a trainable parameter
    self.weight_lap = nn.Parameter(torch.Tensor(config.weight_dim))
    self.reset_parameters()

    def reset_parameters(self):
        # uniform initialization is used; the commented-out lines are alternatives that were tried
        # torch.nn.init.kaiming_uniform_(self.weight_lap)
        torch.nn.init.uniform_(self.weight_lap)
        # torch.nn.init.constant_(self.weight_lap, 1/config.weight_dim)

Sorry for the late response. weight_lap holds the weights applied to the hypergraph Laplacian matrix and is updated during training. I tried different initialization methods for this weight.
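Sketched concretely (again an illustration rather than the repository's exact code; `H` and the normalization are assumptions), a weight vector like `weight_lap` would enter the Laplacian roughly like this at each forward pass:

    import torch

    def weighted_laplacian(H, weight_lap):
        """H: (n_nodes, n_edges) incidence matrix; weight_lap: (n_edges,) trainable weights."""
        w = torch.softmax(weight_lap, dim=0)              # keep the learned weights positive
        dv = (H @ w).clamp(min=1e-6)                      # weighted vertex degrees
        de = H.sum(dim=0).clamp(min=1e-6)                 # hyperedge degrees
        Dv_inv_sqrt = torch.diag(dv.rsqrt())
        theta = Dv_inv_sqrt @ H @ torch.diag(w / de) @ H.t() @ Dv_inv_sqrt
        return torch.eye(H.size(0)) - theta               # L = I - D_v^{-1/2} H W D_e^{-1} H^T D_v^{-1/2}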