bnusss / GGN

Gumbel Graph Network (GGN) : A General Deep Learning Framework for Network Reconstruction

Trying to reproduce your results #4

Open julianzimmerlin opened 4 years ago

julianzimmerlin commented 4 years ago

Hello, I am trying to reproduce your results for the Boolean Network from the paper. However, I am currently unable to get the same results as you. I have a couple of questions:

  1. In your paper you mention that the training parameters S_n and S_d for this experiment are 20 and 10, respectively. However, the default configuration in the code is S_n=10 and S_d=20 (the numbers are swapped). Which values did you actually use for this experiment?
  2. For how many epochs did you train your network?
  3. Is it intended that the BN dataset generator produces datasets of different size in repeated runs?

Thanks in advance.

3riccc commented 4 years ago

Hi, thank you for trying to reproduce our results. I'm Zhang, the first author of the paper.

  1. By 'generator', do you mean the data generator that produces the dataset, or the network generator used during training? In the former case, across repeated runs the samples in the BN dataset share the same network structure; only the initial conditions differ. If you mean the latter, then in every epoch the network generator produces an adjacency matrix whose size is fixed at [node num, node num].
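
To illustrate the second case: the network generator in GGN samples the adjacency matrix with the Gumbel-softmax trick, so its shape is always [node num, node num] regardless of the data. The sketch below is a minimal, hypothetical NumPy version (the function name, the use of two logits per entry, and the temperature value are my assumptions, not the repo's actual code):

```python
import numpy as np

def sample_gumbel_adjacency(logits, tau=1.0, rng=None):
    """Sample a soft adjacency matrix of shape [n, n] from per-edge
    logits of shape [n, n, 2] via the Gumbel-softmax trick.
    The last axis holds the (no-edge, edge) logits.
    Hypothetical sketch; the repo's generator may differ."""
    rng = np.random.default_rng() if rng is None else rng
    # Gumbel(0, 1) noise: -log(-log(U)), U ~ Uniform(0, 1)
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + g) / tau
    # softmax over the two categories (numerically stable)
    y = np.exp(y - y.max(axis=-1, keepdims=True))
    y = y / y.sum(axis=-1, keepdims=True)
    # keep the "edge present" weight; shape is always [n, n]
    return y[..., 1]

n = 10
adj = sample_gumbel_adjacency(np.zeros((n, n, 2)), tau=0.5)
assert adj.shape == (n, n)
```

Whatever the input data looks like, only the number of nodes determines the sampled matrix's shape.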

As for questions 1 and 2, I will check the code and reply soon.

Thank you for your attention. Feel free to raise any other issues ^_^

julianzimmerlin commented 4 years ago

Thank you for the quick reply. I meant the data generator to generate the dataset ('data_generator_bn.py'). To be more precise, I am confused about the number of samples in the dataset. The paper says for the BN network experiment:

The training data we generated contains 5k pairs of state transition sequences. Meanwhile, we simulated 1k validation set and 1k test set.

But when I run the 'data_generator_bn.py' script for 10 nodes, I don't get a dataset with 7k pairs; instead the size seems random. Sometimes the resulting tensor has shape (2048x10x1), sometimes (4544x10x1), sometimes other values in the first dimension. From the paper I would have expected the shape (14000x10x1).
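
For reference, a generator that produces exactly the shape I expect would look something like this. This is only a hypothetical sketch: the wiring (3 random inputs per node) and the majority-vote update rule are placeholders of my own, not the actual dynamics in 'data_generator_bn.py'; the point is only that requesting `num_pairs` transition pairs yields a deterministic tensor shape of (2*num_pairs, num_nodes, 1):

```python
import numpy as np

def generate_bn_pairs(num_pairs, num_nodes, rng=None):
    """Generate exactly `num_pairs` (state, next_state) transition pairs
    for a random Boolean network. Hypothetical update rule: each node's
    next state is the majority vote of 3 fixed random inputs."""
    rng = np.random.default_rng() if rng is None else rng
    # fixed random wiring: each node listens to 3 random input nodes
    inputs = rng.integers(0, num_nodes, size=(num_nodes, 3))
    # random initial states, one row per requested pair
    states = rng.integers(0, 2, size=(num_pairs, num_nodes))
    # next state = majority vote over each node's 3 inputs
    next_states = (states[:, inputs].sum(axis=2) >= 2).astype(int)
    # stack states and successors: shape (2*num_pairs, num_nodes, 1)
    return np.concatenate([states, next_states], axis=0)[..., None]

data = generate_bn_pairs(7000, 10)
assert data.shape == (14000, 10, 1)
```

With 5k training + 1k validation + 1k test pairs as described in the paper, this convention gives the (14000x10x1) tensor I was expecting.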