YuxiangRen / Heterogeneous-Deep-Graph-Infomax

HDGI code

Dropout and ratio #5

Closed liun-online closed 4 years ago

liun-online commented 4 years ago

Hi, thanks for your work! While trying your code, I couldn't find anywhere the dropout operation is actually used. So why did you set the drop_prob parameter? Also, in your paper you report two training ratios (20% and 80%), but you only show the 20% case. For the 80% case, how should I split the dataset — 10% for validation and 10% for test? I hope for your response! Thank you very much!

YuxiangRen commented 4 years ago

The test and validation ratios are fixed at 10% each; only the training ratio changes between 20% and 80%.
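A minimal sketch of that split, assuming shuffled node indices and a hypothetical helper name (`split_indices` is not from the HDGI code): train takes 20% or 80%, while validation and test are each fixed at 10%.

```python
import numpy as np

def split_indices(num_nodes, train_ratio, seed=0):
    """Split node indices into train / 10% val / 10% test.

    `train_ratio` is e.g. 0.2 or 0.8; validation and test
    stay fixed at 10% each, as described in the answer above.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(num_nodes)
    n_train = int(train_ratio * num_nodes)
    n_val = int(0.1 * num_nodes)
    train = idx[:n_train]
    val = idx[n_train:n_train + n_val]
    test = idx[n_train + n_val:n_train + 2 * n_val]
    return train, val, test

# e.g. 80% training: 80 train, 10 val, 10 test out of 100 nodes
train, val, test = split_indices(100, 0.8)
```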

In our initial version, we used dropout in the attention layer. But later in the experiments we found that the effect of dropout was not significant, so we commented out the dropout operation. However, we forgot to delete the drop_prob parameter in execute.py; it is actually not used.
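For illustration, here is a hedged sketch of what that looks like: a softmax over attention scores with the dropout step left commented out, while a `drop_prob` argument is still accepted but unused. The function name and shapes are hypothetical, not taken from the HDGI code.

```python
import numpy as np

def attention_weights(scores, drop_prob=0.5, training=False):
    """Normalize neighbor attention scores with a softmax.

    `drop_prob` is accepted but no longer used, mirroring the
    leftover parameter in execute.py described above.
    """
    # Numerically stable softmax over the last axis
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    alpha = e / e.sum(axis=-1, keepdims=True)
    # Dropout on attention weights, disabled because its effect
    # was not obvious in the experiments:
    # if training:
    #     mask = np.random.rand(*alpha.shape) >= drop_prob
    #     alpha = alpha * mask / (1.0 - drop_prob)
    return alpha
```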

If you have other questions, just let me know.

liun-online commented 4 years ago

Thank you for the timely response!