LunaBlack / KGAT-pytorch


About unpretrained training #6

Open yankai-chen opened 4 years ago

yankai-chen commented 4 years ago

Thanks for your implementation. I have run your code in both the pretrained and unpretrained modes, and I have a question: have you tried the unpretrained mode? Its results seem extremely bad. The pretrained version is fine, and the results of the early epochs are close to those of the authors' TF version, but the unpretrained one is far worse than the unpretrained TF version.

On my machine, training stopped early at epoch 20, with the following evaluation metrics:

2020-08-26 16:44:12,377 - root - INFO - CF Evaluation: Epoch 0020 | Total Time 148.6s | Precision-k 0.0002970381 Recall-k 0.0003790736 NDCG-k 0.0003935562

Could you suggest any reasons for this? Could it mainly be due to the early-stopping logic? Any suggestions would be much appreciated, and I look forward to your reply. Thank you!
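For context, here is a minimal sketch of the kind of patience-based early-stopping check I have in mind (hypothetical names, not the repo's actual code): if training from scratch simply needs more epochs to warm up, raising the patience (the `--stopping_steps` argument, if that is what the repo uses) would delay the stop and might let the unpretrained run recover.

```python
# Hypothetical sketch of a patience-based early stop on Recall@k.
# `should_stop` and `patience` are illustrative names, not from the repo.

def should_stop(recall_history, patience=10):
    """Return True when Recall@k has not improved for `patience` evaluations."""
    if len(recall_history) <= patience:
        return False
    best_recent = max(recall_history[-patience:])
    best_before = max(recall_history[:-patience])
    return best_recent <= best_before


# Usage: append the metric after each evaluation and check the flag.
history = []
for epoch_recall in [0.010, 0.012, 0.011, 0.011, 0.010]:
    history.append(epoch_recall)
    if should_stop(history, patience=3):
        print(f"early stop after {len(history)} evaluations")
        break
```

With a small patience, a run that starts from random (unpretrained) embeddings could be cut off while the metrics are still near zero, which would match the numbers I posted above.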