Closed leoamb closed 2 years ago
Hi! Sorry for that. It seems the hyper-parameters provided in our script are not correct. You can try the following hyper-parameters to reproduce the result:
python3 train.py \
--task lp \
--dataset airport \
--model HyboNet \
--lr 0.05 \
--dim 16 \
--num-layers 2 \
--bias 1 \
--dropout 0 \
--weight-decay 1e-3 \
--manifold Lorentz \
--log-freq 5 \
--cuda 0 \
--patience 500 \
--grad-clip 0.1 \
--seed 1234
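As a side note on reproducibility: results on a small dataset like airport can vary run to run unless every RNG source is seeded. A minimal sketch of what a seeding helper might look like (the function name is hypothetical, and the numpy/torch lines in the comments are assumptions about what train.py does with --seed, not code from this repo):

```python
import random

def set_seed(seed: int) -> None:
    # Seed Python's built-in RNG. A real training script would also seed
    # numpy (np.random.seed(seed)) and torch (torch.manual_seed(seed),
    # torch.cuda.manual_seed_all(seed)) for full reproducibility.
    random.seed(seed)

set_seed(1234)
first_run = [random.random() for _ in range(3)]
set_seed(1234)
second_run = [random.random() for _ in range(3)]
# Re-seeding with the same value reproduces the same sequence.
assert first_run == second_run
```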
The final output on my own machine (V100, PyTorch=1.11.0, CUDA=11.3, geoopt=0.4.1) is:
INFO:root:Val set results: val_loss: 2.2162 val_roc: 0.9738 val_ap: 0.9582
INFO:root:Test set results: test_loss: 11.5684 test_roc: 0.9736 test_ap: 0.9634
I've also uploaded the training log (with --log-freq 1) at https://drive.google.com/file/d/1YoVKXjScFnMyowvI_pIjMR1eEbNaUq9_/view?usp=sharing
Closing this issue as there has been no further update.
Hi,
Thank you for your great work. I tried to reproduce the results on the airport lp dataset, and I get 96.28 on the test set, not 97.3 as reported in the paper. I used the same configuration parameters you mentioned here in this repository.