mate-huaboy opened this issue 1 year ago
Hi, the results mentioned above were obtained on the "correct_symmetry_ids" branch. When I ran the same test on the "master" branch, the outcomes closely matched those reported in your paper; the program produced the following output: TEST ENDING: add_loss:0.0265 add:0.0545 gadd_loss:0.0000 gadd:0.0000 add_auc:93.4679 gadd_auc:81.9997.
However, I have an additional question: would it be possible to publicly release the evaluation code for the metrics mentioned in other papers? That would make it much easier for us to carry out accurate comparisons. Thanks.
We did not compare with metrics other than ADD(S); this benchmark may provide the related code for evaluating the other metrics.
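For readers landing here, below is a minimal sketch of the standard ADD / ADD-S error definitions (mean corresponding-point distance, and mean closest-point distance for symmetric objects). It assumes NumPy/SciPy; the function names, argument layout, and the 10%-of-diameter correctness threshold are illustrative conventions, not this repository's actual evaluation code.

```python
# Sketch of the standard ADD / ADD-S pose-error metrics (not this repository's code).
# model_pts: (N, 3) array of 3D model points; R_*: 3x3 rotations; t_*: translation 3-vectors.
import numpy as np
from scipy.spatial import cKDTree


def add_error(R_est, t_est, R_gt, t_gt, model_pts):
    """ADD: mean distance between corresponding transformed model points."""
    pts_est = model_pts @ R_est.T + t_est
    pts_gt = model_pts @ R_gt.T + t_gt
    return np.linalg.norm(pts_est - pts_gt, axis=1).mean()


def adds_error(R_est, t_est, R_gt, t_gt, model_pts):
    """ADD-S: mean distance to the closest ground-truth point (used for symmetric objects)."""
    pts_est = model_pts @ R_est.T + t_est
    pts_gt = model_pts @ R_gt.T + t_gt
    nn_dists, _ = cKDTree(pts_gt).query(pts_est, k=1)
    return nn_dists.mean()


# A pose is commonly counted as correct when the error is below 10% of the object diameter:
# correct = add_error(R_est, t_est, R_gt, t_gt, model_pts) < 0.1 * object_diameter
```

ADD(S) usually means that ADD is used for non-symmetric objects and ADD-S for symmetric ones.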
Hi, thanks for your great work! I ran into some problems when running your test code. First, I followed your instructions to organize the relevant data. Then, using the model parameters and code you provided, I evaluated on the T-LESS dataset. In the end, the program reported an ADD score of 76.3 and a GADD score of 10.14, which do not seem to correspond to the values given in the paper. I'm not sure where I went wrong. Here is a partial output from the program.
I don't fully understand what add_loss, add, gadd_loss, gadd, add_auc, and gadd_auc represent in the final program output. Could you please explain them in detail? Additionally, your paper also mentions other metrics such as VSD. Does the open-source code include evaluation code for these metrics as well? Thank you very much for your response.
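For context, add_auc / gadd_auc style values are typically the area under the accuracy-vs-threshold curve of the per-sample ADD (or GADD) errors, scaled to 0-100. Below is a hedged sketch of one such computation; the function name, the 0.10 threshold cap, and the trapezoidal integration are assumptions and may not match this repository's exact implementation.

```python
# Sketch of an accuracy-vs-threshold AUC over per-sample pose errors
# (assumptions: NumPy available, errors and max_thresh in the same unit).
import numpy as np


def add_auc(errors, max_thresh=0.10):
    """Area under the accuracy-vs-threshold curve, scaled to [0, 100]."""
    errors = np.sort(np.asarray(errors, dtype=np.float64))
    n = errors.size
    # Fraction of samples whose error is <= each sorted error value.
    accuracy = np.arange(1, n + 1) / n
    # Restrict the curve to thresholds below the cap and close it at max_thresh.
    keep = errors <= max_thresh
    x = np.concatenate(([0.0], errors[keep], [max_thresh]))
    y = np.concatenate(([0.0], accuracy[keep],
                        [accuracy[keep][-1] if keep.any() else 0.0]))
    # Trapezoidal approximation of the step curve; exact implementations may
    # interpolate between points slightly differently.
    return 100.0 * np.trapz(y, x) / max_thresh


# Example: add_auc(per_sample_add_errors) gives a number comparable in scale to add_auc above.
```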