This is our implementation of ENMF: Efficient Neural Matrix Factorization (TOIS, vol. 38, 2020). It also provides a fair evaluation of existing state-of-the-art recommendation models.
MIT License
The nDCG formula may not be consistent between your paper and implementation? #7
Hello, I read your paper (TOIS) and your implementation.
I have a question about your implementation of the nDCG formula.
In your paper, you write the nDCG formula using log2 (logarithmic base 2), but in your implementation I think you use np.log (natural logarithm) — see, e.g., L209 or L303 in code/ENMF.py.
Should you use np.log2 in your implementation for consistency?
Anyway, thank you for putting together a good paper and implementation!
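To make the question concrete, here is a minimal sketch (with hypothetical relevance labels, not the repo's actual evaluation code) comparing DCG/nDCG under the two log bases. Since log2(x) = ln(x)/ln(2), the two DCG values differ only by the constant factor ln(2), which cancels in the DCG/IDCG ratio:

```python
import numpy as np

# Hypothetical binary relevance for a top-5 ranked list (1 = hit, 0 = miss).
rel = np.array([1, 0, 1, 0, 0], dtype=float)
ranks = np.arange(1, len(rel) + 1)  # 1-indexed positions

def dcg(rel, ranks, log=np.log2):
    # DCG@k with a configurable log function: sum of rel_i / log(i + 1).
    return np.sum(rel / log(ranks + 1))

def ndcg(rel, ranks, log=np.log2):
    # nDCG@k = DCG of the given ranking / DCG of the ideal ranking.
    ideal = np.sort(rel)[::-1]
    return dcg(rel, ranks, log) / dcg(ideal, ranks, log)

# Raw DCG differs between the bases by the constant factor ln(2)...
print(dcg(rel, ranks, np.log2), dcg(rel, ranks, np.log))
# ...but the factor cancels in the ratio, so nDCG is identical.
print(ndcg(rel, ranks, np.log2), ndcg(rel, ranks, np.log))
```

So if the code computes both the DCG and IDCG terms with np.log, the reported nDCG matches the paper's log2 formula; the inconsistency would only bite if one term used a different base than the other, or if raw DCG values were reported.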