Closed: rgtjf closed this issue 6 years ago.
Thanks for noticing this! There was indeed an issue here. I proposed a fix in ea305ea875639ab6b191aad4337f5f86e1465fe8 . Feel free to re-open the issue if you still have problems with this evaluation.
Great! Thank you
Thanks for this wonderful project!
I found that I cannot evaluate on the cross-lingual word similarity task (i.e., the SEMEVAL17 task).
In `get_evaluation.sh`, the eval data are saved as `crosslingual/wordsim/$lg_pair-SEMEVAL17.txt`:
https://github.com/facebookresearch/MUSE/blob/26e3e4022102a95ec0f94e89f91089d5d3131fca/data/get_evaluation.sh#L93

In `src/evaluation/wordsim.py`, the eval file is expected to look like `$lg_pair/SEMEVAL17.txt`:
https://github.com/facebookresearch/MUSE/blob/26e3e4022102a95ec0f94e89f91089d5d3131fca/src/evaluation/wordsim.py#L204-L218