Closed: saharghannay closed this issue 3 years ago
You can fix the random seeds for python, numpy, torch, and cuda to reduce the variance across different runs.
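For reference, a minimal sketch of what fixing the seeds might look like (not part of the original thread; the `set_seed` helper name and the seed value are illustrative, and the cuDNN flags are an optional extra for stricter determinism):

```python
import random

import numpy as np
import torch


def set_seed(seed: int = 42) -> None:
    """Fix the random seeds for python, numpy, torch, and cuda."""
    random.seed(seed)                  # python's built-in RNG
    np.random.seed(seed)               # numpy RNG
    torch.manual_seed(seed)            # torch CPU RNG
    torch.cuda.manual_seed_all(seed)   # torch CUDA RNGs (all devices)
    # Optional: make cuDNN deterministic, usually at some cost in speed.
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False


set_seed(42)  # call once before building the model and data loaders
```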
Ok, thank you.
Best, Sahar Ghannay
I am training a NER system, but the results change from one run to another with the same configuration. Do you have an idea of how to fix this variability, please?