jiaweizzhao / InRank

MIT License

Somewhat Higher PPL Running on 4*A10 #1

Open Frederick666666 opened 9 months ago

Frederick666666 commented 9 months ago

Hi, thank you for your work. When running the wiki103 GPT-2 baseline and the corresponding InRank pretraining experiments on 4*A10 GPUs, the final evaluation perplexity (PPL) is somewhat higher than reported. Could the machine we run on affect the results?

jiaweizzhao commented 9 months ago

It's possible that the underlying machine and precision affect the results. Could you share the details of your setup and scripts with us? We would be happy to help further.
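When sharing setup details for a reproducibility question like this, a small helper that records the environment can save a round trip. The sketch below is hypothetical (not part of the InRank repo) and uses only the Python standard library; `collect_env_info` and its output format are illustrative assumptions.

```python
import platform
import subprocess
import sys

def collect_env_info():
    """Gather basic environment details useful for debugging
    perplexity differences across machines (hypothetical helper,
    not part of the InRank repo)."""
    info = {
        "python": sys.version.split()[0],
        "platform": platform.platform(),
    }
    # GPU details, if nvidia-smi is available on this machine.
    try:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=name,driver_version",
             "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        )
        info["gpus"] = out.stdout.strip().splitlines()
    except (FileNotFoundError, subprocess.CalledProcessError):
        info["gpus"] = []  # no NVIDIA driver found
    return info

if __name__ == "__main__":
    for key, value in collect_env_info().items():
        print(f"{key}: {value}")
```

Attaching this output alongside the training script and the precision setting used (e.g. fp32 vs. mixed precision) would make it easier to pin down whether hardware or numerics explain the PPL gap.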