amazon-science / earth-forecasting-transformer

Official implementation of Earthformer
Apache License 2.0

reproduce the results on the ICAR-ENSO dataset #5

Closed tianzhou2011 closed 1 year ago

tianzhou2011 commented 1 year ago

Dear authors, thank you for the code release; it's very informative. However, I wonder whether I need to change some arguments to reproduce the ENSO results. I ran train_cuboid_enso.py with the provided cfg.yaml, and it ended up with NiNO_M: 65.7, WM: 1.933, and MSE: 0.00272, far from the 0.7239, 2.214, and 0.00255 reported in the paper. I also ran a cfg.yaml with 8 global vectors, which gave even worse results than the original setting.

gaozhihan commented 1 year ago

Thanks for pointing this out. We have verified that training with the current default cfg.yaml indeed yields suboptimal performance. We reproduced our results (C-Nino3.4-M $= 0.7322$, C-Nino3.4-WM $= 2.263$, MSE $= 2.596\times10^{-4}$) simply by changing the batch-size configs (total_batch_size: 64 with micro_batch_size: 16, or total_batch_size: 64 with micro_batch_size: 8). In short, you should be able to reproduce the results reported in our paper by setting total_batch_size: 64 in cfg.yaml. Could you please try again with total_batch_size: 64 and see whether it works correctly? Feel free to contact us if you have any further questions. We will also soon submit PRs that update the training scripts, test cases, and inference examples with the pretrained model.
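For reference, a minimal sketch of what the change might look like in cfg.yaml. Only the two keys and their values come from the reply above; the enclosing `optim` section name is an assumption about where the batch-size settings live in this repo's config layout:

```yaml
# Hypothetical excerpt of cfg.yaml; only total_batch_size and
# micro_batch_size (and their values) are confirmed in the reply above.
optim:
  total_batch_size: 64   # effective batch size seen by the optimizer
  micro_batch_size: 16   # per-step batch size; 8 is reported to work as well
```

Presumably the trainer accumulates gradients over total_batch_size / micro_batch_size steps (4 here, or 8 with micro_batch_size: 8), so a smaller micro_batch_size trades GPU memory for more accumulation steps while keeping the effective batch size at 64.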

tianzhou2011 commented 1 year ago

Thanks for the fix.