eugenelet opened this issue 3 years ago
Hi @eugenelet, to improve reproducibility, I have uploaded an improved version of PhyDNet with separate encoders and decoders (more details in the paper at the CVPR OmniCV workshop 2020: https://openaccess.thecvf.com/content_CVPRW_2020/papers/w38/Le_Guen_A_Deep_Physical_Model_for_Solar_Irradiance_Forecasting_With_Fisheye_CVPRW_2020_paper.pdf). I have also uploaded the pretrained model, which attains MSE = 24.19 (better than in the CVPR paper). In particular, we found that the batch size has a crucial impact on performance; we fixed it at 16 for this model. Best, Vincent
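For anyone sanity-checking the released checkpoint, here is a minimal evaluation sketch in PyTorch. The names build_phydnet_model, pretrained_phydnet.pth and test_loader are placeholders, not the actual identifiers in the repository; substitute the constructor, checkpoint file and Moving MNIST loader that main.py builds, and match the repo's own MSE normalization when comparing numbers.

```python
import torch

# Placeholders: swap in the real model constructor, checkpoint filename
# and test DataLoader from the PhyDNet repository.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = build_phydnet_model().to(device)
model.load_state_dict(torch.load("pretrained_phydnet.pth", map_location=device))
model.eval()

mse_sum, frame_count = 0.0, 0
with torch.no_grad():
    for inputs, targets in test_loader:
        inputs, targets = inputs.to(device), targets.to(device)
        preds = model(inputs)  # predicted future frames, same shape as targets
        # Accumulate the sum of squared errors and the number of predicted
        # frames (batch x time); the exact normalization should follow the
        # evaluation code in the repo.
        mse_sum += ((preds - targets) ** 2).sum().item()
        frame_count += targets.shape[0] * targets.shape[1]

print("MSE per frame:", mse_sum / frame_count)
```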
Hi @vincent-leguen, thanks for releasing the pretrained model and the updated configuration of the code. I'll re-run the code from scratch to validate the reported performance. This is an interesting field to contribute to.
Eugene
Hi @vincent-leguen, for the recently updated code, do I run it using the default configs, i.e. python3 main.py? At epoch 1000, I obtained an MSE of 38.62, which is still off from the reported results.
I find their model is sensitive to batch size. You should make sure your batch size is 16. I think the GroupNorm may cause it. When I set the batch size to 16, it works well.
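In case it helps, a minimal sketch of pinning the batch size to 16 when building the training loader. MovingMNIST and the data path here are placeholders; in the actual repo the batch size may instead be an argparse flag or a constant in main.py, in which case set that to 16 rather than editing the loader.

```python
from torch.utils.data import DataLoader

BATCH_SIZE = 16  # the value the authors report using for the released model

# MovingMNIST stands in for whatever dataset class main.py constructs.
# drop_last=True keeps every step at exactly 16 samples, so the batches the
# normalization layers see are consistent throughout training.
train_loader = DataLoader(
    MovingMNIST(root="data/", train=True),
    batch_size=BATCH_SIZE,
    shuffle=True,
    drop_last=True,
)
```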
Hi Authors,
I ran the code using the default configuration on Moving MNIST by directly executing python3 main.py. The final MSE obtained after 1000 epochs is around 75.26, which is far higher than the 24.4 reported in the paper. Is there anything that I'm missing here? Thanks!
Eugene