HFAiLab / OpenCastKit

The open-source solutions of FourCastNet and GraphCast
MIT License

Insufficient memory during inference #17

Closed. Jeffrey-JDong closed this issue 1 year ago.

Jeffrey-JDong commented 1 year ago

May I ask if you have encountered this situation: during inference, as the number of prediction steps increases, memory usage keeps growing, eventually resulting in an out-of-memory error.
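
A minimal sketch of an autoregressive rollout that avoids the two most common causes of step-by-step memory growth in PyTorch inference (the models in this repo are PyTorch-based). The function name, step count, and shapes below are illustrative assumptions, not code from OpenCastKit.

```python
import torch

@torch.no_grad()  # without this, the autograd graph of every step is retained and memory grows
def rollout(model, x0, num_steps):
    """Run `num_steps` autoregressive prediction steps from initial state x0 (hypothetical helper)."""
    model.eval()
    x = x0
    preds = []
    for _ in range(num_steps):
        x = model(x)                     # one prediction step, fed back as the next input
        preds.append(x.detach().cpu())   # move each step's output off the GPU so results don't accumulate there
    return torch.stack(preds)
```

In general, the usual culprits are (1) looping without `torch.no_grad()` (or `model.eval()`), so the computation graph of every step stays alive, and (2) appending GPU tensors to a Python list, which keeps all intermediate outputs in device memory; detaching and moving each step's result to the CPU keeps per-step memory roughly constant.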