Closed Yzichen closed 1 year ago
Why does the CUDA memory footprint keep changing during training?
Sorry, I haven't encountered this problem before.
> Why does the CUDA memory footprint keep changing during training?

Commenting out the `lru_cache` decorator fixes it; otherwise the server runs out of memory and crashes:

```python
# @functools.lru_cache(maxsize=32)
def create_full_step_id(shape):
```
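For context, here is a minimal sketch of why the cache can inflate memory. The function body below is a hypothetical stand-in for the repo's `create_full_step_id` (assumed here to build a large per-shape index structure): every distinct `shape` seen during training adds another entry, and `lru_cache` keeps up to `maxsize` of them alive at once.

```python
import functools

# Hypothetical stand-in: assume create_full_step_id builds a large
# index structure for a given feature-map shape, as in the repo.
@functools.lru_cache(maxsize=32)
def create_full_step_id(shape):
    h, w = shape
    # Simulate a large cached object (~h*w tuples kept alive by the cache).
    return [[(i, j) for j in range(w)] for i in range(h)]

# Each distinct shape adds a new cache entry; with varying input sizes,
# up to 32 large objects stay resident in memory simultaneously.
for h in range(100, 110):
    create_full_step_id((h, 128))

print(create_full_step_id.cache_info().currsize)  # 10 distinct shapes cached

# Clearing (or removing) the cache releases the memory:
create_full_step_id.cache_clear()
print(create_full_step_id.cache_info().currsize)  # 0
```

If the cached results are actually needed, an alternative to deleting the decorator is calling `create_full_step_id.cache_clear()` periodically, or lowering `maxsize`.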