Xiaoxun-Gong / DeepH-E3


Memory leak during the inference step #3

Closed · choi-geunseok closed this issue 1 year ago

choi-geunseok commented 1 year ago

I'm experiencing a memory leak during the inference step inside `eval()`. The problem only occurs when the device is the GPU, not the CPU: roughly 200 MB of RAM accumulates with each case iteration. When I run inference over several thousand cases, memory builds up until the task is killed, and the issue can't be worked around simply by splitting the job into several smaller runs. The accumulation happens at the line `output, output_edge = net(batch.to(device=eval_config.device))`. What do you think might be the cause?