Closed — katopz closed this issue 6 months ago.
Hi @katopz, I would like to know how to reproduce this: does saying hi once increase memory from 5138MB to 14007MB?
I said hi 3 times, but GitHub does not allow me to attach the huge log, so I captured only the latest one.
Got it. I will try to reproduce it and get back to you.
I can reproduce this issue. Now I am digging into the model-loading phase to check whether anything there is causing it.
Hi @katopz, I rebuilt the asset with a patch. Could you please install it again and check whether memory still keeps growing?
Here is what I get on an A10G. Memory grows the first time to load the model, and there is a small additional growth as the context changes.
Working now. Thanks! 👍
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 525.147.05 Driver Version: 546.01 CUDA Version: 12.3 |
|-------------------------------+----------------------+----------------------+
| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |
| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |
| | | MIG M. |
|===============================+======================+======================|
| 0 NVIDIA GeForce ... On | 00000000:01:00.0 Off | Off |
| 0% 46C P2 96W / 450W | 5760MiB / 24564MiB | 0% Default |
| | | N/A |
+-------------------------------+----------------------+----------------------+
+-----------------------------------------------------------------------------+
| Processes: |
| GPU GI CI PID Type Process name GPU Memory |
| ID ID Usage |
|=============================================================================|
| 0 N/A N/A 579 G /Xwayland N/A |
| 0 N/A N/A 9339 C /wasmedge N/A |
+-----------------------------------------------------------------------------+
Say "hi" and GPU memory stacks up and never decreases until exit.

Command: nvidia-smi
Output: see the table above.

Only me?
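For anyone trying to reproduce the report above, a minimal sketch for watching used GPU memory between prompts. It assumes a standard nvidia-smi install; the polling loop and the `parse_used_mib` helper are illustrative additions, not part of this thread, though the `--query-gpu=memory.used --format=csv,noheader,nounits` flags are real nvidia-smi options.

```python
import subprocess
import time


def parse_used_mib(csv_output: str) -> list[int]:
    """Parse the output of
    `nvidia-smi --query-gpu=memory.used --format=csv,noheader,nounits`
    into a list of used-memory values in MiB (one entry per GPU)."""
    return [int(line.strip()) for line in csv_output.splitlines() if line.strip()]


def sample_used_mib() -> list[int]:
    """Run nvidia-smi once and return used memory per GPU in MiB."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_used_mib(out)


if __name__ == "__main__":
    # Print per-GPU usage and the delta from the previous sample each second,
    # so one-time model-load growth can be told apart from a per-request leak.
    prev = sample_used_mib()
    while True:
        time.sleep(1)
        cur = sample_used_mib()
        deltas = [c - p for c, p in zip(cur, prev)]
        print(f"used={cur} MiB  delta={deltas} MiB")
        prev = cur
```

If the delta stays near zero after each "hi" once the model is loaded, the patch is working; a steadily positive delta per prompt points back at the leak.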