clab / dynet

DyNet: The Dynamic Neural Network Toolkit
Apache License 2.0

Has the upgrade in the past two years solved any problem of memory leaks? #1538

Open liu946 opened 5 years ago

liu946 commented 5 years ago

We used DyNet to build a semantic role labeling model consisting of multiple LSTMs. During long-term online deployment and prediction, the model's memory usage continues to increase, so there may be a memory leak somewhere in DyNet.

The code calling DyNet is at https://github.com/HIT-SCIR/ltp/tree/master/src/srl. (The exact location of the leak is unknown; we have checked our code and every `new` is handled properly.)

Our DyNet version is a copy of https://github.com/HIT-SCIR/ltp/tree/master/thirdparty/dynet from two years ago. We have tried dynamic and static memory-checking tools but found nothing.

Has anyone else experienced the same problem? Is there a fix for bugs like this, and would simply updating our DyNet copy solve it?

Looking forward to your reply. Thank you.

neubig commented 5 years ago

Thanks for noting this! It will be hard for us to debug this ourselves, but could you run a memory profiler such as valgrind to identify the leak? If you can find the place that seems to be causing it, we can try to fix it.

cydur commented 4 years ago

I have at least one hint: when compiled with MKL only, memory usage is constant, but when additionally compiled with CUDA and cuDNN there is severe memory leakage.

cydur commented 4 years ago

Do you use CUDNN?