yasenh / libtorch-yolov5

A LibTorch inference implementation of YOLOv5
MIT License
372 stars 114 forks

Memory sometimes keeps growing indefinitely #28

Closed leoxxxxxD closed 3 years ago

leoxxxxxD commented 3 years ago

A memory leak may occur on any given run? The symptom is that memory usage keeps climbing, but the forward pass never starts.

18242360613 commented 3 years ago

I ran into the same problem; it happens when loading the model.

yasenh commented 3 years ago

@xw0629 @18242360613 thanks for the feedback! But I cannot replicate the issue locally, could you provide your testing environment (OS, CUDA version, libtorch version)?

liubamboo commented 3 years ago

I have the same problem. My environment is Ubuntu 18.04, CUDA 10.2, LibTorch 1.8.0.

The bug is triggered by module_ = torch::jit::load(model_path); but I don't know how to solve it.
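For reference, the canonical LibTorch pattern for loading a TorchScript module with error handling looks roughly like the sketch below (it assumes LibTorch is installed and linked; the model path is a placeholder, not the repo's actual default):

```cpp
// Minimal TorchScript loading sketch. Requires LibTorch (torch/script.h).
#include <torch/script.h>
#include <iostream>
#include <string>

int main() {
    // Hardcoded path (placeholder) instead of reading it through cxxopts,
    // to rule the argument parser out as a factor.
    const std::string model_path = "yolov5s.torchscript.pt";

    torch::jit::script::Module module;
    try {
        // torch::jit::load throws c10::Error on failure
        // instead of returning an invalid module.
        module = torch::jit::load(model_path);
        module.eval();
    } catch (const c10::Error& e) {
        std::cerr << "Failed to load the model: " << e.what() << std::endl;
        return -1;
    }
    std::cout << "Model loaded" << std::endl;
    return 0;
}
```

If memory still grows with the path hardcoded like this, the parser is unlikely to be the only cause.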

liubamboo commented 3 years ago

Environment: Ubuntu 18.04, LibTorch 1.8.1, cxxopts

Hello, when I run the code from https://github.com/yasenh/libtorch-yolov5.git, the program's memory usage keeps increasing. After debugging, I found the key issue: cxxopts conflicts with LibTorch.

When I use cxxopts in my main.cpp and load the TorchScript model via torch::jit::load, the bug sometimes appears: memory usage grows quickly until the program is killed by the operating system, ending with a message like "Process finished with exit code 9".

Then I stopped using cxxopts and switched to gflags, and the bug disappeared.

So I think there is some conflict between LibTorch and cxxopts.

monoloxo commented 3 years ago

@liubamboo Hello, I ran into the same problem and also traced it to the model-loading step, where "Killed" tends to appear. Following your approach, I changed std::string weights = opt["weights"].as<std::string>(); to std::string weights = "../weights/best.torchscript-gpu-half.pt"; but the error still occasionally occurs. Do I have to replace all the code that uses cxxopts?

liubamboo commented 3 years ago

@monoloxo I replaced cxxopts with glog and the problem went away. The replacement is fairly simple: install glog and change the argument-parsing section at the start of main. I haven't found the root cause yet.

monoloxo commented 3 years ago

@liubamboo Thanks, I'll give it a try.

monoloxo commented 3 years ago

@liubamboo Sorry to bother you. After installing it I tried writing the change, but ran into a lot of errors; it turns out many things need to be modified (I just started learning C++, so much of this is new to me). If it's convenient, could you share the approach you used?

monoloxo commented 3 years ago

Thanks @liubamboo, it was indeed a cxxopts problem. I changed the relevant parameters to be supplied directly instead, and the problem disappeared.

yasenh commented 3 years ago

@liubamboo @monoloxo feel free to create a PR to fix it.