ceccocats / tkDNN

Deep neural network library and toolkit to do high performance inference on NVIDIA Jetson platforms
GNU General Public License v2.0

How to release the memory when my program runs for a long time? #135

Closed sjtuzbk closed 3 years ago

sjtuzbk commented 3 years ago

I noticed that the DetectionNN class has some std::vector member variables:

std::vector<tk::dnn::box> detected; /*bounding boxes in output*/
std::vector<std::vector<tk::dnn::box>> batchDetected; /*bounding boxes in output*/
std::vector<double> stats; /*keeps track of inference times (ms)*/
std::vector<std::string> classesNames;

Therefore, as the program runs, the memory occupied by these vectors will keep growing. If the program runs for a long time, it will eventually run out of memory. How can I release that memory?
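For reference, the usual C++ way to actually give a vector's heap memory back is to clear it and then shrink it (a minimal, self-contained sketch, not tkDNN code):

```cpp
#include <vector>

int main() {
    std::vector<double> stats(1'000'000, 0.0);

    // clear() removes the elements but normally keeps the capacity allocated.
    stats.clear();
    stats.shrink_to_fit();            // non-binding request to release the capacity

    // Pre-C++11 idiom: swap with an empty temporary to force deallocation.
    std::vector<double>().swap(stats);
    return 0;
}
```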

Looking forward to your reply!

mive93 commented 3 years ago

Hi @sjtuzbk, detected and batchDetected are cleared on every new inference, and classesNames never changes. The only one that keeps growing is stats, which stores one double per inference to track performance. You are right that it should be cleared from time to time.
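A minimal sketch of how a caller could keep stats bounded, clearing it periodically from the application's own loop. The detector, grabFrame(), and the flush threshold are placeholders (tkDNN/OpenCV includes omitted); stats and update() are the DetectionNN members discussed above:

```cpp
#include <iostream>
#include <numeric>
#include <vector>

// Hypothetical caller-side loop: flush detNN.stats every N inferences so it
// never grows without bound.
void runDetectionLoop(tk::dnn::DetectionNN &detNN) {
    cv::Mat frame;
    while (grabFrame(frame)) {                      // placeholder frame source
        std::vector<cv::Mat> batch{frame};
        detNN.update(batch);                        // appends one timing to stats

        if (detNN.stats.size() >= 10000) {          // flush periodically
            double mean = std::accumulate(detNN.stats.begin(),
                                          detNN.stats.end(), 0.0)
                          / detNN.stats.size();
            std::cout << "mean inference time (ms): " << mean << std::endl;
            detNN.stats.clear();                    // drop the accumulated timings
        }
    }
}
```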

mive93 commented 3 years ago

Closing for now, feel free to reopen.