nivangio opened this issue 7 years ago
+1
I'm having the same issue. Any updates?
I managed to solve it by adding some memory-freeing lines to Python's classify and detect functions:
```python
def classify(net, meta, im):
    out = predict_image(net, im)
    res = []
    for i in range(meta.classes):
        res.append((meta.names[i], out[i]))
    res = sorted(res, key=lambda x: -x[1])
    free_image(im)  # added: release the C-side image buffer
    return res
```
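For reference, a minimal usage sketch of the modified classify, assuming net and meta are already loaded and the load_image binding from the same darknet.py is available (the filename is a placeholder):

```python
# Usage sketch (filename is a placeholder): classify() above frees the
# image itself, so do not call free_image(im) again or reuse `im` afterwards.
im = load_image(b"dog.jpg", 0, 0)
res = classify(net, meta, im)
print(res[:5])  # top-5 (label, score) pairs
```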
```python
def detect(net, meta, image, thresh=.5, hier_thresh=.5, nms=.45):
    im = load_image(image, 0, 0)
    boxes = make_boxes(net)
    probs = make_probs(net)
    num = num_boxes(net)
    network_detect(net, im, thresh, hier_thresh, nms, boxes, probs)
    res = []
    for j in range(num):
        for i in range(meta.classes):
            if probs[j][i] > 0:
                res.append((meta.names[i], probs[j][i], (boxes[j].x, boxes[j].y, boxes[j].w, boxes[j].h)))
    res = sorted(res, key=lambda x: -x[1])
    free_image(im)                                  # added: release the loaded image
    lib.free(boxes)                                 # added: release the box array
    free_ptrs(cast(probs, POINTER(c_void_p)), num)  # added: release per-box probability arrays
    return res
```
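For completeness, the cleanup calls above rely on ctypes bindings that the stock darknet.py already declares; if you work from a trimmed-down copy, you need something along these lines (a sketch, reusing the lib handle and IMAGE struct from darknet.py):

```python
# Bindings assumed by the free calls in detect() (as declared in darknet.py);
# `lib` is the loaded libdarknet handle and IMAGE is the ctypes image struct.
free_image = lib.free_image
free_image.argtypes = [IMAGE]

free_ptrs = lib.free_ptrs
free_ptrs.argtypes = [POINTER(c_void_p), c_int]
```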
I solved this issue by modifying the original darknet.py by adding:
```python
free_network = lib.free_network
free_network.argtypes = [c_void_p]
```
and
```python
def freeNetwork(net):
    free_network(net)
    return 0
```
This creates a new interface for releasing the memory used by the network once you have loaded it and finished detection.
Example code will look something like this:

```python
import darknet as dn

dn.set_gpu(0)
net = dn.load_net(bytes(cfg_name, encoding='utf-8'), bytes(weight_name, encoding='utf-8'), 0)
meta = dn.load_meta(bytes(meta_name, encoding='utf-8'))
outs = dn.detect(net, meta, bytes(test_img_name, encoding='utf-8'), thresh, hier_thresh, nms)
dn.freeNetwork(net)
```
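If you run detection over many images with the same network, the usual pattern is to load the net once, let detect free its per-image buffers, and call freeNetwork only at the very end. A minimal sketch (image_paths and the threshold values are placeholders):

```python
# Sketch: one network reused across many images; freed once at the end.
net = dn.load_net(bytes(cfg_name, encoding='utf-8'),
                  bytes(weight_name, encoding='utf-8'), 0)
meta = dn.load_meta(bytes(meta_name, encoding='utf-8'))
for path in image_paths:  # placeholder list of image files
    dets = dn.detect(net, meta, bytes(path, encoding='utf-8'), .5, .5, .45)
    # ... consume dets ...
dn.freeNetwork(net)
```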
It seems that detect is not correctly freeing memory in Python. I ran detection on multiple images (basically on a video) and found that memory leaks slowly. After isolating the different components, I found that the leak comes from the detect function. Is it possible that free_image() and free_ptrs() are not working correctly?
If I comment out the detect call, memory usage stays stable; that's why I assume the issue lies there and not anywhere else. Please let me know if you see something different.
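One way to confirm where the leak comes from is to watch the process's resident memory while calling detect repeatedly; a minimal sketch, assuming the psutil package is installed, net and meta are already loaded, and the test image path is a placeholder:

```python
# Sketch: track resident set size (RSS) across repeated detect calls.
import os
import psutil
import darknet as dn

proc = psutil.Process(os.getpid())
for i in range(100):  # iteration count is arbitrary
    dn.detect(net, meta, b"test.jpg", .5, .5, .45)
    if (i + 1) % 10 == 0:
        print("RSS after %d calls: %.1f MB" % (i + 1, proc.memory_info().rss / 1e6))
```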