I use only 6 frames of data (30 MB), but when I train the model it prints "std::alloc" and the 16 GB of RAM plus 16 GB of swap are exhausted. Why does it use so much memory, and what should I do? If I change to `queue_size=6, use_multi_process_num=3` it can train, but very slowly.
My machine config:
gtx1070 8G
i7
RAM 16G
You'd better use a machine with more memory, since the data loader consumes a lot of it; >32 GB would be nice. Of course, you can also try increasing the swap size.
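To confirm it really is the data loader that exhausts RAM (rather than, say, the model itself), you can log the process's peak memory at a few points in the training loop. A minimal sketch using only the standard library (`peak_rss_mb` is a hypothetical helper name, not part of any framework):

```python
import resource

def peak_rss_mb():
    """Peak resident set size of this process in MiB.

    Note: on Linux, ru_maxrss is reported in KiB; on macOS it is in bytes,
    so divide by 1024*1024 there instead.
    """
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss / 1024.0

# Call this after constructing the data loader, and again between epochs,
# to see which step accounts for the memory growth:
print(f"peak RSS so far: {peak_rss_mb():.1f} MiB")
```

If the number jumps right after the loader (or its worker processes) starts, reducing `queue_size`/`use_multi_process_num` or prefetch depth is the right knob; if it grows steadily across epochs, something is accumulating references instead.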