Open fspider opened 5 years ago
```python
def _run_in_batches(f, data_dict, out, batch_size):
    data_len = len(out)
    num_batches = int(data_len / batch_size)
    print("123")
    s, e = 0, 0
    print("124")
    for i in range(num_batches):
        print("1240")
        s, e = i * batch_size, (i + 1) * batch_size
        print("1241")
        batch_data_dict = {k: v[s:e] for k, v in data_dict.items()}
        print("1242")
        out[s:e] = f(batch_data_dict)
        print("1243")
    print("125")
    if e < len(out):
        print("1250")
        batch_data_dict = {k: v[e:] for k, v in data_dict.items()}
        print("1251")
        out[e:] = f(batch_data_dict)
        print("1252")
```
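For reference, the batching logic itself is sound in a single process. A minimal standalone check (with the debug prints removed, and a NumPy lambda standing in for the TensorFlow `sess.run` call that `f` normally wraps):

```python
import numpy as np

def _run_in_batches(f, data_dict, out, batch_size):
    # Same logic as the snippet above, without the debug prints.
    data_len = len(out)
    num_batches = int(data_len / batch_size)
    s, e = 0, 0
    for i in range(num_batches):
        s, e = i * batch_size, (i + 1) * batch_size
        batch_data_dict = {k: v[s:e] for k, v in data_dict.items()}
        out[s:e] = f(batch_data_dict)
    if e < len(out):
        # Handle the leftover elements that do not fill a whole batch.
        batch_data_dict = {k: v[e:] for k, v in data_dict.items()}
        out[e:] = f(batch_data_dict)

# Stand-in for the session call: double each input value.
data = {"x": np.arange(10, dtype=np.float32)}
out = np.zeros(10, dtype=np.float32)
_run_in_batches(lambda d: d["x"] * 2, data, out, batch_size=4)
print(out.tolist())
# [0.0, 2.0, 4.0, 6.0, 8.0, 10.0, 12.0, 14.0, 16.0, 18.0]
```

This runs to completion, so the hang is not in the batching loop itself but in the `f(...)` call when it is a TensorFlow session run inside a forked child process.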
```
123
124
1240
1241
1242
```
So the line `out[s:e] = f(batch_data_dict)` seems to never return: the last print before the hang is `1242`, and `1243` is never reached. Please help me!
I have the same problem. How did you solve it?
Same problem here! Any solution?
I found this issue from the log. The code is like this (in generate_detection.py):

ENVIRONMENT

Log RESULT

So I found that the `_run_in_batches` function does not work with multiprocessing. How can I run the deep sort module with multiprocessing?
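A common cause of this hang: a TensorFlow session/graph created in the parent process does not work in a child created by `fork`, so `sess.run` (called via `f`) blocks forever. The usual fix is to build the model inside each worker process instead of inheriting it. Below is a minimal sketch of that pattern using a `Pool` initializer; the stand-in lambda is hypothetical and would be replaced by the deep_sort encoder (e.g. the one built in generate_detection.py) loaded inside `_init_worker`:

```python
import multiprocessing as mp

_model = None  # per-process handle, created after the worker process starts

def _init_worker():
    # In real use, load the TensorFlow model / create the session HERE.
    # The key point: heavyweight, non-fork-safe state is built in the
    # child process, never inherited from the parent.
    global _model
    _model = lambda x: x * x  # stand-in for the deep_sort feature encoder

def _infer(x):
    # Runs in the worker; uses the model created by _init_worker.
    return _model(x)

def run(inputs):
    with mp.Pool(processes=2, initializer=_init_worker) as pool:
        return pool.map(_infer, inputs)

if __name__ == "__main__":
    print(run([1, 2, 3, 4]))  # [1, 4, 9, 16]
```

With this structure each worker owns its own session, so `_run_in_batches` can run unchanged inside the worker. Using the `spawn` start method (`mp.set_start_method("spawn")`) instead of `fork` is another way to avoid inheriting TensorFlow state.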