np_resource = np.dtype([("resource", np.ubyte, 1)])
[2021-03-01 21:48:18,200] [train] [INFO] define model+
[0301 21:48:18 @parallel.py:219] [MultiProcessRunner] Will fork a dataflow more than one times. This assumes the datapoints are i.i.d.
Traceback (most recent call last):
File "train.py", line 50, in <module>
df = get_dataflow_batch(args.datapath, True, args.batchsize)
File "/home/eren/hand_detector_train/hands_dataset.py", line 74, in get_dataflow_batch
ds = BatchData(ds, batchsize)
File "/home/eren/.local/lib/python3.6/site-packages/tensorpack/dataflow/common.py", line 98, in __init__
assert batch_size <= len(ds)
AssertionError
I got this error after running python train.py --datapath=$HANDS_SNYTH_PATH. Can anyone help me?
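For context, the failing assertion simply compares the requested batch size against the number of datapoints the dataflow reports, so it fires when the dataset loaded from --datapath is smaller than one batch (for example, empty because the path is wrong). A minimal sketch of that guard in plain Python (not tensorpack's actual code; `make_batches` and `datapoints` are hypothetical stand-ins):

```python
def make_batches(datapoints, batch_size):
    """Mimic the guard seen in the traceback: refuse to batch
    when the dataset holds fewer datapoints than one batch."""
    assert batch_size <= len(datapoints), (
        f"batch size {batch_size} > dataset size {len(datapoints)}"
    )
    # Group consecutive datapoints into fixed-size batches.
    return [datapoints[i:i + batch_size]
            for i in range(0, len(datapoints), batch_size)]

# An empty dataset (e.g. nothing found under --datapath) trips the assertion:
try:
    make_batches([], 8)
except AssertionError as e:
    print("AssertionError:", e)

# A dataset with enough datapoints batches fine:
print(len(make_batches(list(range(16)), 8)))  # 2 batches
```

So a first thing to check is whether $HANDS_SNYTH_PATH actually points at the dataset and whether the dataflow finds any samples there before BatchData is applied.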