Closed ilia-nikiforov closed 7 years ago
The script doesn't actually do anything with the input data itself, except use it to calculate the number of iterations needed to cover the entire test dataset. So the easiest way to make it work was simply to pass the number of iterations required to cover the dataset to the script manually.
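A minimal sketch of the manual workaround described above: instead of letting `compute_bn_statistics.py` derive the iteration count from the training images, compute it directly from the dataset size and the batch size. The names `num_samples` and `batch_size` here are placeholders for your own values, not identifiers from the script.

```python
def iterations_to_cover(num_samples, batch_size):
    """Number of minibatch iterations needed to see every sample at least once."""
    # Ceiling division: one extra (partial) batch if sizes don't divide evenly.
    return (num_samples + batch_size - 1) // batch_size

# Example: the SegNet CamVid tutorial's 367 training images with batch size 4.
print(iterations_to_cover(367, 4))  # -> 92
```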
@ilia-nikiforov Yeah, very true.
If someone else gets here: find the line `train_ims, train_labs = extract_dataset(testable_msg)`. The only place `train_ims` is used is to provide `train_size`, so it is better to set that value manually when dealing with k-channel data.
Also, change the `minibatch_size` initialization to:

`minibatch_size = testable_msg.layer[0].data_param.batch_size`
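For reference, here is a hedged sketch of how to pull `batch_size` out of the train prototxt without needing pycaffe at all: a plain regex over the prototxt text gives the same value as `testable_msg.layer[0].data_param.batch_size` (assuming the first `batch_size:` entry in the file belongs to the data layer, which holds for typical SegNet prototxts). The inline prototxt string is a stand-in for reading your own model file.

```python
import re

# Stand-in for open("train.prototxt").read() on your own model definition.
prototxt = """
layer {
  name: "data"
  type: "Data"
  top: "data"
  top: "label"
  data_param {
    source: "train_lmdb"
    batch_size: 4
    backend: LMDB
  }
}
"""

# First batch_size occurrence; assumed to belong to the data layer.
match = re.search(r"batch_size:\s*(\d+)", prototxt)
minibatch_size = int(match.group(1)) if match else None
print(minibatch_size)  # -> 4
```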
Hi guys, I'm trying to do the same. I removed the assert in compute_bn_statistics.py for dense image data and modified the script according to kislayabhi's suggestion, but I got this error:

`13687 insert_splits.cpp:35] unknown blob input data to layer 0`

Sorry, I'm really new to Caffe and SegNet. Can you give me a hint about what else I should modify to adapt SegNet to LMDB data input? (I'm also thinking of switching to HDF5 for 16-bit data.)
Thank you.
Building BN calc net...
Calculate BN stats...
Traceback (most recent call last):
File "./Scripts/compute_bn_statistics_modified.py", line 181, in
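One possible cause of the `unknown blob input data to layer 0` error above (this is an assumption, not a confirmed diagnosis for this exact setup): `insert_splits.cpp` raises it when some layer lists a bottom blob named `data` that no earlier layer produces. When replacing a `DenseImageData` layer with an LMDB-backed `Data` layer, the new layer's `top` names must match the blob names the rest of the network consumes, e.g.:

```
layer {
  name: "data"
  type: "Data"
  top: "data"    # must match the bottom name expected by the next layer
  top: "label"
  data_param {
    source: "train_lmdb"
    batch_size: 4
    backend: LMDB
  }
}
```

The `source` path and `batch_size` here are illustrative placeholders for your own values.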
As several users have done before me, I would like to use SegNet to classify data with an arbitrary number of channels, and using LMDB seems to be the way to go. The compute_bn_statistics.py script needs to be modified in order to use LMDB data. In issues https://github.com/alexgkendall/caffe-segnet/issues/8 and https://github.com/alexgkendall/caffe-segnet/issues/32, @mtrth, @cslxiao, and @alexgkendall speak about having modified the script, but I have not been able to find any specifics regarding the modifications. Can anyone help?