seung-lab / znn-release

Multi-core CPU implementation of deep learning for 2D and 3D sliding window convolutional networks (ConvNets).
GNU General Public License v3.0
94 stars 33 forks

Test dataset doesn't run #3

Closed tartavull closed 9 years ago

tartavull commented 9 years ago

```
ubuntu@ip-172-31-46-238:~/znn-release$ ./bin/znn --options="train.config"
FFT Threads initialized
[options] postprocess
[options] path_check
Config path [./networks/N3.spec] Load path [empty]
terminate called after throwing an instance of 'std::invalid_argument'
  what():  Non-existent save path [./experiments/]
Aborted (core dumped)
ubuntu@ip-172-31-46-238:~/znn-release$ mkdir experiments
ubuntu@ip-172-31-46-238:~/znn-release$ ./bin/znn --options="train.config"
FFT Threads initialized
[options] postprocess
[options] path_check
Config path [./networks/N3.spec] Load path [empty] Save path [./experiments/] Hist path [empty]
train_range [1] test_range [2]
[net_builder] operable: 1
[net] initialize
[net] intialize_weight
[INPUT] Node size: [ 164, 164, 1 ] x 1
[INPUT_C1] Kernel size: [ 3, 3, 1 ] x 48
[C1] Node size: [ 162, 162, 1 ] x 48 Filter size: [ 3, 3, 1 ] Filter stride: [ 3, 3, 1 ] Receive FFT: 0
[C1_C2] Kernel size: [ 4, 4, 1 ] x 2304 Sparseness: [ 3, 3, 1 ] Real filter size: [ 10, 10, 1 ]
[C2] Node size: [ 151, 151, 1 ] x 48 Filter size: [ 2, 2, 1 ] Filter stride: [ 2, 2, 1 ] Sparseness: [ 3, 3, 1 ] Real filter size: [ 4, 4, 1 ] Receive FFT: 0
[C2_C3] Kernel size: [ 4, 4, 1 ] x 2304 Sparseness: [ 6, 6, 1 ] Real filter size: [ 19, 19, 1 ]
[C3] Node size: [ 130, 130, 1 ] x 48 Filter size: [ 2, 2, 1 ] Filter stride: [ 2, 2, 1 ] Sparseness: [ 6, 6, 1 ] Real filter size: [ 7, 7, 1 ] Receive FFT: 0
[C3_FC] Kernel size: [ 3, 3, 1 ] x 4800 Sparseness: [ 12, 12, 1 ] Real filter size: [ 25, 25, 1 ]
[FC] Node size: [ 100, 100, 1 ] x 100 Receive FFT: 0
[FC_OUTPUT] Kernel size: [ 1, 1, 1 ] x 200 Sparseness: [ 12, 12, 1 ] Real filter size: [ 1, 1, 1 ]
[OUTPUT] Node size: [ 100, 100, 1 ] x 2 Receive FFT: 0
[network] load_input
Loading [./dataset/ISBI2012/spec/batch1.spec]
Input sizes: [ 164, 164, 1 ]
Output sizes: [ 100, 100, 1 ] [ 100, 100, 1 ]
Loading input [INPUT1] Preprocessing [standard2D] completed. (Elapsed time: 0.0455943 secs)
Loading label [LABEL1] Preprocessing [binary_class] completed. (Elapsed time: 0.0177708 secs)
Loading mask [MASK1]
[volume_data_provider] collect_valid_locations
Upper corner: [ 82, 82, 0 ]
Lower corner: [ 175, 175, 30 ]
Number of valid samples: 259470
Completed. (Elapsed time: 0.00870137 secs)
[network] load_input
Loading [./dataset/ISBI2012/spec/batch2.spec]
Input sizes: [ 164, 164, 1 ]
Output sizes: [ 100, 100, 1 ] [ 100, 100, 1 ]
Loading input [INPUT1] Preprocessing [standard2D] completed. (Elapsed time: 0.0448642 secs)
Assertion outszs.size() == lbls_.size() failed
file: src/core/../front_end/data_provider/volume_data_provider.hpp
line: 223
Aborted (core dumped)
```

tartavull commented 9 years ago

When running the demo datasets, we get a core dump.

tartavull commented 9 years ago

I have just run `cp batch1.spec batch2.spec`, and it seems to be training now. Not sure if it is the right fix.