Think-42 opened this issue 7 years ago
While the model was loading, I noticed that many of the conv layers have a 512 in them. Is this the image size it thinks I have?
conv1_1: 64 3 3 3
conv1_2: 64 64 3 3
conv2_1: 128 64 3 3
conv2_2: 128 128 3 3
conv3_1: 256 128 3 3
conv3_2: 256 256 3 3
conv3_3: 256 256 3 3
conv3_4: 256 256 3 3
conv4_1: 512 256 3 3
conv4_2: 512 512 3 3
conv4_3: 512 512 3 3
conv4_4: 512 512 3 3
conv5_1: 512 512 3 3
conv5_2: 512 512 3 3
conv5_3: 512 512 3 3
conv5_4: 512 512 3 3
fc6: 1 1 25088 4096
fc7: 1 1 4096 4096
fc8: 1 1 4096 1000
@Superhirn No, that is the number of channels in the layer, not the image size.
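To make that concrete: the four numbers per conv layer are the shape of the layer's weight tensor, outChannels x inChannels x kernelHeight x kernelWidth, so they are fixed by the architecture and say nothing about your input image. Here is a minimal sketch to check this yourself, assuming loadcaffe and the caffemodel path from the log below; the deploy prototxt filename is my assumption, not confirmed by this thread:

```lua
-- Print the weight shape of every conv layer in the loaded VGG-19.
-- e.g. "conv4_1: 512 256 3 3" = 512 filters over 256 input channels, 3x3 window.
require 'nn'
local loadcaffe = require 'loadcaffe'
local cnn = loadcaffe.load('models/VGG_ILSVRC_19_layers_deploy.prototxt',  -- assumed path
                           'models/VGG_ILSVRC_19_layers.caffemodel', 'nn')
for _, m in ipairs(cnn.modules) do
  if torch.type(m) == 'nn.SpatialConvolution' then
    print(m.name, m.weight:size())  -- loadcaffe attaches the Caffe layer name
  end
end
```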
I also have memory issues running the first example. First I get a warning (after googling around I believe this is OK, as I don't hit the hard limit of 1 GB):
~/Desktop/src/deep-photo-styletransfer$ th neuralstyle_seg.lua -content_image examples/input/in1.png -style_image examples/style/tar1.png -content_seg examples/segmentation/in1.png -style_seg examples/segmentation/tar1.png -index 1 -serial examples/tmp_results
gpu, idx = 0 1
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:537] Reading dangerously large protocol message. If the message turns out to be larger than 1073741824 bytes, parsing will be halted for security reasons. To increase the limit (or to disable these warnings), see CodedInputStream::SetTotalBytesLimit() in google/protobuf/io/coded_stream.h.
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:78] The total number of bytes read was 574671192
Successfully loaded models/VGG_ILSVRC_19_layers.caffemodel
conv1_1: 64 3 3 3
But then:
Setting up content layer 23 : relu4_2
THCudaCheck FAIL file=/tmp/luarocks_cutorch-scm-1-5003/cutorch/lib/THC/generic/THCStorage.cu line=66 error=2 : out of memory
I have an NVIDIA GeForce GTX 1060 6GB.
Is there some configuration I have missed?
EDIT: I just saw issue #42 with other suggestions... checking it out...
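One generic workaround, not a feature of this repo but plain Torch: shrink all four inputs before running neuralstyle_seg.lua, since the per-layer activation memory grows with image area. A minimal sketch using the torch image package, with the example paths from the command above and an arbitrary 512-pixel target:

```lua
-- Downscale content, style and both segmentation masks so the longest side
-- is ~512 px (target size is arbitrary; pick what fits your GPU).
require 'image'
local files = {
  'examples/input/in1.png',
  'examples/style/tar1.png',
  'examples/segmentation/in1.png',
  'examples/segmentation/tar1.png',
}
for _, f in ipairs(files) do
  local img = image.load(f, 3)                        -- 3-channel float tensor
  img = image.scale(img, 512, 'bilinear')             -- keeps aspect ratio
  image.save((f:gsub('%.png$', '_small.png')), img)   -- write e.g. in1_small.png
end
```

For the segmentation masks, a non-interpolating mode such as 'simple' may be preferable to 'bilinear', so the label colours are not blended at region boundaries.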
I came across this very interesting GitHub project. I've been working with jcjohnson's project a lot, but this seems promising too ;)
The only problem is that I ran into the "out of memory" problem...
This is my input:
th neuralstyle_seg.lua -content_image PersonalTest/face_man_eye.png -style_image PersonalTest/style.png -content_seg PersonalTest/black.png -style_seg PersonalTest/black.png -index 1 -num_iterations 100 -save_iter 25 -print_iter 1 -backend cudnn -gpu 0
and then the "out of memory" error comes along.
The image size, by the way, is 100 x 100 pixels and I have a GTX 660, which should be fine with that.
Thank you in advance for your help.
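For context, a rough back-of-the-envelope estimate of why even a 100 x 100 input can hit out-of-memory, assuming the common 2 GB variant of the GTX 660 (numbers are approximate and ignore biases): the VGG-19 weights alone account for roughly the ~574 MB that the protobuf warning above reports, before gradients and activations are allocated.

```lua
-- Rough parameter/memory count for VGG-19, from the layer shapes listed above.
local conv_params = 20.0e6                                -- ~20M conv weights
local fc_params   = 25088*4096 + 4096*4096 + 4096*1000    -- ~123.6M fc weights
local total       = conv_params + fc_params               -- ~143.6M parameters
print(string.format('weights alone: ~%.2f GB (float32)', total * 4 / 2^30))
-- Prints ~0.54 GB. Add gradient buffers of similar size, the per-layer
-- activations for the forward/backward passes, and cuDNN workspaces, and a
-- 2 GB card has very little headroom left even for a 100 x 100 image.
```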