mshunshin / SegNetCMR

A Tensorflow implementation of SegNet for cardiac MRI segmentation
MIT License

Could not add gradient for MaxPoolWithArgMax #4

Open mciky opened 6 years ago

mciky commented 6 years ago

First, thank you for sharing your program. However, I am getting some errors around MaxPoolWithArgmax. I am using TF 1.4 and Python 3.5.4, and it shows this error:


Could not add gradient for MaxPoolWithArgMax, Likely installed already (tf 1.4)
"Registering two gradient with name 'MaxPoolWithArgmax'! (Previous registration was in runcode C:\python35\lib\idlelib\run.py:357)"
loading images
finished loading images
Number of examples found: 526
loading images
finished loading images
Number of examples found: 279
Last trained iteration was: 0
Exception OOM when allocating tensor with shape[6,64,128,128]
[[Node: pool2/conv2_1/conv/Conv2D = Conv2D[T=DT_FLOAT, data_format="NHWC", padding="SAME", strides=[1, 1, 1, 1], use_cudnn_on_gpu=true, _device="/job:localhost/replica:0/task:0/device:GPU:0"](pool1/maxpool1, pool2/conv2_1/conv/kernel/read)]]
[[Node: Mean_1/_343 = _Recv[client_terminated=false, recv_device="/job:localhost/replica:0/task:0/device:CPU:0", send_device="/job:localhost/replica:0/task:0/device:GPU:0", send_device_incarnation=1, tensor_name="edge_3247_Mean_1", tensor_type=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:CPU:0"]()]]

Caused by op 'pool2/conv2_1/conv/Conv2D', defined at:
  File "<string>", line 1, in <module>
  File "C:\python35\lib\idlelib\run.py", line 130, in main
    ret = method(*args, **kwargs)
  File "C:\python35\lib\idlelib\run.py", line 357, in runcode
    exec(code, self.locals)
  File "B:\SegNetCMR-master\train.py", line 124, in <module>
    main()
  File "B:\SegNetCMR-master\train.py", line 46, in main
    logits, softmax_logits = tfmodel.inference(images, class_inc_bg=2)
  File "B:\SegNetCMR-master\tfmodel\inference.py", line 46, in inference
    net = c2rb(net, 128, [3, 3], scope='conv2_1')
  File "B:\SegNetCMR-master\tfmodel\inference.py", line 28, in c2rb
    name='conv')
  File "C:\python35\lib\site-packages\tensorflow\python\layers\convolutional.py", line 608, in conv2d
    return layer.apply(inputs)
  File "C:\python35\lib\site-packages\tensorflow\python\layers\base.py", line 671, in apply
    return self.__call__(inputs, *args, **kwargs)
  File "C:\python35\lib\site-packages\tensorflow\python\layers\base.py", line 575, in __call__
    outputs = self.call(inputs, *args, **kwargs)
  File "C:\python35\lib\site-packages\tensorflow\python\layers\convolutional.py", line 167, in call
    outputs = self._convolution_op(inputs, self.kernel)
  File "C:\python35\lib\site-packages\tensorflow\python\ops\nn_ops.py", line 835, in __call__
    return self.conv_op(inp, filter)
  File "C:\python35\lib\site-packages\tensorflow\python\ops\nn_ops.py", line 499, in __call__
    return self.call(inp, filter)
  File "C:\python35\lib\site-packages\tensorflow\python\ops\nn_ops.py", line 187, in __call__
    name=self.name)
  File "C:\python35\lib\site-packages\tensorflow\python\ops\gen_nn_ops.py", line 630, in conv2d
    data_format=data_format, name=name)
  File "C:\python35\lib\site-packages\tensorflow\python\framework\op_def_library.py", line 787, in _apply_op_helper
    op_def=op_def)
  File "C:\python35\lib\site-packages\tensorflow\python\framework\ops.py", line 2956, in create_op
    op_def=op_def)
  File "C:\python35\lib\site-packages\tensorflow\python\framework\ops.py", line 1470, in __init__
    self._traceback = self._graph._extract_stack()  # pylint: disable=protected-access

ResourceExhaustedError (see above for traceback): OOM when allocating tensor with shape[6,64,128,128]
[[Node: pool2/conv2_1/conv/Conv2D = Conv2D[T=DT_FLOAT, data_format="NHWC", padding="SAME", strides=[1, 1, 1, 1], use_cudnn_on_gpu=true, _device="/job:localhost/replica:0/task:0/device:GPU:0"](pool1/maxpool1, pool2/conv2_1/conv/kernel/read)]]
[[Node: Mean_1/_343 = _Recv[client_terminated=false, recv_device="/job:localhost/replica:0/task:0/device:CPU:0", send_device="/job:localhost/replica:0/task:0/device:GPU:0", send_device_incarnation=1, tensor_name="edge_3247_Mean_1", tensor_type=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:CPU:0"]()]]

Checkpoint Saved
Stopping
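
For context on the first message: TensorFlow's registry raises a KeyError when a gradient function is registered twice for the same op name, which typically happens when the script is re-run inside the same interpreter process (here IDLE's runcode, so the registration from the previous run is still alive). Below is a minimal sketch of a guarded registration for a TF 1.x-style setup; the helper name is hypothetical and the actual grad_fn would be whatever SegNetCMR defines:

```python
from tensorflow.python.framework import ops


def register_maxpool_with_argmax_grad(grad_fn):
    """Register grad_fn as the gradient for MaxPoolWithArgmax, unless a
    gradient is already registered in this process (e.g. after re-running
    the script inside the same IDLE session)."""
    try:
        ops.RegisterGradient("MaxPoolWithArgmax")(grad_fn)
    except KeyError as e:
        # The registry raises KeyError("Registering two gradient with name
        # 'MaxPoolWithArgmax'! ...") on duplicate registration; the gradient
        # registered by the earlier run can simply be reused.
        print("Gradient for MaxPoolWithArgmax already registered:", e)
```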
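The actual failure, though, is the ResourceExhaustedError: the GPU ran out of memory while allocating the [6, 64, 128, 128] activation. A sketch of the two usual mitigations for a TF 1.x training loop, assuming nothing about how train.py names its constants:

```python
import tensorflow as tf

# 1) Use a smaller batch size. The failing tensor shape [6, 64, 128, 128]
#    suggests a batch of 6; try 2-4 on a GPU with limited memory.
BATCH_SIZE = 2  # hypothetical name; adjust wherever train.py sets its batch size

# 2) Let TensorFlow allocate GPU memory on demand instead of reserving
#    nearly all of it up front.
config = tf.ConfigProto()
config.gpu_options.allow_growth = True

with tf.Session(config=config) as sess:
    pass  # build the graph and run the training loop as train.py does
```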