Closed: CMCI closed this issue 5 years ago.
Dear CMCI, I am quite sure the problem is the csbdeep version: we are currently only compatible with csbdeep 0.3.0. Reinstalling n2v ("pip install n2v") should fix it, as it automatically installs csbdeep 0.3.0. However, if you do need csbdeep 0.4.0 for a different project, we recommend using it in a separate conda environment until we achieve compatibility.
Please let me know if this fixes your problem.
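One way to catch this mismatch early is a small version guard before training. This is just a sketch; the helper name `csbdeep_is_supported` is hypothetical, and in practice you would pass it `csbdeep.__version__`:

```python
# Hypothetical guard: verify the installed csbdeep version before training.
# n2v 0.1.x is only compatible with csbdeep 0.3.0, so a (major, minor)
# mismatch is a good signal that "pip install n2v" needs to be re-run.

def csbdeep_is_supported(installed, supported="0.3.0"):
    """Return True when the (major, minor) parts of two version strings match."""
    parse = lambda v: tuple(int(p) for p in v.split(".")[:2])
    return parse(installed) == parse(supported)

# csbdeep 0.4.0 would be rejected here, 0.3.x accepted.
```

Patch-level differences are deliberately tolerated in this sketch; only the (major, minor) pair is compared.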
Hi Alex,
Yes, it did. Thanks!
Hello, similar to the error reported yesterday, I am getting the following trace when running on 3D data (including the provided example):
```
  File "n2v_CL.py", line 31, in <module>
    history = model.train(X, X_val)
  File "/home/shared/chris.law/n2v/n2v/models/n2v_standard.py", line 218, in train
    callbacks=self.callbacks, verbose=1)
  File "/home/shared/chris.law/.local/lib/python3.6/site-packages/keras/legacy/interfaces.py", line 91, in wrapper
    return func(*args, **kwargs)
  File "/home/shared/chris.law/.local/lib/python3.6/site-packages/keras/engine/training.py", line 1418, in fit_generator
    initial_epoch=initial_epoch)
  File "/home/shared/chris.law/.local/lib/python3.6/site-packages/keras/engine/training_generator.py", line 94, in fit_generator
    callbacks.set_model(callback_model)
  File "/home/shared/chris.law/.local/lib/python3.6/site-packages/keras/callbacks.py", line 54, in set_model
    callback.set_model(model)
  File "/home/shared/chris.law/.local/lib/python3.6/site-packages/csbdeep/utils/tf.py", line 213, in set_model
    self.gt_outputs = [K.placeholder(shape=_gt_shape(K.int_shape(x))) for x in self.model.outputs]
  File "/home/shared/chris.law/.local/lib/python3.6/site-packages/csbdeep/utils/tf.py", line 213, in <listcomp>
    self.gt_outputs = [K.placeholder(shape=_gt_shape(K.int_shape(x))) for x in self.model.outputs]
  File "/home/shared/chris.law/.local/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py", line 517, in placeholder
    x = tf.placeholder(dtype, shape=shape, name=name)
  File "/home/shared/chris.law/.conda/envs/PythonGPU_CL/lib/python3.6/site-packages/tensorflow/python/ops/array_ops.py", line 1747, in placeholder
    return gen_array_ops.placeholder(dtype=dtype, shape=shape, name=name)
  File "/home/shared/chris.law/.conda/envs/PythonGPU_CL/lib/python3.6/site-packages/tensorflow/python/ops/gen_array_ops.py", line 5206, in placeholder
    "Placeholder", dtype=dtype, shape=shape, name=name)
  File "/home/shared/chris.law/.conda/envs/PythonGPU_CL/lib/python3.6/site-packages/tensorflow/python/framework/op_def_library.py", line 787, in _apply_op_helper
    op_def=op_def)
  File "/home/shared/chris.law/.conda/envs/PythonGPU_CL/lib/python3.6/site-packages/tensorflow/python/util/deprecation.py", line 488, in new_func
    return func(*args, **kwargs)
  File "/home/shared/chris.law/.conda/envs/PythonGPU_CL/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 3274, in create_op
    op_def=op_def)
  File "/home/shared/chris.law/.conda/envs/PythonGPU_CL/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 1770, in __init__
    self._traceback = tf_stack.extract_stack()

InvalidArgumentError (see above for traceback): You must feed a value for placeholder tensor 'Placeholder' with dtype float and shape [?,?,?,?,1]
	 [[node Placeholder (defined at /home/shared/chris.law/.local/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py:517) = Placeholder[dtype=DT_FLOAT, shape=[?,?,?,?,1], _device="/job:localhost/replica:0/task:0/device:GPU:0"]()]]
	 [[{{node lambda_2/clip_by_value/_809}} = _Recv[client_terminated=false, recv_device="/job:localhost/replica:0/task:0/device:CPU:0", send_device="/job:localhost/replica:0/task:0/device:GPU:0", send_device_incarnation=1, tensor_name="edge_451_lambda_2/clip_by_value", tensor_type=DT_FLOAT, _device="/job:localhost/replica:0/task:0/device:CPU:0"]()]]
```
I am using the following versions: n2v = 0.1.4 (updated at 10am EST on 2019/07/16), tensorflow-gpu = 1.12.0, keras = 2.2.4, csbdeep = 0.4.0.
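For context on the `[?,?,?,?,1]` in the error: 3D data in n2v/csbdeep is batched in a rank-5 SZYXC layout (samples, depth, height, width, channels), which is what that placeholder shape describes. A quick numpy sketch (the sizes here are made up for illustration):

```python
import numpy as np

# A batch of 3D patches in the SZYXC layout that n2v's 3D models expect:
# (samples, depth, height, width, channels). The failing placeholder
# [?,?,?,?,1] matches this rank-5 shape with a single channel.
X = np.zeros((8, 16, 64, 64, 1), dtype=np.float32)

print(X.ndim)       # → 5, the rank of the placeholder shape
print(X.shape[-1])  # → 1, the single channel axis
```

So the data shape itself is consistent with what the model wants; the failure comes from the placeholder created inside csbdeep's TensorBoard callback never being fed a value, which points at the csbdeep version incompatibility rather than the input arrays.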
I successfully ran the previous version of n2v (from before the commits on June 17th, I think, though I'm not certain; that version required the training and validation data to be passed as separate numpy arrays).
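The separate-array workflow mentioned above amounted to something like the following sketch; the helper name `split_patches` and the 90/10 ratio are my own illustration, not the n2v API:

```python
import numpy as np

def split_patches(patches, val_fraction=0.1, seed=0):
    """Shuffle a stack of extracted patches and split it into separate
    training and validation arrays, as the pre-June-17th n2v API required.
    The ratio and seed are illustrative defaults."""
    rng = np.random.RandomState(seed)
    idx = rng.permutation(len(patches))
    n_val = int(len(patches) * val_fraction)
    X_val = patches[idx[:n_val]]   # held-out validation patches
    X = patches[idx[n_val:]]       # remaining patches for training
    return X, X_val
```

The two resulting arrays are then passed to `model.train(X, X_val)` as in the script at the top of the traceback.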