92xianshen / refined-unet-v3


Gradient Error (No gradient defined for operation 'bilateral_layer_1/Lu_10' (op type: Lu)) in CRFLayer #1

Open rishabh316 opened 1 year ago

rishabh316 commented 1 year ago

I applied this model to my own dataset of images: I converted the images into arrays and fed them to the model. The model compiles, and I can also print the model summary, but when I try to fit the model with model.fit as: unet_model.fit(x, y, epochs=10) I get the following error:

StagingError: in user code:

File "c:\users\dell\appdata\local\programs\python\python39\lib\site-packages\keras\engine\training.py", line 1284, in train_function  *
    return step_function(self, iterator)  #here
File "c:\users\dell\appdata\local\programs\python\python39\lib\site-packages\keras\engine\training.py", line 1268, in step_function  **
    outputs = model.distribute_strategy.run(run_step, args=(data,))  #here
File "c:\users\dell\appdata\local\programs\python\python39\lib\site-packages\keras\engine\training.py", line 1249, in run_step  **
    outputs = model.train_step(data)  #here
File "c:\users\dell\appdata\local\programs\python\python39\lib\site-packages\keras\engine\training.py", line 1054, in train_step
    self.optimizer.minimize(loss, self.trainable_variables, tape=tape)  #here
File "c:\users\dell\appdata\local\programs\python\python39\lib\site-packages\keras\optimizers\optimizer.py", line 542, in minimize
    grads_and_vars = self.compute_gradients(loss, var_list, tape)  #here
File "c:\users\dell\appdata\local\programs\python\python39\lib\site-packages\keras\optimizers\optimizer.py", line 275, in compute_gradients
    grads = tape.gradient(loss, var_list)  #here

LookupError: No gradient defined for operation 'bilateral_layer_1/Lu_10' (op type: Lu). In general every operation must have an associated `@tf.RegisterGradient` for correct autodiff, which this op is lacking. If you want to pretend this operation is a constant in your program, you may insert `tf.stop_gradient`. This can be useful to silence the error in cases where you know gradients are not needed, e.g. the forward pass of tf.custom_gradient. Please see more details in https://www.tensorflow.org/api_docs/python/tf/custom_gradient.

Please provide a suitable solution to this problem as soon as possible. Let me know if any particular file or code is required.

92xianshen commented 1 year ago

Dear @rishabh316 ,

Thank you for your inquiry.

I am very sorry, but there is currently no gradient op defined for our CRF module, so it cannot be used during model training. We suggest using the CRF module either as a standalone post-processing step or as a plug-in module on top of an already well-trained model, as in the sketch below.
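A minimal sketch of that workflow, assuming a trained Keras segmentation backbone; `build_unet` and `CRFLayer` are hypothetical stand-ins for the repository's model builder and CRF module, and the exact call signature may differ from the real code:

import numpy as np
import tensorflow as tf
from tensorflow import keras

# 1. Train the segmentation backbone alone, with the CRF layer left out of the
#    graph, so every op on the training path has a registered gradient.
unet_model = build_unet(input_shape=(256, 256, 3), num_classes=2)  # hypothetical builder
unet_model.compile(
    optimizer="adam",
    loss=keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
unet_model.fit(x, y, epochs=10)

# 2. Apply the CRF as standalone post-processing at inference time.
#    predict() runs outside any GradientTape, so the missing gradient for the
#    'Lu' op never comes into play here.
crf = CRFLayer()                      # hypothetical constructor
logits = unet_model.predict(x)
refined = crf([x, logits])            # hypothetical signature: image + unary logits
pred = tf.argmax(refined, axis=-1)

In other words, the CRF refines the predictions of a model that has already been trained; it does not participate in back-propagation.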

I hope this answers your question. Could you kindly cite our relevant research if our code is helpful?

Best regards,

Libin