jakeret / tf_unet

Generic U-Net Tensorflow implementation for image segmentation
GNU General Public License v3.0

How to fine-tune a U-Net pre-trained using this library? #253

Open pity2003 opened 5 years ago

pity2003 commented 5 years ago

Given that I have pre-trained a model using this code, how can I fine-tune it on a different dataset? How do I replace the softmax or freeze the conv layers?

Thanks a lot.

jakeret commented 5 years ago

The package doesn't provide an out-of-the-box solution for this. You could use the list of variables and pass an adapted version of it to the minimize function.
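
For illustration, a minimal TF1-style sketch of that idea (not something the package ships): restore the pre-trained weights, then pass a restricted `var_list` to `minimize` so that only the output 1x1 convolution is updated and the earlier conv blocks stay frozen. The dataset and checkpoint paths, the hyper-parameters, and the shape-based selection of the output layer are assumptions to adapt; it only relies on the net exposing `cost`, `x`, `y`, and `keep_prob` as in the current code.

```python
import tensorflow as tf
from tf_unet import unet, image_util

# Rebuild the network with the same architecture as the pre-trained model
# (channels / n_class / layers / features_root below are placeholders).
net = unet.Unet(channels=1, n_class=2, layers=3, features_root=16)

# "Adapt the list of variables": keep only the variables you still want to
# train. Here only the output 1x1 convolution (the layer feeding the softmax)
# is fine-tuned -- selected by its last dimension equaling n_class -- so all
# earlier conv blocks are frozen. Assumption: no other layer has 2 channels.
finetune_vars = [v for v in tf.trainable_variables()
                 if v.get_shape().as_list()
                 and v.get_shape().as_list()[-1] == 2]  # 2 == n_class here

# Passing var_list to minimize applies gradients only to these variables.
optimizer = tf.train.AdamOptimizer(learning_rate=1e-4)
train_op = optimizer.minimize(net.cost, var_list=finetune_vars)

# Hypothetical new dataset; ImageDataProvider is part of tf_unet.
data_provider = image_util.ImageDataProvider("new_dataset/*.tif")

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Restore only the network weights (not optimizer slots) from the
    # pre-trained checkpoint -- the path is a placeholder.
    tf.train.Saver(tf.trainable_variables()).restore(sess, "pretrained/model.ckpt")

    for step in range(1000):
        batch_x, batch_y = data_provider(4)
        _, loss = sess.run([train_op, net.cost],
                           feed_dict={net.x: batch_x,
                                      net.y: batch_y,
                                      net.keep_prob: 0.75})
```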

pity2003 commented 5 years ago

The package doesn't provide an out-of-the-box solution for this. You could use the list of variables and pass an adapted version of it to the minimize function.

Thank you very much for the reply. I have seen people do fine-tuning with the method you mention, but where and how in the source code would I do it if I wanted to replace the softmax? In the "__init__" of the "Unet" class I can see "logits, self.variables, self.filters, self.offset = create_conv_net(self.x, self.keep_prob, channels, n_class, layers, ...)". Do I need to do something with "logits"?

It would be great if you could give a specific example of how to do that.

Thanks again.

jakeret commented 5 years ago

https://github.com/jakeret/tf_unet/search?q=softmax&unscoped_q=softmax
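
For context, the softmax those search results point to (presumably pixel_wise_softmax and its use in unet.py) has no trainable weights of its own; when moving to a different number of classes, what effectively gets replaced is the 1x1 output convolution that feeds it. A hedged sketch of one way to do that: build a new Unet with the new n_class and restore from the old checkpoint only those variables whose names and shapes still match, so the new output layer keeps its random initialization. This assumes the new graph uses the same architecture (so the variable names line up with the checkpoint); paths and sizes are placeholders.

```python
import tensorflow as tf
from tf_unet import unet

# New graph with a different number of classes; only the final 1x1
# convolution (and the softmax on top of it) changes shape.
net = unet.Unet(channels=1, n_class=5, layers=3, features_root=16)

ckpt_path = "pretrained/model.ckpt"  # placeholder path
reader = tf.train.NewCheckpointReader(ckpt_path)
ckpt_shapes = reader.get_variable_to_shape_map()

# Restore every variable whose name and shape match the checkpoint;
# the new output layer is skipped and keeps its random initialization.
restore_vars = [v for v in tf.trainable_variables()
                if v.op.name in ckpt_shapes
                and v.get_shape().as_list() == ckpt_shapes[v.op.name]]

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    tf.train.Saver(restore_vars).restore(sess, ckpt_path)
    # ...then train as usual, e.g. with the var_list approach above, so that
    # only the new output layer (or any subset you choose) is updated.
```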

pity2003 commented 5 years ago

https://github.com/jakeret/tf_unet/search?q=softmax&unscoped_q=softmax

Sorry, I did not follow you. I have no idea why you sent me a link to two source files.

It would be appreciated if you could provide a specific example of fine-tuning with your implementation.

Thanks.