NifTK / NiftyNet

[unmaintained] An open-source convolutional neural networks platform for research in medical image analysis and image-guided therapy
http://niftynet.io
Apache License 2.0

How to print out dice scores during training? #294

Closed chengjianhong closed 5 years ago

chengjianhong commented 5 years ago

Hello, I want to print out the dice scores during training, but I can't find the print statement and I don't know how to do this.

zheng-xing commented 5 years ago

Hi @chengjianhong ,

You can add a line to the source code right after dice_score is calculated.

For example,

dice_score = dice_numerator / (dice_denominator + epsilon_denominator)

# Add the line below if you want to see the dice scores for all classes.
dice_score = tf.Print(dice_score, [dice_score], "Dice score for all classes: ")

# Add these two lines instead if you want a single averaged dice score:
# avg_dice_score = tf.reduce_mean(dice_score)
# avg_dice_score = tf.Print(avg_dice_score, [avg_dice_score], "Dice score: ")

tf.Print just adds a printing node to the graph; it does not change the value of the tensor.
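Note that tf.Print only prints when the returned tensor is later evaluated, which is why it is assigned back to dice_score above. A minimal standalone sketch (TensorFlow 1.x; tf.Print was removed in TF 2.x, and this is not NiftyNet code):

import tensorflow as tf

# tf.Print behaves like an identity op: it returns its first argument
# unchanged and prints the listed tensors whenever the result is evaluated.
x = tf.constant([0.9, 0.1])
x = tf.Print(x, [x], "Dice score for all classes: ")
loss = 1.0 - tf.reduce_mean(x)

with tf.Session() as sess:
    sess.run(loss)  # evaluating a downstream tensor triggers the printed message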

chengjianhong commented 5 years ago

Thank you! I did as you said and the output is as follows. But what is the meaning of '[dice_score]'?

INFO:niftynet: training iter 136, loss=0.39044004678726196 (6.056424s) Dice score for all classes: [0.948326707 0.106993854]

zheng-xing commented 5 years ago


Any variables in the "[]" will be printed; you can print more than one if you want.
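For example (reusing the names from the snippet above; the message string is arbitrary):

# Print both the per-class dice scores and their mean in a single call.
avg_dice_score = tf.reduce_mean(dice_score)
dice_score = tf.Print(dice_score, [dice_score, avg_dice_score],
                      "Per-class and mean Dice: ")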

chengjianhong commented 5 years ago

Thank you very much. I have another question: can you tell me where the print statement shown below is located in NiftyNet?

INFO:niftynet: training iter 20145, loss=0.601132333278656 (1.187652s)
INFO:niftynet: training iter 20146, loss=0.4652557373046875 (1.285409s)
INFO:niftynet: training iter 20147, loss=0.273823618888855 (1.195902s)
INFO:niftynet: training iter 20148, loss=0.3052770495414734 (1.175462s)
INFO:niftynet: training iter 20149, loss=0.6012360453605652 (1.207522s)
INFO:niftynet: training iter 20150, loss=0.29241740703582764 (28.400912s)
INFO:niftynet: training iter 20151, loss=0.20218241214752197 (1.221758s)
INFO:niftynet: training iter 20152, loss=0.29352718591690063 (1.236689s)
INFO:niftynet: training iter 20153, loss=0.3512016534805298 (1.274734s)

zheng-xing commented 5 years ago

Hmm, I did not find this easy either. Maybe someone else can point us to the right location?

chengjianhong commented 5 years ago

I found it in segmentation_application.py; the print statement is as follows:

outputs_collector.add_to_collection(
    var=data_loss, name='loss',
    average_over_devices=True, summary_type='scalar',
    collection=TF_SUMMARIES)

And do you know what tf.reduce_mean(image) means here?

outputs_collector.add_to_collection(
    var=tf.reduce_mean(image), name='mean_image',
    average_over_devices=False, summary_type='scalar',
    collection=CONSOLE)
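Following the same pattern, a dice score could presumably be added to the CONSOLE collection so it shows up in those training lines instead of going through tf.Print. A rough sketch, assuming a dice_score tensor is available at that point in segmentation_application.py (the name 'mean_dice' is made up):

# Hypothetical sketch, following the add_to_collection pattern above.
outputs_collector.add_to_collection(
    var=tf.reduce_mean(dice_score), name='mean_dice',
    average_over_devices=True, summary_type='scalar',
    collection=CONSOLE)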

zheng-xing commented 5 years ago

Yeah I found this too. But I'm not sure about the need to visualize tf.reduce_mean(image).
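For context, tf.reduce_mean just collapses a tensor to the mean of its elements, so tf.reduce_mean(image) presumably reports the mean voxel intensity of the current image window each iteration. A tiny standalone example (TensorFlow 1.x):

import tensorflow as tf

img = tf.constant([[1.0, 2.0], [3.0, 4.0]])
mean_val = tf.reduce_mean(img)  # scalar tensor

with tf.Session() as sess:
    print(sess.run(mean_val))  # 2.5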

zheng-xing commented 5 years ago

I found the code that formats these print statements. It is in engine/application_iteration.py, where you can find the CONSOLE_FORMAT variable and the to_console_string function.
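As an illustration only (this is not the actual NiftyNet implementation), a formatter of roughly that shape would produce the lines shown above:

# Illustrative sketch, not the real engine/application_iteration.py code.
def to_console_string_sketch(phase, iter_i, console_vars, elapsed_s):
    stats = ', '.join('{}={}'.format(k, v) for k, v in console_vars.items())
    return '{} iter {}, {} ({:.6f}s)'.format(phase, iter_i, stats, elapsed_s)

print('INFO:niftynet: ' + to_console_string_sketch(
    'training', 20145, {'loss': 0.601132333278656}, 1.187652))
# -> INFO:niftynet: training iter 20145, loss=0.601132333278656 (1.187652s)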


chengjianhong commented 5 years ago

Sorry, I can't find it.

ericspod commented 5 years ago

The comment refers to this line: https://github.com/NifTK/NiftyNet/blob/dev/niftynet/engine/application_iteration.py#L177

If there's nothing more to add, I'm going to close this one.