wwlaoxi opened 6 years ago
@wwlaoxi Hi, have you solved this problem?
Same problem here. I'm running the notebook in an official Caffe release docker in CPU mode.
@mohamed-ezz @PatrickChrist @FelixGruen Any help/hints towards solving this would be appreciated!
I also got the same result: the predicted slice was not satisfactory. Following up on the earlier issue (#31), I tested in CPU mode as well and the output was again unacceptable. @mohamed-ezz @PatrickChrist @FelixGruen Is there a particular detail of the implementation we need to take into account?
Hello @lidaryani , please see this note about the Caffe version and the crop layer https://github.com/IBBM/Cascaded-FCN/issues/3#issuecomment-272343778
@lidaryani I got it working by building Jon Long's Caffe version and running Cascaded-FCN with that one. See also the Cascaded-FCN Jupyter notebook and issue #3.
I used the Docker image (Jon Long's) on new CT volumes and the inference was impressive. Using Caffe 1.0.0, however, I am getting results identical to the Prediction screenshot above.
Could somebody please provide an example of updating the model with respect to the crop layer? https://github.com/IBBM/Cascaded-FCN/issues/3#issuecomment-272343778
Here are my updated crop layers, which appeared to work under Caffe 1.0.0 (AWS, Python 3 configuration):
```
layer {
  name: "crop_d3c-d3cc"
  type: "Crop"
  bottom: "d3c"   # current blob size (1, 512, 64, 64)
  bottom: "u3a"   # desired blob size (1, 512, 56, 56)
  top: "d3cc"
  crop_param { axis: 2 offset: 4 offset: 4 }
}
layer {
  name: "crop_d2c-d2cc"
  type: "Crop"
  bottom: "d2c"   # current blob size (1, 256, 136, 136)
  bottom: "u2a"   # desired blob size (1, 256, 104, 104)
  top: "d2cc"
  crop_param { axis: 2 offset: 16 offset: 16 }
}
layer {
  name: "crop_d1c-d1cc"
  type: "Crop"
  bottom: "d1c"   # current blob size (1, 128, 280, 280)
  bottom: "u1a"   # desired blob size (1, 128, 200, 200)
  top: "d1cc"
  crop_param { axis: 2 offset: 40 offset: 40 }
}
layer {
  name: "crop_d0c-d0cc"
  type: "Crop"
  bottom: "d0c"   # current blob size (1, 64, 568, 568)
  bottom: "u0a"   # desired blob size (1, 64, 392, 392)
  top: "d0cc"
  crop_param { axis: 2 offset: 88 offset: 88 }
}
```
I am down to just the following warnings, which I hope are related to training and not inference:
I0610 23:47:14.142359 4565 net.cpp:744] Ignoring source layer bn_d0b (batch normalization?)
I0610 23:47:14.169219 4565 net.cpp:744] Ignoring source layer loss (loss for training?)
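If it helps, one way to convince yourself these messages only concern training-time layers is to list which layers in the deploy net actually received parameters. A rough pycaffe sketch; the prototxt/caffemodel paths below are placeholders for wherever you keep the step-1 files:

```python
import caffe

# Placeholder paths -- point these at your deploy prototxt and weights.
net = caffe.Net('step1_deploy.prototxt', 'step1_weights.caffemodel', caffe.TEST)

# Only layers with learnable parameters appear in net.params. If bn_d0b and loss
# are absent from the deploy prototxt, the "Ignoring source layer" messages
# should be harmless at inference time.
for layer_name, params in net.params.items():
    print(layer_name, [p.data.shape for p in params])
```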
@keesh0, could you please share the source of the new CT volumes you used? Thank you.
Sure.
3DIRCAD dataset: 20 venous-phase enhanced CT volumes from various European hospitals acquired with different CT scanners [test #0; also the dataset the CFCN network was trained on]
TCGA-LIHC, The Cancer Genome Atlas Liver Hepatocellular Carcinoma collection [test set]: https://wiki.cancerimagingarchive.net/display/Public/TCGA-LIHC
@keesh0, many thanks. By the way, I encountered some bugs whilst trying out this repo and have opened an issue about them here: https://github.com/IBBM/Cascaded-FCN/issues/34; I hope the owner will offer a solution. Were you able to plot an accuracy curve for the predicted images?
I did not try to plot the accuracy curve. Here is my wrapper code: https://github.com/keesh0/cfcn_test_inference/blob/master/python/test_cascaded_unet_inference.py
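For anyone who just wants a minimal starting point before reading the full wrapper, here is a rough sketch of the per-slice inference loop. The paths, the HU window, and the 572/388 sizes are assumptions based on the repo's notebook, so adjust them to your setup:

```python
import numpy as np
import caffe
from scipy.ndimage import zoom

# Assumed locations of the step-1 (liver) deploy net and weights -- adjust as needed.
STEP1_DEPLOY = 'models/cascadedfcn/step1/step1_deploy.prototxt'
STEP1_WEIGHTS = 'models/cascadedfcn/step1/step1_weights.caffemodel'

caffe.set_mode_cpu()  # or caffe.set_mode_gpu()
net1 = caffe.Net(STEP1_DEPLOY, STEP1_WEIGHTS, caffe.TEST)

def preprocess(slice_hu):
    """Clip to a liver HU window, scale to [0, 1], and resize to the 572x572 U-Net input.
    The (-100, 400) window mirrors the notebook's preprocessing; treat it as an assumption."""
    s = np.clip(slice_hu, -100, 400).astype(np.float32)
    s = (s - s.min()) / max(float(s.max() - s.min()), 1e-6)
    return zoom(s, (572.0 / s.shape[0], 572.0 / s.shape[1]), order=1)

def predict_liver(slice_hu):
    """Run the step-1 net on one axial slice; returns a 388x388 boolean liver mask."""
    net1.blobs['data'].data[0, 0, ...] = preprocess(slice_hu)
    prob = net1.forward()['prob'][0, 1]  # foreground probability map
    return prob > 0.5
```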