BruceResearch / BiTr-Unet

This repo is the source code for [BiTr-Unet: a CNN-Transformer Combined Network for MRI Brain Tumor Segmentation].
Apache License 2.0

DICE #2

H-CODE6 closed this issue 2 years ago

H-CODE6 commented 2 years ago

Hello, we recently reproduced your code, but we can't find the Dice coefficient after testing. How did you obtain the coefficient?

BruceResearch commented 2 years ago

Based on the info on BraTS website, "Annotations comprise the GD-enhancing tumor (ET — label 4), the peritumoral edematous/invaded tissue (ED — label 2), and the necrotic tumor core (NCR — label 1)"

Plus, WT consists of enhancing components, nonenhancing components including necrosis, and edema (ET + NCR + ED). TC consists of enhancing components and nonenhancing components including necrosis (ET + NCR).

Thus, we also need to combine those.
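The label merging described above can be sketched in NumPy. This is a minimal illustration, not the repo's `evaluation.py`; the function names are hypothetical, and the label values (ET = 4, ED = 2, NCR = 1) follow the BraTS convention quoted above.

```python
import numpy as np

def binary_dice(seg, gt):
    """Dice coefficient between two boolean masks."""
    seg, gt = seg.astype(bool), gt.astype(bool)
    intersection = np.logical_and(seg, gt).sum()
    denom = seg.sum() + gt.sum()
    return 2.0 * intersection / denom if denom > 0 else 1.0

def brats_region_dice(seg, gt):
    """Dice for the three BraTS regions built from labels 1 (NCR), 2 (ED), 4 (ET)."""
    return {
        "ET": binary_dice(seg == 4, gt == 4),                           # enhancing tumor
        "TC": binary_dice(np.isin(seg, [1, 4]), np.isin(gt, [1, 4])),  # ET + NCR
        "WT": binary_dice(seg > 0, gt > 0),                            # ET + NCR + ED
    }
```

The same boolean masks (`seg == 4`, `np.isin(seg, [1, 4])`, `seg > 0`) can be reused for any other overlap metric that expects binary inputs.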

H-CODE6 commented 2 years ago

OK, do you have any codes to evaluate Hausdorff distance, sensitivity and specificity? These indicators are also very important.

BruceResearch commented 2 years ago

Writing them for you now.

H-CODE6 commented 2 years ago

ok, thank you very much

BruceResearch commented 2 years ago

Updated. It may contain bugs since I can't test it right now, but everything you need is there.

H-CODE6 commented 2 years ago

OK, I'll test it and give you feedback right away

H-CODE6 commented 2 years ago

```
Traceback (most recent call last):
  File "evaluation.py", line 257, in <module>
    np.savetxt(s_folder + '/Hausdorffdistance{0:}.txt'.format(test_type), hd)
  File "<__array_function__ internals>", line 6, in savetxt
  File "/home/hkw/anaconda3/envs/pytorch/lib/python3.7/site-packages/numpy/lib/npyio.py", line 1381, in savetxt
    "Expected 1D or 2D array, got %dD array instead" % X.ndim)
ValueError: Expected 1D or 2D array, got 0D array instead
```

BruceResearch commented 2 years ago

No need to save it actually; you may delete this line.
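Besides deleting the line, the `savetxt` error itself can be avoided: the traceback shows `hd` is a 0-D scalar, and `np.atleast_1d` wraps it into a shape the function accepts. A minimal reproduction (the value of `hd` here is just illustrative):

```python
import io
import numpy as np

# Hypothetical scalar result, mirroring the 0-D value that broke np.savetxt.
hd = np.float64(21.02379604162864)

buf = io.StringIO()
# np.savetxt(buf, hd) would raise "Expected 1D or 2D array, got 0D array instead";
# np.atleast_1d wraps the scalar into a shape-(1,) array that savetxt accepts.
np.savetxt(buf, np.atleast_1d(hd))
```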

BruceResearch commented 2 years ago

Found another error in the distance computation; I'll fix it tomorrow.

H-CODE6 commented 2 years ago

OK, I look forward to the fix. The script takes a very long time to run, so I'm not sure whether there are still problems. I'll give you feedback promptly.

BruceResearch commented 2 years ago

Distance computation for large arrays is time-consuming.

H-CODE6 commented 2 years ago

OK, I look forward to the fix tomorrow, and I will follow up with you promptly.

H-CODE6 commented 2 years ago

```
tissue type ET
dice mean [0.83201768]  dice std [0.23055119]
sensitivity [0.9673336653678258]  specificity [0.9996706719360656]
Hausdorff_distance 21.02379604162864

Traceback (most recent call last):
  File "evaluation.py", line 250, in <module>
    sensitivity = sensitivity_of_brats_data_set(gt_names, seg_names, type_idx)
  File "evaluation.py", line 182, in sensitivity_of_brats_data_set
    temp_sensi = sensitivity_whole(s_volume, g_volume)
  File "evaluation.py", line 41, in sensitivity_whole
    return sensitivity(seg>0, ground>0)
TypeError: 'list' object is not callable
```

The results that just came out only include ET; the other two tissue types are not displayed.

BruceResearch commented 2 years ago

It is updated and tested. Dice and Hausdorff distance should be good now. For sensitivity and specificity, I tried several approaches and couldn't resolve the error. I would say looking at Dice and Hausdorff distance alone is sufficient, as they are the official metrics of BraTS. You are welcome to debug the sensitivity and specificity error yourself.
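For anyone hitting the same `TypeError: 'list' object is not callable`: the traceback pattern is consistent with line 250 (`sensitivity = sensitivity_of_brats_data_set(...)`) rebinding the name of the `sensitivity` *function* to the returned list, so the later call to `sensitivity(...)` inside `sensitivity_whole()` hits a list instead of a function. A minimal sketch of the metric and the likely fix (this function body is illustrative, not the repo's actual code):

```python
import numpy as np

def sensitivity(seg, ground):
    """True positive rate of a binary segmentation (boolean arrays)."""
    tp = np.logical_and(seg, ground).sum()
    fn = np.logical_and(~seg, ground).sum()
    return tp / (tp + fn) if (tp + fn) > 0 else 0.0

# Buggy pattern implied by the traceback:
#   sensitivity = sensitivity_of_brats_data_set(gt_names, seg_names, type_idx)
# rebinds the function name to a list, so the next call raises TypeError.
# Likely fix: store the per-case results under a different name, e.g.
#   sensitivity_list = sensitivity_of_brats_data_set(gt_names, seg_names, type_idx)
```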

H-CODE6 commented 2 years ago

ok, thank you very much

H-CODE6 commented 2 years ago

Hello, author. After many tests, I found that the Hausdorff distance is always identical for two of ET, TC, and WT. Do you know why?

H-CODE6 commented 2 years ago

```
1. tissue type ET  dice mean [0.88853114]  dice std [0.1029538]   Hausdorff_distance 15.165750888103101
   tissue type WT  dice mean [0.91242634]  dice std [0.06855426]  Hausdorff_distance 6.4031242374328485
   tissue type TC  dice mean [0.91720559]  dice std [0.12459188]  Hausdorff_distance 15.165750888103101

2. tissue type ET  dice mean [0.83913826]  dice std [0.20701617]  Hausdorff_distance 5.0
   tissue type WT  dice mean [0.90823157]  dice std [0.08411381]  Hausdorff_distance 3.1622776601683795
   tissue type TC  dice mean [0.90465758]  dice std [0.14383377]  Hausdorff_distance 5.0

3. tissue type ET  dice mean [0.83201768]  dice std [0.23055119]  Hausdorff_distance 5.744562646538029
   tissue type WT  dice mean [0.90952711]  dice std [0.10472148]  Hausdorff_distance 14.071247279470288
   tissue type TC  dice mean [0.88381648]  dice std [0.18333644]  Hausdorff_distance 14.071247279470288
```

BruceResearch commented 2 years ago

try now

H-CODE6 commented 2 years ago

Hello, author. After testing, I found the Hausdorff distance differs from the previous code's results. Can you explain the reasoning?

BruceResearch commented 2 years ago

Instead of differentiating WT, ET, and TC inside the function hd_of_brats_data_set(), I have now written separate functions def hausdorff_whole(), def hausdorff_en(), and def hausdorff_core() to calculate HD for WT, ET, and TC. Note that the way ET is masked for calculating HD differs from the way it is masked for Dice, which is why I was wrong last time. For HD on ET we want Hausdorff_distance(seg != 4, ground != 4), whereas for Dice on ET we want binary_dice3d(s_volume == 4, g_volume == 4). I realized this by reading "https://github.com/Issam28/Brain-tumor-segmentation/blob/master/evaluation_metrics.py", which does it that way, but I don't know why. You should do more research to verify this, or adopt other trustworthy code sources.
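The region-wise split described here can be sketched as follows. This is an assumption-laden illustration, not the repo's actual implementation: it uses `scipy.spatial.distance.directed_hausdorff` on voxel coordinates (the repo may compute HD differently), and the `!= 4` masking for ET only mirrors the convention quoted from the referenced repo, which is worth verifying independently.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def symmetric_hausdorff(mask_a, mask_b):
    """Symmetric Hausdorff distance between two non-empty boolean masks,
    computed on their voxel coordinates."""
    pts_a = np.argwhere(mask_a)
    pts_b = np.argwhere(mask_b)
    return max(directed_hausdorff(pts_a, pts_b)[0],
               directed_hausdorff(pts_b, pts_a)[0])

def hausdorff_whole(seg, gt):
    # WT: any tumor label (1, 2, or 4)
    return symmetric_hausdorff(seg > 0, gt > 0)

def hausdorff_core(seg, gt):
    # TC: NCR (1) + ET (4)
    return symmetric_hausdorff(np.isin(seg, [1, 4]), np.isin(gt, [1, 4]))

def hausdorff_en(seg, gt):
    # ET: the != 4 masking copied from the referenced repo; verify before trusting
    return symmetric_hausdorff(seg != 4, gt != 4)
```

Because each region now builds its own pair of masks, two regions can no longer silently share one distance value, which was the symptom reported above.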

H-CODE6 commented 2 years ago

Well, thank you very much.

H-CODE6 commented 2 years ago

Hello, author. I noticed that in your code all volumes are cropped to 128×128×128 for training and testing. Can you guarantee that the tumor core region is not cropped out?

BruceResearch commented 2 years ago

What I did for random cropping is the most conventional approach. Since the BraTS training data are well-curated (the ROIs sit near the center of the image), even though I can't guarantee that no ROI voxel is ever cropped, I would say it is safe to do so. Testing data, however, may be wilder and more diverse, and cropping as data augmentation is an open research area where you can be creative and develop fancier tricks, which could even be a selling point of your paper. So you may want to adopt more advanced cropping strategies.
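The conventional random crop described here can be sketched as a small NumPy helper (hypothetical names, not the repo's brats.py; assumes the volume is at least the crop size in every dimension):

```python
import numpy as np

def random_crop_3d(volume, size=(128, 128, 128), rng=None):
    """Randomly crop a 3-D volume to `size`.
    Assumes every dimension of `volume` is at least `size`."""
    if rng is None:
        rng = np.random.default_rng()
    starts = [rng.integers(0, dim - s + 1) for dim, s in zip(volume.shape, size)]
    slices = tuple(slice(st, st + s) for st, s in zip(starts, size))
    return volume[slices]
```

Note that for paired image/label crops the same start indices must be reused for both arrays, otherwise the crop misaligns the annotation from the image.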

H-CODE6 commented 2 years ago

OK, thank you very much

H-CODE6 commented 2 years ago

Hello, I'm sorry to disturb you again. I'm working with the dataset and need to preprocess all the BraTS 2021 data to 128×128×128 in advance, but I haven't found any code that can be used directly. I wonder if you have any code for this. Thank you very much for your help.

BruceResearch commented 2 years ago

> `monai.transforms.SpatialCrop`: General purpose cropper to produce sub-volume region of interest (ROI). If a dimension of the expected ROI size is bigger than the input image size, will not crop that dimension. So the cropped result may be smaller than the expected ROI, and the cropped results of several images may not have exactly the same shape. It can support to crop ND spatial (channel-first) data.

https://docs.monai.io/en/stable/transforms.html#transform
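For offline preprocessing to a fixed 128×128×128, a plain-NumPy center crop does the same job for BraTS volumes (240×240×155, so every axis is already at least 128). A minimal sketch with hypothetical names, as an alternative to the MONAI transform above:

```python
import numpy as np

def center_crop_3d(volume, size=(128, 128, 128)):
    """Center-crop a 3-D volume to `size`.
    Assumes every axis of `volume` is at least the target size,
    which holds for standard 240x240x155 BraTS volumes."""
    slices = tuple(
        slice((dim - s) // 2, (dim - s) // 2 + s)
        for dim, s in zip(volume.shape, size)
    )
    return volume[slices]
```

Applying this once to every image/label pair and saving the results gives fixed-size inputs before any training code runs; unlike a random crop, the same deterministic window is reused across modalities automatically.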

H-CODE6 commented 2 years ago

Thank you very much for your reply. I read all your code and see that brats.py is the cropping code, which is called during training. I want to use your code directly to crop the BraTS data to 128×128×128, but I haven't managed to get it working. I wonder if you have time to help me. The first step is to crop all the data, and then perform preprocessing, training, etc.

H-CODE6 commented 2 years ago

I want to crop all the data to 128×128×128 in the first step, and then carry out training and the subsequent operations, but I haven't managed to get it working. I look forward to your help. Thank you very much.

H-CODE6 commented 2 years ago

Hello, author. I'm sorry to disturb you again. I don't know how to run the data post-processing. Can you tell me the detailed procedure?