isyangshu / MambaMIL

[MICCAI 2024] Official Code for "MambaMIL: Enhancing Long Sequence Modeling with Sequence Reordering in Computational Pathology"

Code for Survival Time and Concordance Index #2

Closed MarioPaps closed 2 months ago

MarioPaps commented 3 months ago

Could you also please provide the implementation of the survival time estimation part, and explain how you handle gradient accumulation?

wyhsleep commented 2 months ago

The survival time estimation is implemented in engine.py. We did not use gradient accumulation for any of the methods in our implementation, but we will add it in the next version of the code.
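For readers unfamiliar with the technique the maintainers mention: gradient accumulation sums gradients over several micro-batches before applying a single parameter update, which emulates a larger batch size. The following is a minimal NumPy sketch of the pattern on a toy linear model (all names here are illustrative; this is not code from the repository):

```python
import numpy as np

# Toy linear regression: accumulate gradients over `accum_steps`
# micro-batches, then apply one parameter update.
rng = np.random.default_rng(0)
w = np.zeros(3)                                   # model parameters
X = rng.normal(size=(8, 3))                       # 8 samples, 3 features
y = X @ np.array([1.0, -2.0, 0.5])                # synthetic targets

accum_steps, lr = 4, 0.1
grad = np.zeros_like(w)
for step, (xb, yb) in enumerate(zip(X.reshape(4, 2, 3), y.reshape(4, 2)), 1):
    err = xb @ w - yb
    # scale each micro-batch gradient so the accumulated sum equals
    # the mean gradient over the full effective batch
    grad += xb.T @ err / (len(yb) * accum_steps)
    if step % accum_steps == 0:
        w -= lr * grad                            # one update per accum cycle
        grad[:] = 0.0                             # reset the accumulator
```

In a PyTorch training loop the same idea is usually expressed by dividing each micro-batch loss by `accum_steps`, calling `loss.backward()` every step, and calling `optimizer.step()` / `optimizer.zero_grad()` only every `accum_steps` steps.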

MarioPaps commented 2 months ago

I see. However, I cannot find where the classes in engine.py are called. Will you upload a train.py script?

wyhsleep commented 2 months ago

Yes, we will upload a new version today.

MarioPaps commented 2 months ago

I cannot seem to find the Cox loss function anywhere. Do you perform survival analysis using classification bins and cross-entropy, but report the c-index as the evaluation metric?

wyhsleep commented 2 months ago

Yes. Although the Cox proportional hazards model is commonly used for survival analysis, survival data can also be handled by discretizing survival times into different time intervals (classification bins) and training a classification model with a cross-entropy-style loss. We use nll_surv_loss in utils.survival_loss; you can check it there.
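For context, discrete-time survival losses of this kind typically model a per-bin hazard and combine the censored and uncensored likelihood terms. Below is a minimal NumPy sketch of how such a loss is commonly written (the function name echoes the one mentioned above, but this is an illustrative reconstruction; check utils.survival_loss for the actual implementation):

```python
import numpy as np

def nll_surv_loss(logits, bin_idx, censored, eps=1e-7):
    """Discrete-time negative log-likelihood survival loss (sketch).

    logits:   (n_bins,) raw scores for one case
    bin_idx:  index of the time bin containing the event/censoring time
    censored: 1 if the case is censored, 0 if the event was observed
    """
    hazards = 1.0 / (1.0 + np.exp(-logits))    # per-bin hazard h_k = sigmoid(logit_k)
    surv = np.cumprod(1.0 - hazards)           # S_k = prod_{j<=k} (1 - h_j)
    surv_prev = np.concatenate(([1.0], surv))  # pad with S_0 = 1
    if censored:
        # censored case: only known to have survived through bin_idx
        return -np.log(surv_prev[bin_idx + 1] + eps)
    # uncensored case: survived the bins before bin_idx, event occurred in it
    return -(np.log(surv_prev[bin_idx] + eps) + np.log(hazards[bin_idx] + eps))

# event observed in bin 1 of 4, uniform hazards of 0.5
print(nll_surv_loss(np.zeros(4), bin_idx=1, censored=0))  # ≈ 1.386 (2·ln 2)
```

The c-index is then computed from a risk score derived from the predicted hazards or survival curve, which is why cross-entropy-style training and c-index reporting coexist.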

MarioPaps commented 2 months ago

Could I ask you something about the pre-processing as well?

I am not following the CLAM workflow; instead, I used a MATLAB script to create the patches and applied Macenko stain normalization.

When you followed the CLAM workflow, did you apply the default parameters to create the patches and did you then apply stain normalization?

wyhsleep commented 2 months ago

Yes, we used the default settings for TCGA provided by CLAM.

shubhaminnani commented 2 months ago

How should the model be handled if patient censoring information is not available?

wyhsleep commented 2 months ago

We didn't use patient cases that lack censoring information.

shubhaminnani commented 2 months ago

Does the model output the bin into which a case falls, or does it give a value in months? https://github.com/isyangshu/MambaMIL/blob/e1985c1faebd2bcd3aa99630685486713ec53406/utils/survival_core_utils.py#L296

I am particularly interested in finding the values in month. Thanks!

wyhsleep commented 2 months ago

Please refer to dataset/dataset_survival.py
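To summarize what such dataset code typically does: the model predicts a discrete bin, and the bins are usually built by quantile-discretizing the survival times in months, so a predicted bin maps back to a month *interval* rather than a single month value. A minimal sketch with pandas (the month values and the number of bins here are illustrative; check dataset/dataset_survival.py for the actual binning):

```python
import pandas as pd

# hypothetical survival times in months for a small cohort
months = pd.Series([10.0, 24.5, 5.2, 60.0, 33.1, 12.8, 48.0, 7.7])

# discretize into 4 quantile bins, the common pattern in CLAM-style
# survival datasets; retbins=True also returns the bin edges in months
labels, bin_edges = pd.qcut(months, q=4, retbins=True, labels=False)

# a predicted bin k corresponds to the month interval (bin_edges[k], bin_edges[k+1]]
k = 2
lo, hi = bin_edges[k], bin_edges[k + 1]
print(f"bin {k} covers roughly {lo:.1f}-{hi:.1f} months")
```

If a single point estimate in months is needed, a common (lossy) choice is the interval midpoint or the median survival time within the bin.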

MarioPaps commented 2 months ago

Hello, I have a big problem with CLAM and I would appreciate your advice. I ran this command to obtain h5 files: python create_patches.py --source surformerpat --save_dir surformertiles2 --patch_size 256 --patch_level 2 --seg --patch --stitch

But when I run the command python extract_features_fp.py --data_h5_dir "./surformertiles" --data_slide_dir "./surformerpat" --slide_ext '.svs' --csv_path="./surformertiles/process_list_autogen.csv" --feat_dir "./surformerfeat" --model_name 'uni_v1' --target_patch_size 224,

I get the following error:

Traceback (most recent call last):
  File "/rds/general/user/kp4718/home/RCS_help/CLAM/extract_features_fp.py", line 100, in <module>
    dataset = Whole_Slide_Bag_FP(file_path=h5_file_path,
  File "/rds/general/user/kp4718/home/RCS_help/CLAM/dataset_modules/dataset_h5.py", line 65, in __init__
    self.patch_level = f['coords'].attrs['patch_level']
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "/rds/general/user/kp4718/home/anaconda3/envs/clam_latest/lib/python3.10/site-packages/h5py/_hl/attrs.py", line 56, in __getitem__
    attr = h5a.open(self._id, self._e(name))
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py/h5a.pyx", line 80, in h5py.h5a.open
KeyError: "Unable to synchronously open attribute (can't locate attribute: 'patch_level')"

What can I do to avoid this and have both scripts run normally?
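Two observations may help here. First, the create_patches command above saves to surformertiles2, while extract_features_fp.py reads from ./surformertiles, so the h5 files being read may not be the ones CLAM just produced. Second, the error itself means the 'coords' dataset in the h5 file has no 'patch_level' attribute, which is expected if the file was written by a script outside CLAM. One possible workaround, sketched below, is to write the missing attributes in yourself with h5py; this only demonstrates the attribute repair on a toy file, and the values must match how the patches were actually created (the file name, coordinates, and values here are illustrative):

```python
import os
import tempfile

import h5py
import numpy as np

# toy h5 mimicking a file produced outside CLAM: a 'coords' dataset
# exists, but the attributes CLAM writes are missing
path = os.path.join(tempfile.mkdtemp(), "slide.h5")
with h5py.File(path, "w") as f:
    f.create_dataset("coords", data=np.array([[0, 0], [256, 0]], dtype=np.int64))

# add the attributes CLAM's Whole_Slide_Bag_FP reads from 'coords';
# use the level/size your patches were actually extracted at
# (here: patch_level 2 and patch_size 256, matching the commands above)
with h5py.File(path, "a") as f:
    attrs = f["coords"].attrs
    attrs["patch_level"] = 2
    attrs["patch_size"] = 256

with h5py.File(path, "r") as f:
    print(dict(f["coords"].attrs))
```

Note that CLAM may read further attributes beyond these two depending on the version, so inspecting `dict(f['coords'].attrs)` on an h5 file produced by CLAM itself is the safest way to see the full expected set.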