Yussef93 / FewShotCellSegmentation

Code of "Few-shot microscopy image cell segmentation" https://link.springer.com/chapter/10.1007/978-3-030-67670-4_9
MIT License

Not able to reproduce results #6

Open EvgeniaChroni opened 1 year ago

EvgeniaChroni commented 1 year ago

Hello,

Thank you for your work. Unfortunately, the code does not run as-is because of inconsistencies in the args and some variable names. I tried to fix these and reproduce the results for TNBC, but the result I get for 5-shot with BCE is 0.11 IoU, which is about 0.10 IoU lower than what you report. Could you please help me reproduce the results?

Yussef93 commented 1 year ago

Hi,

Please keep the discussion in this thread; your thread in the other repository creates confusion since it is a different repo. Thank you.

I have read your reply: you mentioned that you run the preprocessing and then Learning_main.py. After running Learning_main.py, do you fine-tune the model? If yes, what parameters do you use?

EvgeniaChroni commented 1 year ago

Thank you very much for your response, I really appreciate your help.

After training the model, I run Evaluation_main.py with the parameters shown below.

```python
import argparse

def addEvaluationArgs():
    parser = argparse.ArgumentParser(description="Evaluation Arguments")
    parser.add_argument("--lr-method", type=str, default='Meta_Learning',
                        help="Enter Meta_Learning or Supervised_Learning")
    parser.add_argument("--finetune", type=int, default=1)
    parser.add_argument("--testfinetune", type=int, default=1)
    parser.add_argument("--affine", type=int, default=0)
    parser.add_argument("--switchaffine", type=int, default=0)
    parser.add_argument("--targets", type=str, nargs="*", default=['TNBC'],
                        help="Combination of B5,B39,TNBC,ssTEM,EM")
    parser.add_argument("--architect", type=str, default='FCRN',
                        help="Enter FCRN or UNet")
    parser.add_argument("--eval-meta-train-losses", type=str, nargs="*", default=['BCE'],
                        # other options: 'BCE_Entropy', 'BCE_Distillation', 'Combined'
                        help="Combination of BCE,BCE_Entropy,BCE_Distillation,Combined")
    parser.add_argument("--eval-selections", type=int, nargs="*", default=list(range(1, 11)),
                        help="Up to 10 selections")
    parser.add_argument("--selections", type=int, nargs="*", default=list(range(1, 11)),
                        help="Up to 10 selections")
    parser.add_argument("--meta-lr", type=float, default=0.0001,
                        help="Pre-trained meta step size")
    parser.add_argument("--lr", type=float, default=0.001,
                        help="Pre-trained learning rate")
    parser.add_argument("--metamethods", type=str, nargs="*", default=['BCE'],
                        help="Combination of BCE,BCE_Entropy,BCE_Distillation,Combined")
    parser.add_argument("--finetune-lr", type=float, default=0.1,
                        help="Finetune learning rate")
    parser.add_argument("--finetune-loss", type=str, default="bce",
                        help="Binary Cross Entropy loss (bce) or Weighted BCE (weightedbce)")
    parser.add_argument('--meta-epochs', type=int, default=300)
    parser.add_argument('--inner_epochs', type=int, default=20)
    parser.add_argument('--finetune-epochs', type=int, default=20)
    parser.add_argument('--statedictepoch', type=int, default=None,
                        help="Load saved parameters from pre-training epoch #")
    parser.add_argument('--numshots', type=int, nargs="*", default=[1])
    parser.add_argument("--pretrained-name", type=str, default='',
                        help="model name to be finetuned and evaluated")
    parser.add_argument("--finetune-name", type=str, default='',
                        help="finetuned model name")
    return parser
```
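For completeness, here is a minimal sketch of how I double-check which values the parser actually resolves to before evaluation. The small `__main__` helper below is my own addition for debugging, not part of the repository; it simply builds the parser above and prints the resulting configuration.

```python
# My own sanity-check helper (not part of the repo): print the resolved
# evaluation configuration produced by the parser defined above.
if __name__ == "__main__":
    args = addEvaluationArgs().parse_args([])  # empty list -> use the defaults shown above
    for name, value in sorted(vars(args).items()):
        print(f"{name:>25}: {value}")
```

This just echoes the defaults listed above, so it should match exactly what Evaluation_main.py receives when I run it without extra flags.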