HongxinXiang / ImageMol

ImageMol is a molecular image-based pre-training deep learning framework for computational drug discovery.
https://www.nature.com/articles/s42256-022-00557-6
MIT License

hyperparameter optimization #13

Closed SoodabehGhaffari closed 10 months ago

SoodabehGhaffari commented 1 year ago

Hello, I noticed that finetune.py takes inputs such as learning rate, batch size, and number of epochs (or uses the default values if we do not specify them). I was wondering how we can optimize these hyperparameters for our own dataset.

Thank you

HongxinXiang commented 1 year ago

I suggest sampling the learning rate (5e-4 to 0.5), batch size (8 to 128), and number of epochs (10 to 100) at equal intervals and performing a grid search over these hyperparameter settings. For efficiency, you can start with a coarser spacing to see which parameter ranges the model prefers, and then refine the hyperparameters within those ranges, as in the sketch below.
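One way to lay out such a coarse grid in Python (the specific values here are only an illustration of the equal-interval idea, not prescribed settings):

```python
import itertools

# Coarse grid over the suggested ranges; widen or narrow the spacing
# depending on how much compute is available.
learning_rates = [5e-4, 5e-3, 5e-2, 0.5]   # 5e-4 to 0.5
batch_sizes = [8, 32, 128]                 # 8 to 128
epoch_counts = [10, 50, 100]               # 10 to 100

# Every combination to try in the first (coarse) pass.
grid = list(itertools.product(learning_rates, batch_sizes, epoch_counts))
print(f"{len(grid)} runs in the coarse grid")

# After the coarse pass, pick the best-performing region and refine it,
# e.g. if lr=5e-3 looked best, search lr in [1e-3, 2.5e-3, 5e-3, 1e-2].
```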

SoodabehGhaffari commented 1 year ago

Could you please explain how I can combine the grid search with the SGD optimizer used in finetune.py? What modifications should I make in the code below?

```python
optimizer = torch.optim.SGD(
    filter(lambda x: x.requires_grad, model.parameters()),
    lr=args.lr,
    momentum=args.momentum,
    weight_decay=10 ** args.weight_decay,
)
if args.task_type == "classification":
    criterion = nn.BCEWithLogitsLoss(reduction="none")
elif args.task_type == "regression":
    criterion = nn.MSELoss()
else:
    raise Exception("param {} is not supported.".format(args.task_type))
```

HongxinXiang commented 1 year ago

You don't need to modify any code; you only need to change the hyperparameter values passed to finetune.py (such as --lr) for each run.
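A minimal way to drive the grid search without touching finetune.py is to launch it repeatedly with different command-line values. The sketch below assumes this workflow; --lr comes from the discussion above, while the other flag names (--batch, --epochs) are placeholders that should be replaced with the options actually defined in finetune.py's argument parser:

```python
import itertools
import subprocess

learning_rates = [5e-4, 5e-3, 5e-2, 0.5]
batch_sizes = [8, 32, 128]
epoch_counts = [10, 50, 100]

for lr, batch, epochs in itertools.product(learning_rates, batch_sizes, epoch_counts):
    # Hypothetical flag names except --lr; adapt to finetune.py's argparse options.
    cmd = [
        "python", "finetune.py",
        "--lr", str(lr),
        "--batch", str(batch),
        "--epochs", str(epochs),
    ]
    print("Running:", " ".join(cmd))
    subprocess.run(cmd, check=True)
    # Collect the validation metric from each run (e.g. from the saved logs
    # or checkpoints) and keep the best-performing combination.
```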