mobaidoctor / med-ddpm

GNU General Public License v3.0

Model does not converge even after 50K iterations #37

Open balarama86 opened 1 month ago

balarama86 commented 1 month ago

Hi, thanks for the great work! I am training the model with my own brain vessel dataset, and I prepared all the segmentation masks correctly. I changed the 'num_channels' parameter from 64 to 32 due to resource constraints. The result samples I am seeing are very noisy even after 50K iterations, and the loss does not go below 0.15. The noisy results also do not pick up brain features such as the thalamus, corpus callosum, etc.; the output samples show only vessels, while the remaining brain area is filled with gray noise. Is there any way to fine-tune this model to generate a brain vessel dataset?

Thanks!

mobaidoctor commented 3 weeks ago

Thank you for your interest in our work and for your inquiry. Are you training on a single modality or multiple modalities at the same time? If you are working with a single modality, we highly recommend keeping the input channels at 64. Based on our experimental observations, reducing the number of input channels can result in blurry outputs, indicating that the model may not be learning enough from specific feature details. Therefore, 64 input channels are recommended.

If your model has not converged after 50K iterations, how many data samples do you have in your training dataset? If you have more than 1,000 images, we suggest training your model for at least 100K iterations (about 100 epochs). If your loss plateaus and no longer decreases, consider fine-tuning your model with a hybrid loss function, such as a combination of L1 and L2 losses with different weights. Also, adjust your learning rate accordingly.

If your model outputs are too noisy, consider changing the clamp scale range to (0, 1) instead of (-1, 1) in Trainer.py. Different hyperparameter setups can help your model converge faster and better suit the specific requirements of your dataset. If you have any further questions, please don't hesitate to contact us.
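
The hybrid loss and the clamp change suggested above can be sketched as follows. This is a minimal, framework-agnostic illustration in NumPy (med-ddpm itself uses PyTorch, where `torch.mean`, `torch.abs`, and `Tensor.clamp` play the same roles); the function name `hybrid_loss` and the 0.5/0.5 weights are illustrative assumptions, not values from the repository.

```python
import numpy as np

def hybrid_loss(pred, target, l1_weight=0.5, l2_weight=0.5):
    """Weighted combination of L1 (mean absolute error) and L2 (mean
    squared error), as suggested for plateauing loss.

    The weights are illustrative defaults; tune them for your dataset.
    """
    diff = pred - target
    l1 = np.mean(np.abs(diff))
    l2 = np.mean(diff ** 2)
    return l1_weight * l1 + l2_weight * l2

pred = np.array([0.2, 0.8, 0.5])
target = np.array([0.0, 1.0, 0.5])
loss = hybrid_loss(pred, target)
# L1 = 0.1333..., L2 = 0.0266...; weighted sum -> 0.08
print(loss)

# Clamping sampled outputs to (0, 1) instead of (-1, 1), as suggested
# for noisy outputs; np.clip is the NumPy analogue of Tensor.clamp.
noisy_sample = np.array([-1.5, 0.3, 1.2])
clamped = np.clip(noisy_sample, 0.0, 1.0)
print(clamped)  # [0.  0.3 1. ]
```

In PyTorch the clamp change would amount to replacing a `clamp(-1, 1)` call on the sampled tensor with `clamp(0, 1)` in Trainer.py, so the output range matches data normalized to [0, 1].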