KarahanS / Self-Supervised-Learning-for-Medical-Image-Analysis

SSL for Medical Image Analysis

Feature/configs #11

Closed kivanctezoren closed 5 months ago

kivanctezoren commented 5 months ago
KarahanS commented 5 months ago

LGTM. We talked about enabling custom augmentations so that users can try out their own augmentation sequences. Do you have any idea how to incorporate that? I thought of adding a "CUSTOM" flag to the augmentations. If provided, we could display a list of augmentations to the user in the terminal and ask for a sequence of numbers, where each number corresponds to an augmentation. But adding the specific details is a little cumbersome - for example, the contrast level of the color jitter. Additionally, such interactivity may not be practical (if the job is sent to a cluster, for example, where there is limited access to the terminal).

Or, we could do it in the config file as well, where a user adds different augmentations (from a pre-specified list) to the YAML file. They could provide details or use the defaults. But there is a huge number of augmentation methods out there, so maybe it is better to keep this separate.

Or, we don't provide the feature directly: a developer can add their augmentation sequence to augmentations.py to try out new combinations. If the first two approaches make the configuration unnecessarily complicated, we should go for this one.
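To make the third option concrete, here is a minimal sketch of what registering a sequence in augmentations.py could look like. The names (`AUGMENTATION_SEQUENCES`, `register_sequence`) and the `(transform_name, params)` step format are hypothetical, not the project's actual API:

```python
# Hypothetical registry in augmentations.py: a developer adds a new
# named sequence here instead of going through the config system.
AUGMENTATION_SEQUENCES = {
    # each step is (transform_name, params); params override defaults
    "default": [
        ("random_resized_crop", {}),
        ("random_horizontal_flip", {}),
    ],
}

def register_sequence(name, steps):
    """Register a named list of (transform_name, params) pairs."""
    if name in AUGMENTATION_SEQUENCES:
        raise ValueError(f"augmentation sequence {name!r} already exists")
    AUGMENTATION_SEQUENCES[name] = list(steps)

# A developer trying out a new combination:
register_sequence("my_experiment", [
    ("color_jitter", {"contrast": 0.4}),
    ("random_horizontal_flip", {}),
])
```

The upside is that nothing in the config format changes; the downside is that every new combination requires editing the code.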

KarahanS commented 5 months ago

Btw, we can add a new task for it if it requires high effort. Let me know whether you want to improve this PR or handle it as a separate task.

kivanctezoren commented 5 months ago

We talked about enabling custom augmentations so that users can try out their own augmentation sequence. Do you have any idea how to incorporate that?

Currently, only the predetermined augmentation sequences from the AugmentationSequenceType enum are accepted; however, there is a TODO in config.py to accept custom lists of torchvision.transforms in the config. The config allows passing lists of arbitrary sizes and parameters, so we could incorporate something like this:

  train_transform:
    -
      transform_name: "pad"
      transform_params:
        fill: 0
        mode: "constant"
    -
      transform_name: "random_resized_crop"
      transform_params: null
    -
      transform_name: "random_horizontal_flip"
      transform_params: null
  test_transform:
    -
      transform_name: "center_crop"
      transform_params: null

Since passing a list is different from passing a string, this wouldn't affect the current feature either. We could use enums for transform_name in this example as well, but since there are so many transforms, I think using the transformation class name itself as a string is the better option here.

I can work on this. If we agree to merge this branch, I can add the task (and actually a couple more, such as model checkpoint passing for Pretrained) and implement them. Or, if you do not need the config system immediately, I can submit another PR with these features implemented; in that case we can reject this PR for now.

KarahanS commented 5 months ago

Great, agreed. Let's make it a separate task assigned to you.