jdeschamps closed this 2 months ago
Codecov report: all modified and coverable lines are covered by tests :white_check_mark:. Project coverage is 87.06%, comparing base (929b9b8) to head (2963f62).
I typed the transforms differently in the end, and added further refactoring and some error raising!
Description
Following https://github.com/CAREamics/careamics/issues/231, there is a lack of flexibility in how users can specify transforms in the config convenience functions: the choice comes down to either the default augmentations or no augmentations at all.
In passing, I also added the `dataloader_params` parameter by request of students during the MBL course.

This PR does a few things:
- Moves the `TRANSFORMS_UNION` type into its own module in order to find it more easily.
- Adds an `augmentations` parameter that allows users to pass non-default augmentations, replacing `use_augmentations`.
- Adds `dataloader_params` to allow setting PyTorch dataloader parameters (as expected by the configuration).
- Renames `model_kwargs` to `model_params` for clarity.

Summary:
Changes Made
- Added `transform_union.py` to hold `TRANSFORMS_UNION`.
- Modified `configuration_factory.py`, the corresponding tests, and the imports of `TRANSFORMS_UNION`.

Related Issues
- https://github.com/CAREamics/careamics/issues/231
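The `TRANSFORMS_UNION` move described under Changes Made can be sketched as a module whose only job is to define the union type in one place; the transform classes here are placeholders, not the actual CAREamics models:

```python
# transform_union.py (illustrative): a single module holding the union type,
# so every consumer imports it from one place. Class names are placeholders.
from dataclasses import dataclass
from typing import Union

@dataclass
class XYFlip:
    flip_x: bool = True
    flip_y: bool = True

@dataclass
class N2VManipulate:
    roi_size: int = 11

# The alias other modules import instead of rebuilding the union themselves.
TRANSFORMS_UNION = Union[XYFlip, N2VManipulate]
```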
Breaking changes

Any call that uses the convenience functions together with the `model_kwargs` parameter will break, since it was renamed to `model_params`.

This PR will require an update to the docs and the examples.
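The breaking change above can be sketched as follows; the function and its defaults are illustrative, not the real signature:

```python
# Illustrative only: a convenience function after the rename. Passing the
# old `model_kwargs` keyword now raises a TypeError.
from typing import Optional

def create_configuration(model_params: Optional[dict] = None) -> dict:
    # merge user model parameters over an assumed default architecture
    return {"model": {"architecture": "UNet", **(model_params or {})}}

# Before the rename (now fails): create_configuration(model_kwargs={"depth": 3})
cfg = create_configuration(model_params={"depth": 3})
```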