blaz-r opened this issue 2 months ago
@djdameln any thoughts?
Are there any updates on this one?
@blaz-r, not to address this one specifically, but @djdameln is working on some changes which might resolve it.
Okay. I'll use the workaround for now and we'll see when those changes are added.
Describe the bug
Inside the engine code, the transforms from the datamodule and dataloader are taken before the ones from the model: https://github.com/openvinotoolkit/anomalib/blob/2bd2842ec33c6eedb351d53cf1a1082069ff69dc/src/anomalib/engine/engine.py#L382-L398
This can cause problems if the datamodule was never used inside the trainer. In that case, as the following code shows, the returned transforms are just a resize (since there is no trainer to take the correct model transforms from). This means any normalization defined by the model is silently skipped: https://github.com/openvinotoolkit/anomalib/blob/2bd2842ec33c6eedb351d53cf1a1082069ff69dc/src/anomalib/data/base/datamodule.py#L266-L277
This happens only when you call setup and the dataloader on the datamodule outside the trainer (since the trainer is not set in that case), like this:
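The original snippet was not captured here. As a stand-in, the situation can be sketched with hypothetical FakeModel / FakeDataModule classes (not the real anomalib API) that mimic the fallback in the linked datamodule code:

```python
# Stand-in sketch (hypothetical classes, not the real anomalib API) of what
# happens when setup() is called on a datamodule outside any trainer.

class FakeModel:
    # the model's own transforms would include normalization
    transform = "Resize(256) -> Normalize(mean, std)"

class FakeDataModule:
    def __init__(self):
        self.trainer = None  # never attached to a trainer
        self.transform = None

    def setup(self):
        # mirrors the linked datamodule code: with no trainer there is no
        # model to query, so only a plain resize is returned
        if self.trainer is None:
            self.transform = "Resize(256)"
        else:
            self.transform = self.trainer.model.transform

datamodule = FakeDataModule()
datamodule.setup()  # called outside any trainer
print(datamodule.transform)  # only the resize survives; normalization is lost
```

Since the engine then prefers the data-side transforms, this bare resize is what ends up being applied at inference time.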
There is also code below to reproduce this.
A workaround I have at the moment is doing this:
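The workaround snippet was likewise not captured. In the spirit of the description below (the data-side transform is cleared so the model's transforms are picked up), a stand-in sketch with hypothetical names, not the real anomalib API:

```python
# Stand-in sketch (hypothetical names, not the real anomalib API): if the
# data-side transform is cleared, a priority-based lookup falls through to
# the model's transforms, normalization included.

class FakeModel:
    transform = "Resize(256) -> Normalize(mean, std)"

class FakeDataset:
    transform = "Resize(256)"  # transform the datamodule attached

def resolve_transform(dataset_transform, model_transform):
    # data-side transforms take priority; None falls back to the model
    return dataset_transform if dataset_transform is not None else model_transform

dataset = FakeDataset()
dataset.transform = None  # the workaround: drop the data-side transform
print(resolve_transform(dataset.transform, FakeModel.transform))
```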
This way the dataloader transforms are ignored and the ones from the model are used. Another possibility I see is manually setting the transforms of the datamodule with model.configure_transforms(image_size).
Dataset
N/A
Model
N/A
Steps to reproduce the behavior
OS information
Expected behavior
I would expect that in this case the model transforms would take priority over the ones in the dataloader, but I see how this could cause trouble in the case of custom transforms inside the datamodule.
Screenshots
No response
Pip/GitHub
GitHub
What version/branch did you use?
1.2.0dev
Configuration YAML
Logs
Code of Conduct