openvinotoolkit / anomalib

An anomaly detection library comprising state-of-the-art algorithms and features such as experiment management, hyper-parameter optimization, and edge inference.
https://anomalib.readthedocs.io/en/latest/
Apache License 2.0

[Bug]: Issue with `self.model_size` Initialization in `EfficientAd` Class #2157

Open jrusbo opened 1 week ago

jrusbo commented 1 week ago

### Describe the bug

I encountered an issue while using the `EfficientAd` class in the Anomalib library. The `self.model_size` attribute is set to a plain string value (`"small"` or `"medium"`) during initialization. This causes a failure when `prepare_pretrained_model` is called, specifically at the line that accesses `self.model_size.value`, since a string object has no `value` attribute.
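
Here is a minimal standalone illustration of the mismatch (the enum below is just a stand-in I wrote for this example, not the anomalib source; its member names and values are assumed from the config options and the weight filenames):

```python
from enum import Enum


# Stand-in for anomalib's EfficientAdModelSize enum (assumed member names/values).
class EfficientAdModelSize(str, Enum):
    S = "small"
    M = "medium"


enum_size = EfficientAdModelSize.S
str_size = "small"  # what self.model_size currently holds after __init__

print(f"pretrained_teacher_{enum_size.value}.pth")  # works: pretrained_teacher_small.pth

try:
    print(f"pretrained_teacher_{str_size.value}.pth")  # what prepare_pretrained_model effectively does
except AttributeError as err:
    print(err)  # 'str' object has no attribute 'value'
```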

### Dataset

Folder

### Model

Other (please specify in the field below)

### Steps to reproduce the behavior

  1. Create a configuration file in which all model parameters are specified (see the YAML below).
  2. Initialize the `EfficientAd` class using the `get_model` function inside `src/anomalib/models/__init__.py`.
  3. Call the `fit` function of the engine (a shorter reproduction sketch follows this list).
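
For what it's worth, the error can probably be reproduced without the full engine run, since `on_train_start` just calls `prepare_pretrained_model()` with no arguments (sketch only; the call may first download the pretrained teacher weights):

```python
# Shorter reproduction sketch, assuming the constructor accepts the same
# model_size string as in the YAML config below.
from anomalib.models import EfficientAd

model = EfficientAd(model_size="small")  # stored internally as a plain str
model.prepare_pretrained_model()         # expected to raise AttributeError: 'str' object has no attribute 'value'
```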

### OS information

_No response_

### Expected behavior

`self.model_size` should be initialized as an instance of `EfficientAdModelSize` rather than a plain string. This ensures that the `value` attribute is accessible.

I think it should look like this:

```python
self.model_size = EfficientAdModelSize(model_size)
```
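
For context, a quick standalone check of why this cast would work for both input types (the enum below is a stand-in I wrote for this example; its member names and values are assumed from the config options and the weight filenames):

```python
from enum import Enum


class EfficientAdModelSize(str, Enum):  # stand-in, not the anomalib definition
    S = "small"
    M = "medium"


# Calling the enum normalizes both kinds of input the constructor might receive:
assert EfficientAdModelSize("small") is EfficientAdModelSize.S                  # plain string from YAML
assert EfficientAdModelSize(EfficientAdModelSize.M) is EfficientAdModelSize.M   # already an enum member
assert EfficientAdModelSize("small").value == "small"                           # .value resolves for the filename

try:
    EfficientAdModelSize("large")  # typos / unsupported sizes would fail fast at init
except ValueError as err:
    print(err)  # 'large' is not a valid EfficientAdModelSize
```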

### Screenshots

_No response_

### Pip/GitHub

pip

### What version/branch did you use?

_No response_

### Configuration YAML

```yaml
model:
  class_path: anomalib.models.EfficientAd
  init_args:
    imagenet_dir: datasets/imagenette
    teacher_out_channels: 384
    model_size: small # options: [small, medium]
    lr: 0.0001
    weight_decay: 0.00001
    padding: false
    pad_maps: true # relevant for "padding: false", see EfficientAd in lightning_model.py
```

### Logs

```
  File "src\anomaly_detection\train_anomalib.py", line 89, in train
    engine.fit(model=model, datamodule=datamodule)
  File "venv2\Lib\site-packages\anomalib\engine\engine.py", line 540, in fit
    self.trainer.fit(model, train_dataloaders, val_dataloaders, datamodule, ckpt_path)
  File "venv2\Lib\site-packages\mlflow\utils\autologging_utils\safety.py", line 576, in safe_patch_function
    patch_function(call_original, *args, **kwargs)
  File "venv2\Lib\site-packages\mlflow\utils\autologging_utils\safety.py", line 249, in patch_with_managed_run
    result = patch_function(original, *args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "venv2\Lib\site-packages\mlflow\pytorch\_lightning_autolog.py", line 537, in patched_fit
    result = original(self, *args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "venv2\Lib\site-packages\mlflow\utils\autologging_utils\safety.py", line 557, in call_original
    return call_original_fn_with_event_logging(_original_fn, og_args, og_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "venv2\Lib\site-packages\mlflow\utils\autologging_utils\safety.py", line 492, in call_original_fn_with_event_logging
    original_fn_result = original_fn(*og_args, **og_kwargs)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "venv2\Lib\site-packages\mlflow\utils\autologging_utils\safety.py", line 554, in _original_fn
    original_result = original(*_og_args, **_og_kwargs)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "venv2\Lib\site-packages\lightning\pytorch\trainer\trainer.py", line 544, in fit
    call._call_and_handle_interrupt(
  File "venv2\Lib\site-packages\lightning\pytorch\trainer\call.py", line 44, in _call_and_handle_interrupt
    return trainer_fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "venv2\Lib\site-packages\lightning\pytorch\trainer\trainer.py", line 580, in _fit_impl
    self._run(model, ckpt_path=ckpt_path)
  File "venv2\Lib\site-packages\lightning\pytorch\trainer\trainer.py", line 987, in _run
    results = self._run_stage()
              ^^^^^^^^^^^^^^^^^
  File "venv2\Lib\site-packages\lightning\pytorch\trainer\trainer.py", line 1033, in _run_stage
    self.fit_loop.run()
  File "venv2\Lib\site-packages\lightning\pytorch\loops\fit_loop.py", line 201, in run
    self.on_run_start()
  File "venv2\Lib\site-packages\lightning\pytorch\loops\fit_loop.py", line 328, in on_run_start
    call._call_lightning_module_hook(trainer, "on_train_start")
  File "venv2\Lib\site-packages\lightning\pytorch\trainer\call.py", line 157, in _call_lightning_module_hook
    output = fn(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^
  File "venv2\Lib\site-packages\anomalib\models\image\efficient_ad\lightning_model.py", line 251, in on_train_start
    self.prepare_pretrained_model()
  File "venv2\Lib\site-packages\anomalib\models\image\efficient_ad\lightning_model.py", line 93, in prepare_pretrained_model
    pretrained_models_dir / "efficientad_pretrained_weights" / f"pretrained_teacher_{self.model_size.value}.pth"
                                                                                     ^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'str' object has no attribute 'value'
```


Gornoka commented 1 week ago

If I see it correctly, this change would fix the issue:

```python
model_size_str = self.model_size.value if isinstance(self.model_size, EfficientAdModelSize) else self.model_size
teacher_path = (
    pretrained_models_dir / "efficientad_pretrained_weights" / f"pretrained_teacher_{model_size_str}.pth"
)
```

I think it would be cleaner to cast the param into the enum on init; this would also make the type hinting more intuitive.
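
Roughly, something like this (a standalone sketch with a stand-in enum; not the actual anomalib constructor signature, whose other arguments are omitted here):

```python
from enum import Enum


class EfficientAdModelSize(str, Enum):  # stand-in; member names/values assumed from the thread
    S = "small"
    M = "medium"


class EfficientAd:  # hypothetical sketch, not the real anomalib class
    def __init__(self, model_size: EfficientAdModelSize | str = EfficientAdModelSize.S) -> None:
        # Normalize once at init; everything downstream (e.g. prepare_pretrained_model)
        # can rely on the enum and its .value, and no isinstance checks are needed.
        self.model_size = EfficientAdModelSize(model_size)


print(EfficientAd(model_size="small").model_size.value)                  # small
print(EfficientAd(model_size=EfficientAdModelSize.M).model_size.value)   # medium
```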

CarlosNacher commented 3 days ago

Hi, set `"S"` / `"M"` instead of `"small"` / `"medium"` in the `config.yaml` file.

alexriedel1 commented 3 days ago

> Hi, set `"S"` / `"M"` instead of `"small"` / `"medium"` in the `config.yaml` file.

This won't help, as `"S"` or `"M"` are strings too.
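
Quick check, for illustration:

```python
# Whichever string is used, it is still a str, so the attribute lookup fails the same way:
for size in ("small", "medium", "S", "M"):
    print(size, hasattr(size, "value"))  # False for all of them
```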

CarlosNacher commented 3 days ago

> > Hi, set `"S"` / `"M"` instead of `"small"` / `"medium"` in the `config.yaml` file.
>
> This won't help, as `"S"` or `"M"` are strings too.

Totally right, I misunderstood the key question of the issue, sorry.