openvinotoolkit / anomalib

An anomaly detection library comprising state-of-the-art algorithms and features such as experiment management, hyper-parameter optimization, and edge inference.
https://anomalib.readthedocs.io/en/latest/
Apache License 2.0

šŸž v1: Engine's training entrypoint runs validation and test without checking whether validation and tests contain samples. #1576

Open samet-akcay opened 10 months ago

samet-akcay commented 10 months ago

Describe the bug

The Engine's training entrypoint runs validation and test without checking whether the validation and test sets contain any samples.

Dataset

Folder

Model

PADiM

Steps to reproduce the behavior

  1. Create a data config file using the following:
    class_path: anomalib.data.Folder
    init_args:
      root: "datasets/hazelnut_toy"
      normal_dir: "good"
      abnormal_dir: "crack"
      mask_dir: "mask/crack"
      normalization: imagenet
      test_split_mode: NONE
      val_split_mode: NONE
      seed: null
  2. Save this config as normal.yaml.
  3. Train a model via the following command:
    anomalib train --model Padim --data normal.yaml

OS information


Expected behavior

Ideally, the engine should check whether the validation and test sets contain any images before running the validation and test entrypoints.

Screenshots

No response

Pip/GitHub

GitHub

What version/branch did you use?

No response

Configuration YAML

class_path: anomalib.data.Folder
init_args:
  root: "datasets/hazelnut_toy"
  normal_dir: "good"
  abnormal_dir: "crack"
  mask_dir: "mask/crack"
  normalization: imagenet
  test_split_mode: NONE
  val_split_mode: NONE
  seed: null

Logs

The following error will appear:

ā•­ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ Traceback (most recent call last) ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā•®
ā”‚ /home/sakcay/.pyenv/versions/anomalib_v1/bin/anomalib:8 in <module>                              ā”‚
ā”‚                                                                                                  ā”‚
ā”‚   5 from anomalib.cli.cli import main                                                            ā”‚
ā”‚   6 if __name__ == '__main__':                                                                   ā”‚
ā”‚   7 ā”‚   sys.argv[0] = re.sub(r'(-script\.pyw|\.exe)?$', '', sys.argv[0])                         ā”‚
ā”‚ ā± 8 ā”‚   sys.exit(main())                                                                         ā”‚
ā”‚   9                                                                                              ā”‚
ā”‚                                                                                                  ā”‚
ā”‚ /home/sakcay/Projects/anomalib/src/anomalib/cli/cli.py:330 in main                               ā”‚
ā”‚                                                                                                  ā”‚
ā”‚   327 def main() -> None:                                                                        ā”‚
ā”‚   328 ā”‚   """Trainer via Anomalib CLI."""                                                        ā”‚
ā”‚   329 ā”‚   configure_logger()                                                                     ā”‚
ā”‚ ā± 330 ā”‚   AnomalibCLI()                                                                          ā”‚
ā”‚   331                                                                                            ā”‚
ā”‚   332                                                                                            ā”‚
ā”‚   333 if __name__ == "__main__":                                                                 ā”‚
ā”‚                                                                                                  ā”‚
ā”‚ /home/sakcay/Projects/anomalib/src/anomalib/cli/cli.py:61 in __init__                            ā”‚
ā”‚                                                                                                  ā”‚
ā”‚    58 ā”‚   ā”‚   run: bool = True,                                                                  ā”‚
ā”‚    59 ā”‚   ā”‚   auto_configure_optimizers: bool = True,                                            ā”‚
ā”‚    60 ā”‚   ) -> None:                                                                             ā”‚
ā”‚ ā±  61 ā”‚   ā”‚   super().__init__(                                                                  ā”‚
ā”‚    62 ā”‚   ā”‚   ā”‚   AnomalyModule,                                                                 ā”‚
ā”‚    63 ā”‚   ā”‚   ā”‚   AnomalibDataModule,                                                            ā”‚
ā”‚    64 ā”‚   ā”‚   ā”‚   save_config_callback,                                                          ā”‚
ā”‚                                                                                                  ā”‚
ā”‚ /home/sakcay/.pyenv/versions/3.11.6/envs/anomalib_v1/lib/python3.11/site-packages/lightning/pyto ā”‚
ā”‚ rch/cli.py:386 in __init__                                                                       ā”‚
ā”‚                                                                                                  ā”‚
ā”‚   383 ā”‚   ā”‚   self.instantiate_classes()                                                         ā”‚
ā”‚   384 ā”‚   ā”‚                                                                                      ā”‚
ā”‚   385 ā”‚   ā”‚   if self.subcommand is not None:                                                    ā”‚
ā”‚ ā± 386 ā”‚   ā”‚   ā”‚   self._run_subcommand(self.subcommand)                                          ā”‚
ā”‚   387 ā”‚                                                                                          ā”‚
ā”‚   388 ā”‚   def _setup_parser_kwargs(self, parser_kwargs: Dict[str, Any]) -> Tuple[Dict[str, Any   ā”‚
ā”‚   389 ā”‚   ā”‚   subcommand_names = self.subcommands().keys()                                       ā”‚
ā”‚                                                                                                  ā”‚
ā”‚ /home/sakcay/Projects/anomalib/src/anomalib/cli/cli.py:275 in _run_subcommand                    ā”‚
ā”‚                                                                                                  ā”‚
ā”‚   272 ā”‚   ā”‚   if self.config["subcommand"] in (*self.subcommands(), "train", "export"):          ā”‚
ā”‚   273 ā”‚   ā”‚   ā”‚   fn = getattr(self.engine, subcommand)                                          ā”‚
ā”‚   274 ā”‚   ā”‚   ā”‚   fn_kwargs = self._prepare_subcommand_kwargs(subcommand)                        ā”‚
ā”‚ ā± 275 ā”‚   ā”‚   ā”‚   fn(**fn_kwargs)                                                                ā”‚
ā”‚   276 ā”‚   ā”‚   else:                                                                              ā”‚
ā”‚   277 ā”‚   ā”‚   ā”‚   self.config_init = self.parser.instantiate_classes(self.config)                ā”‚
ā”‚   278 ā”‚   ā”‚   ā”‚   getattr(self, f"{subcommand}")()                                               ā”‚
ā”‚                                                                                                  ā”‚
ā”‚ /home/sakcay/Projects/anomalib/src/anomalib/engine/engine.py:450 in train                        ā”‚
ā”‚                                                                                                  ā”‚
ā”‚   447 ā”‚   ā”‚   """                                                                                ā”‚
ā”‚   448 ā”‚   ā”‚   self._setup_trainer(model)                                                         ā”‚
ā”‚   449 ā”‚   ā”‚   self._setup_dataset_task(train_dataloaders, val_dataloaders, test_dataloaders, d   ā”‚
ā”‚ ā± 450 ā”‚   ā”‚   self.trainer.fit(model, train_dataloaders, val_dataloaders, datamodule, ckpt_pat   ā”‚
ā”‚   451 ā”‚   ā”‚   self.trainer.test(model, test_dataloaders, ckpt_path=ckpt_path, datamodule=datam   ā”‚
ā”‚   452 ā”‚                                                                                          ā”‚
ā”‚   453 ā”‚   def export(                                                                            ā”‚
ā”‚                                                                                                  ā”‚
ā”‚ /home/sakcay/.pyenv/versions/3.11.6/envs/anomalib_v1/lib/python3.11/site-packages/lightning/pyto ā”‚
ā”‚ rch/trainer/trainer.py:544 in fit                                                                ā”‚
ā”‚                                                                                                  ā”‚
ā”‚    541 ā”‚   ā”‚   self.state.fn = TrainerFn.FITTING                                                 ā”‚
ā”‚    542 ā”‚   ā”‚   self.state.status = TrainerStatus.RUNNING                                         ā”‚
ā”‚    543 ā”‚   ā”‚   self.training = True                                                              ā”‚
ā”‚ ā±  544 ā”‚   ā”‚   call._call_and_handle_interrupt(                                                  ā”‚
ā”‚    545 ā”‚   ā”‚   ā”‚   self, self._fit_impl, model, train_dataloaders, val_dataloaders, datamodule,  ā”‚
ā”‚    546 ā”‚   ā”‚   )                                                                                 ā”‚
ā”‚    547                                                                                           ā”‚
ā”‚                                                                                                  ā”‚
ā”‚ /home/sakcay/.pyenv/versions/3.11.6/envs/anomalib_v1/lib/python3.11/site-packages/lightning/pyto ā”‚
ā”‚ rch/trainer/call.py:44 in _call_and_handle_interrupt                                             ā”‚
ā”‚                                                                                                  ā”‚
ā”‚    41 ā”‚   try:                                                                                   ā”‚
ā”‚    42 ā”‚   ā”‚   if trainer.strategy.launcher is not None:                                          ā”‚
ā”‚    43 ā”‚   ā”‚   ā”‚   return trainer.strategy.launcher.launch(trainer_fn, *args, trainer=trainer,    ā”‚
ā”‚ ā±  44 ā”‚   ā”‚   return trainer_fn(*args, **kwargs)                                                 ā”‚
ā”‚    45 ā”‚                                                                                          ā”‚
ā”‚    46 ā”‚   except _TunerExitException:                                                            ā”‚
ā”‚    47 ā”‚   ā”‚   _call_teardown_hook(trainer)                                                       ā”‚
ā”‚                                                                                                  ā”‚
ā”‚ /home/sakcay/.pyenv/versions/3.11.6/envs/anomalib_v1/lib/python3.11/site-packages/lightning/pyto ā”‚
ā”‚ rch/trainer/trainer.py:580 in _fit_impl                                                          ā”‚
ā”‚                                                                                                  ā”‚
ā”‚    577 ā”‚   ā”‚   ā”‚   model_provided=True,                                                          ā”‚
ā”‚    578 ā”‚   ā”‚   ā”‚   model_connected=self.lightning_module is not None,                            ā”‚
ā”‚    579 ā”‚   ā”‚   )                                                                                 ā”‚
ā”‚ ā±  580 ā”‚   ā”‚   self._run(model, ckpt_path=ckpt_path)                                             ā”‚
ā”‚    581 ā”‚   ā”‚                                                                                     ā”‚
ā”‚    582 ā”‚   ā”‚   assert self.state.stopped                                                         ā”‚
ā”‚    583 ā”‚   ā”‚   self.training = False                                                             ā”‚
ā”‚                                                                                                  ā”‚
ā”‚ /home/sakcay/.pyenv/versions/3.11.6/envs/anomalib_v1/lib/python3.11/site-packages/lightning/pyto ā”‚
ā”‚ rch/trainer/trainer.py:989 in _run                                                               ā”‚
ā”‚                                                                                                  ā”‚
ā”‚    986 ā”‚   ā”‚   # ----------------------------                                                    ā”‚
ā”‚    987 ā”‚   ā”‚   # RUN THE TRAINER                                                                 ā”‚
ā”‚    988 ā”‚   ā”‚   # ----------------------------                                                    ā”‚
ā”‚ ā±  989 ā”‚   ā”‚   results = self._run_stage()                                                       ā”‚
ā”‚    990 ā”‚   ā”‚                                                                                     ā”‚
ā”‚    991 ā”‚   ā”‚   # ----------------------------                                                    ā”‚
ā”‚    992 ā”‚   ā”‚   # POST-Training CLEAN UP                                                          ā”‚
ā”‚                                                                                                  ā”‚
ā”‚ /home/sakcay/.pyenv/versions/3.11.6/envs/anomalib_v1/lib/python3.11/site-packages/lightning/pyto ā”‚
ā”‚ rch/trainer/trainer.py:1035 in _run_stage                                                        ā”‚
ā”‚                                                                                                  ā”‚
ā”‚   1032 ā”‚   ā”‚   ā”‚   with isolate_rng():                                                           ā”‚
ā”‚   1033 ā”‚   ā”‚   ā”‚   ā”‚   self._run_sanity_check()                                                  ā”‚
ā”‚   1034 ā”‚   ā”‚   ā”‚   with torch.autograd.set_detect_anomaly(self._detect_anomaly):                 ā”‚
ā”‚ ā± 1035 ā”‚   ā”‚   ā”‚   ā”‚   self.fit_loop.run()                                                       ā”‚
ā”‚   1036 ā”‚   ā”‚   ā”‚   return None                                                                   ā”‚
ā”‚   1037 ā”‚   ā”‚   raise RuntimeError(f"Unexpected state {self.state}")                              ā”‚
ā”‚   1038                                                                                           ā”‚
ā”‚                                                                                                  ā”‚
ā”‚ /home/sakcay/.pyenv/versions/3.11.6/envs/anomalib_v1/lib/python3.11/site-packages/lightning/pyto ā”‚
ā”‚ rch/loops/fit_loop.py:198 in run                                                                 ā”‚
ā”‚                                                                                                  ā”‚
ā”‚   195 ā”‚   ā”‚   if self.skip:                                                                      ā”‚
ā”‚   196 ā”‚   ā”‚   ā”‚   return                                                                         ā”‚
ā”‚   197 ā”‚   ā”‚   self.reset()                                                                       ā”‚
ā”‚ ā± 198 ā”‚   ā”‚   self.on_run_start()                                                                ā”‚
ā”‚   199 ā”‚   ā”‚   while not self.done:                                                               ā”‚
ā”‚   200 ā”‚   ā”‚   ā”‚   try:                                                                           ā”‚
ā”‚   201 ā”‚   ā”‚   ā”‚   ā”‚   self.on_advance_start()                                                    ā”‚
ā”‚                                                                                                  ā”‚
ā”‚ /home/sakcay/.pyenv/versions/3.11.6/envs/anomalib_v1/lib/python3.11/site-packages/lightning/pyto ā”‚
ā”‚ rch/loops/fit_loop.py:320 in on_run_start                                                        ā”‚
ā”‚                                                                                                  ā”‚
ā”‚   317 ā”‚   ā”‚   # reload the evaluation dataloaders too for proper display in the progress bar     ā”‚
ā”‚   318 ā”‚   ā”‚   if self.epoch_loop._should_check_val_epoch() and trainer.val_dataloaders is None   ā”‚
ā”‚   319 ā”‚   ā”‚   ā”‚   trainer.validating = True                                                      ā”‚
ā”‚ ā± 320 ā”‚   ā”‚   ā”‚   self.epoch_loop.val_loop.setup_data()                                          ā”‚
ā”‚   321 ā”‚   ā”‚   ā”‚   trainer.training = True                                                        ā”‚
ā”‚   322 ā”‚   ā”‚                                                                                      ā”‚
ā”‚   323 ā”‚   ā”‚   call._call_callback_hooks(trainer, "on_train_start")                               ā”‚
ā”‚                                                                                                  ā”‚
ā”‚ /home/sakcay/.pyenv/versions/3.11.6/envs/anomalib_v1/lib/python3.11/site-packages/lightning/pyto ā”‚
ā”‚ rch/loops/evaluation_loop.py:165 in setup_data                                                   ā”‚
ā”‚                                                                                                  ā”‚
ā”‚   162 ā”‚   ā”‚                                                                                      ā”‚
ā”‚   163 ā”‚   ā”‚   stage = self._stage                                                                ā”‚
ā”‚   164 ā”‚   ā”‚   source = self._data_source                                                         ā”‚
ā”‚ ā± 165 ā”‚   ā”‚   dataloaders = _request_dataloader(source)                                          ā”‚
ā”‚   166 ā”‚   ā”‚   trainer.strategy.barrier(f"{stage.dataloader_prefix}_dataloader()")                ā”‚
ā”‚   167 ā”‚   ā”‚                                                                                      ā”‚
ā”‚   168 ā”‚   ā”‚   if not isinstance(dataloaders, CombinedLoader):                                    ā”‚
ā”‚                                                                                                  ā”‚
ā”‚ /home/sakcay/.pyenv/versions/3.11.6/envs/anomalib_v1/lib/python3.11/site-packages/lightning/pyto ā”‚
ā”‚ rch/trainer/connectors/data_connector.py:342 in _request_dataloader                              ā”‚
ā”‚                                                                                                  ā”‚
ā”‚   339 ā”‚   ā”‚   # attributes on the instance in case the dataloader needs to be re-instantiated    ā”‚
ā”‚   340 ā”‚   ā”‚   # Also, it records all attribute setting and deletion using patched `__setattr__   ā”‚
ā”‚   341 ā”‚   ā”‚   # methods so that the re-instantiated object is as close to the original as poss   ā”‚
ā”‚ ā± 342 ā”‚   ā”‚   return data_source.dataloader()                                                    ā”‚
ā”‚   343                                                                                            ā”‚
ā”‚   344                                                                                            ā”‚
ā”‚   345 @dataclass                                                                                 ā”‚
ā”‚                                                                                                  ā”‚
ā”‚ /home/sakcay/.pyenv/versions/3.11.6/envs/anomalib_v1/lib/python3.11/site-packages/lightning/pyto ā”‚
ā”‚ rch/trainer/connectors/data_connector.py:309 in dataloader                                       ā”‚
ā”‚                                                                                                  ā”‚
ā”‚   306 ā”‚   ā”‚   ā”‚   return call._call_lightning_module_hook(self.instance.trainer, self.name, pl   ā”‚
ā”‚   307 ā”‚   ā”‚   if isinstance(self.instance, pl.LightningDataModule):                              ā”‚
ā”‚   308 ā”‚   ā”‚   ā”‚   assert self.instance.trainer is not None                                       ā”‚
ā”‚ ā± 309 ā”‚   ā”‚   ā”‚   return call._call_lightning_datamodule_hook(self.instance.trainer, self.name   ā”‚
ā”‚   310 ā”‚   ā”‚   assert self.instance is not None                                                   ā”‚
ā”‚   311 ā”‚   ā”‚   return self.instance                                                               ā”‚
ā”‚   312                                                                                            ā”‚
ā”‚                                                                                                  ā”‚
ā”‚ /home/sakcay/.pyenv/versions/3.11.6/envs/anomalib_v1/lib/python3.11/site-packages/lightning/pyto ā”‚
ā”‚ rch/trainer/call.py:179 in _call_lightning_datamodule_hook                                       ā”‚
ā”‚                                                                                                  ā”‚
ā”‚   176 ā”‚   fn = getattr(trainer.datamodule, hook_name)                                            ā”‚
ā”‚   177 ā”‚   if callable(fn):                                                                       ā”‚
ā”‚   178 ā”‚   ā”‚   with trainer.profiler.profile(f"[LightningDataModule]{trainer.datamodule.__class   ā”‚
ā”‚ ā± 179 ā”‚   ā”‚   ā”‚   return fn(*args, **kwargs)                                                     ā”‚
ā”‚   180 ā”‚   return None                                                                            ā”‚
ā”‚   181                                                                                            ā”‚
ā”‚   182                                                                                            ā”‚
ā”‚                                                                                                  ā”‚
ā”‚ /home/sakcay/Projects/anomalib/src/anomalib/data/base/datamodule.py:196 in val_dataloader        ā”‚
ā”‚                                                                                                  ā”‚
ā”‚   193 ā”‚   def val_dataloader(self) -> EVAL_DATALOADERS:                                          ā”‚
ā”‚   194 ā”‚   ā”‚   """Get validation dataloader."""                                                   ā”‚
ā”‚   195 ā”‚   ā”‚   return DataLoader(                                                                 ā”‚
ā”‚ ā± 196 ā”‚   ā”‚   ā”‚   dataset=self.val_data,                                                         ā”‚
ā”‚   197 ā”‚   ā”‚   ā”‚   shuffle=False,                                                                 ā”‚
ā”‚   198 ā”‚   ā”‚   ā”‚   batch_size=self.eval_batch_size,                                               ā”‚
ā”‚   199 ā”‚   ā”‚   ā”‚   num_workers=self.num_workers,                                                  ā”‚
ā•°ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā”€ā•Æ
AttributeError: 'Folder' object has no attribute 'val_data'


JoannaCCJH commented 9 months ago

Hi! I would like to work on this issue!

samet-akcay commented 9 months ago

Sure, thanks for your interest!

JoannaCCJH commented 8 months ago

@samet-akcay @blaz-r Hi! I just want to make sure I understand the task correctly. I need to add code, above the following segment, that checks whether the validation and test sets contain samples. Here are the steps I plan to take:

  1. check whether val_dataloaders or datamodule.val_dataloader() contains samples
  2. check whether test_dataloaders or datamodule.test_dataloader() contains samples
  3. if they don't contain any samples, should I add an output message or just skip executing self.trainer.validate() or self.trainer.test()?
      self._setup_transform(model, datamodule=datamodule, ckpt_path=ckpt_path)
      if model.learning_type in [LearningType.ZERO_SHOT, LearningType.FEW_SHOT]:
          # if the model is zero-shot or few-shot, we only need to run validate for normalization and thresholding
          self.trainer.validate(model, val_dataloaders, None, verbose=False, datamodule=datamodule)
      else:
          self.trainer.fit(model, train_dataloaders, val_dataloaders, datamodule, ckpt_path)
      self.trainer.test(model, test_dataloaders, ckpt_path=ckpt_path, datamodule=datamodule)
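For question 3, a common pattern is to skip the stage and log a warning rather than fail silently. A minimal sketch of that idea; `run_test_stage` and `FakeTrainer` here are illustrative stand-ins, not actual anomalib or Lightning APIs:

```python
import logging

logger = logging.getLogger("anomalib.engine")


class FakeTrainer:
    """Illustrative stand-in for the Lightning trainer."""
    def __init__(self):
        self.test_ran = False

    def test(self):
        self.test_ran = True


def run_test_stage(trainer, has_test_samples: bool) -> None:
    """Run the test entrypoint only when there is data to test on."""
    if not has_test_samples:
        logger.warning("No test samples found; skipping trainer.test().")
        return
    trainer.test()


trainer = FakeTrainer()
run_test_stage(trainer, has_test_samples=False)
print(trainer.test_ran)  # False: the test stage was skipped

run_test_stage(trainer, has_test_samples=True)
print(trainer.test_ran)  # True: the test stage ran normally
```

Logging the skip keeps the behavior discoverable for users who expected a test report.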
JoannaCCJH commented 8 months ago

@samet-akcay @blaz-r

I added a few lines to check whether val_dataloaders or the datamodule contain validation and test images. Am I on the right track?

        has_val_samples = bool(val_dataloaders) or hasattr(datamodule, "val_data")
        has_test_samples = bool(test_dataloaders) or hasattr(datamodule, "test_data")

        if model.learning_type in [LearningType.ZERO_SHOT, LearningType.FEW_SHOT]:
            # if the model is zero-shot or few-shot, we only need to run validate for normalization and thresholding
            self.trainer.validate(model, val_dataloaders, None, verbose=False, datamodule=datamodule)
        else:
            self.trainer.fit(model, train_dataloaders, val_dataloaders, datamodule, ckpt_path)
        self.trainer.test(model, test_dataloaders, ckpt_path=ckpt_path, datamodule=datamodule)
blaz-r commented 8 months ago

@JoannaCCJH I think this is going in the right direction.

JoannaCCJH commented 8 months ago

Hi! @samet-akcay @blaz-r I've made changes to the code provided, and the modified code is attached below. While the code works for the given example, I encountered some problems during the process.

  1. When calling the train method from the command line (anomalib train --model Padim --data normal.yaml), the data module (datamodule) wasn't being set up automatically. Adding the following lines resolved the 'Folder' object has no attribute 'train_data' error:

    if datamodule is not None:
        datamodule.setup()

    Question: I'm wondering whether setting up the data module at this point is appropriate, or whether there's a better place for it.

  2. Without adding the image_size: (256,256) argument to normal.yaml, the following error occurred:

    RuntimeError: stack expects each tensor to be equal size, but got [3, 512, 512] at entry 0 and [3, 522, 512] at entry 21

    Question: I've seen other people run into this problem as well. Should this be fixed?

  3. I'm unsure if my implementation is sufficiently clear and tidy.

    def train():
        .....

        if datamodule is not None:
            datamodule.setup()

        has_val_samples = bool(val_dataloaders) or hasattr(datamodule, "val_data")
        has_test_samples = bool(test_dataloaders) or hasattr(datamodule, "test_data")

        if model.learning_type in [LearningType.ZERO_SHOT, LearningType.FEW_SHOT] and has_val_samples:
            # if the model is zero-shot or few-shot, we only need to run validate for normalization and thresholding
            self.trainer.validate(model, val_dataloaders, None, verbose=False, datamodule=datamodule)
        else:
            if has_val_samples:
                self.trainer.fit(model, train_dataloaders, val_dataloaders, datamodule, ckpt_path)
            else:
                train_dataloader = train_dataloaders if train_dataloaders is not None else datamodule.train_dataloader()
                self.trainer.fit(model, train_dataloaders=train_dataloader, ckpt_path=ckpt_path)

        if has_test_samples:
            self.trainer.test(model, test_dataloaders, ckpt_path=ckpt_path, datamodule=datamodule)
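One caveat with the hasattr-based checks above: depending on how the splits are produced, an attribute like `val_data` may exist on the datamodule yet hold zero samples, so a length check is safer. A quick illustration with a hypothetical stand-in object (not an anomalib class):

```python
class DataModuleWithEmptyVal:
    """Illustrative stand-in: the split attribute exists but is empty."""
    def __init__(self):
        self.val_data = []  # e.g. a split that produced zero images


dm = DataModuleWithEmptyVal()
print(hasattr(dm, "val_data"))  # True: hasattr alone would say "run validation"


def has_val_samples(datamodule) -> bool:
    """Safer check: the attribute must exist *and* be non-empty."""
    data = getattr(datamodule, "val_data", None)
    return data is not None and len(data) > 0


print(has_val_samples(dm))  # False: the empty split is caught
```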
JoannaCCJH commented 8 months ago
[Screenshot attached: Screen Shot 2024-03-11 at 1 21 50 AM]
samet-akcay commented 8 months ago

@djdameln, can you provide your insight here?