Below is the training log. The error says: The provided lr scheduler StepLR doesn't follow PyTorch's LRScheduler API. You should override the LightningModule.lr_scheduler_step hook with your own logic if you are using a custom LR scheduler.
[2024-07-04 20:05:01,066][src.utils.utils][INFO] - Enforcing tags!
[2024-07-04 20:05:01,084][src.utils.utils][INFO] - Printing config tree with Rich!
[2024-07-04 20:05:02,466][main][INFO] - Instantiating datamodule
[2024-07-04 20:05:04,850][main][INFO] - Instantiating model
[2024-07-04 20:05:41,272][main][INFO] - Instantiating callbacks...
[2024-07-04 20:05:41,273][src.utils.utils][INFO] - Instantiating callback
[2024-07-04 20:05:41,284][src.utils.utils][INFO] - Instantiating callback
[2024-07-04 20:05:41,286][src.utils.utils][INFO] - Instantiating callback
[2024-07-04 20:05:41,287][main][INFO] - Instantiating loggers...
[2024-07-04 20:05:41,288][src.utils.utils][INFO] - Instantiating logger
[2024-07-04 20:05:41,293][src.utils.utils][INFO] - Instantiating logger
[2024-07-04 20:05:41,298][main][INFO] - Instantiating trainer
[2024-07-04 20:05:41,316][main][INFO] - Logging hyperparameters!
[2024-07-04 20:05:41,371][main][INFO] - Starting training!
[2024-07-04 20:05:41,700][root][INFO] - Train dataset loaded, size: 1543
[2024-07-04 20:05:41,914][root][INFO] - Validate dataset loaded, size: 398
[2024-07-04 20:05:42,152][src.utils.utils][ERROR] -
Traceback (most recent call last):
File "/home/jd/PointCompletion/CasFusionNet/src/utils/utils.py", line 38, in wrap
metric_dict, object_dict = task_func(cfg=cfg)
File "/home/jd/PointCompletion/CasFusionNet/src/train.py", line 84, in train
trainer.fit(model=model, datamodule=datamodule, ckpt_path=cfg.get("ckpt_path"))
File "/home/jd/miniconda3/envs/pointcomplt/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 770, in fit
self._call_and_handle_interrupt(
File "/home/jd/miniconda3/envs/pointcomplt/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 723, in _call_and_handle_interrupt
return trainer_fn(*args, **kwargs)
File "/home/jd/miniconda3/envs/pointcomplt/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 811, in _fit_impl
results = self._run(model, ckpt_path=self.ckpt_path)
File "/home/jd/miniconda3/envs/pointcomplt/lib/python3.8/site-packages/pytorch_lightning/trainer/trainer.py", line 1217, in _run
self.strategy.setup(self)
File "/home/jd/miniconda3/envs/pointcomplt/lib/python3.8/site-packages/pytorch_lightning/strategies/single_device.py", line 72, in setup
super().setup(trainer)
File "/home/jd/miniconda3/envs/pointcomplt/lib/python3.8/site-packages/pytorch_lightning/strategies/strategy.py", line 139, in setup
self.setup_optimizers(trainer)
File "/home/jd/miniconda3/envs/pointcomplt/lib/python3.8/site-packages/pytorch_lightning/strategies/strategy.py", line 128, in setup_optimizers
self.optimizers, self.lr_scheduler_configs, self.optimizer_frequencies = _init_optimizers_and_lr_schedulers(
File "/home/jd/miniconda3/envs/pointcomplt/lib/python3.8/site-packages/pytorch_lightning/core/optimizer.py", line 195, in _init_optimizers_and_lr_schedulers
_validate_scheduler_api(lr_scheduler_configs, model)
File "/home/jd/miniconda3/envs/pointcomplt/lib/python3.8/site-packages/pytorch_lightning/core/optimizer.py", line 350, in _validate_scheduler_api
raise MisconfigurationException(
pytorch_lightning.utilities.exceptions.MisconfigurationException: The provided lr scheduler StepLR doesn't follow PyTorch's LRScheduler API. You should override the LightningModule.lr_scheduler_step hook with your own logic if you are using a custom LR scheduler.
[2024-07-04 20:05:42,156][src.utils.utils][INFO] - Output dir: /home/jd/PointCompletion/CasFusionNet/logs/train_ssc_pc/runs/2024-07-04_20-05-00
[2024-07-04 20:05:42,156][src.utils.utils][INFO] - Closing loggers...
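For context, the LightningModule's configure_optimizers returns the optimizer together with a StepLR scheduler, roughly like this (a simplified sketch; the optimizer class and hyperparameter values shown here are placeholders, the real ones come from the Hydra config):

import torch
from torch.optim.lr_scheduler import StepLR

def configure_optimizers(self):
    # Placeholder hyperparameters; the real module reads them from the config.
    optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
    scheduler = StepLR(optimizer, step_size=40, gamma=0.5)
    return {
        "optimizer": optimizer,
        "lr_scheduler": {"scheduler": scheduler, "interval": "epoch"},
    }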
I checked the place where the error is raised (pytorch_lightning/core/optimizer.py, in _validate_scheduler_api, per the traceback):
if not isinstance(scheduler, LRSchedulerTypeTuple) and not is_overridden("lr_scheduler_step", model):
    raise MisconfigurationException(
        f"The provided lr scheduler `{scheduler.__class__.__name__}` doesn't follow PyTorch's LRScheduler"
        " API. You should override the `LightningModule.lr_scheduler_step` hook with your own logic if"
        " you are using a custom LR scheduler."
    )
The type of scheduler is StepLR. However, LRSchedulerTypeTuple in pytorch_lightning contains only two types: LRSchedulerTypeTuple = (torch.optim.lr_scheduler._LRScheduler, torch.optim.lr_scheduler.ReduceLROnPlateau). Since the StepLR instance does not pass that isinstance check and lr_scheduler_step is not overridden, the exception is raised.
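The same check can be reproduced outside Lightning (a minimal sketch run in the same environment; the model and optimizer below are dummies, only needed to construct a StepLR instance):

import torch
from torch.optim.lr_scheduler import _LRScheduler, ReduceLROnPlateau, StepLR

# Dummy model/optimizer just to build a StepLR instance.
model = torch.nn.Linear(4, 4)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = StepLR(optimizer, step_size=40)

# The same tuple that pytorch_lightning compares against in _validate_scheduler_api.
LRSchedulerTypeTuple = (_LRScheduler, ReduceLROnPlateau)

print(type(scheduler).__mro__)                      # base classes StepLR actually inherits from
print(isinstance(scheduler, LRSchedulerTypeTuple))  # prints False in my environment, so the check fails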
How could I solve this problem?