MannLabs / alphadia

modular & open DIA search
https://alphadia.readthedocs.io
Apache License 2.0

Update learning rate scheduler API #248

Closed GeorgWa closed 2 months ago

GeorgWa commented 3 months ago

WHO alphaDIA User

WHAT Update the transfer learning module to the new PyTorch learning rate scheduler API

WHY Deprecation warnings are emitted during fine-tuning, and the deprecated calls may break with a future PyTorch release

Acceptance Criteria No more warning messages :D

Additional information

```
0:12:48.068767 PROGRESS:  Ms2 model tested on validation dataset with the following metrics:
0:12:48.069153 PROGRESS:  l1_loss                       : 0.0411
0:12:48.069349 PROGRESS:  PCC-mean                      : 0.5771
0:12:48.069501 PROGRESS:  COS-mean                      : 0.6112
0:12:48.069706 PROGRESS:  SA-mean                       : 0.4187
0:12:48.069860 PROGRESS:  SPC-mean                      : 0.4531
0:12:48.071355 PROGRESS:  Fine-tuning MS2 model
/Users/georgwallmann/miniconda3/envs/alpha/lib/python3.11/site-packages/torch/optim/lr_scheduler.py:28: UserWarning: The verbose parameter is deprecated. Please use get_last_lr() to access the learning rate.
  warnings.warn("The verbose parameter is deprecated. Please use get_last_lr() "
/Users/georgwallmann/miniconda3/envs/alpha/lib/python3.11/site-packages/torch/optim/lr_scheduler.py:156: UserWarning: The epoch parameter in `scheduler.step()` was not necessary and is being deprecated where possible. Please use `scheduler.step()` to step the scheduler. During the deprecation, if epoch is different from None, the closed form is used instead of the new chainable form, where available. Please open an issue if you are unable to replicate your use case: https://github.com/pytorch/pytorch/issues/new/choose.
  warnings.warn(EPOCH_DEPRECATION_WARNING, UserWarning)
/Users/georgwallmann/Documents/git/alphadia/alphadia/transferlearning/train.py:144: RuntimeWarning: invalid value encountered in scalar divide
  or abs(val_loss - self.last_loss) / self.last_loss < self.margin
0:15:06.326629 PROGRESS:  Epoch 0   Lr: 0.00010   Training loss: 0.0307   validation loss: 0.0273
0:15:26.603096 PROGRESS:  Epoch 1   Lr: 0.00020   Training loss: 0.0274   validation loss: 0.0267
0:15:50.858958 PROGRESS:  Epoch 2   Lr: 0.00030   Training loss: 0.0268   validation loss: 0.0266
0:16:15.100357 PROGRESS:  Epoch 3   Lr: 0.00040   Training loss: 0.0273   validation loss: 0.0264
0:16:38.779689 PROGRESS:  Epoch 4   Lr: 0.00050   Training loss: 0.0267   validation loss: 0.0269
```

The `scheduler.step()` deprecation warning is emitted again before every epoch.
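
Both scheduler warnings point at the same migration: drop the deprecated `verbose` constructor argument, call `scheduler.step()` without the deprecated `epoch` argument, and read the rate back with `get_last_lr()` when logging. A minimal sketch, assuming a `LambdaLR` warm-up that reproduces the logged 0.00010 → 0.00050 ramp (this is illustrative, not alphadia's actual training loop):

```python
import torch
from torch import nn, optim

model = nn.Linear(10, 2)
optimizer = optim.AdamW(model.parameters(), lr=5e-4)

# Linear warm-up over 5 epochs; no `verbose=True` in the constructor.
scheduler = optim.lr_scheduler.LambdaLR(
    optimizer, lr_lambda=lambda epoch: min(1.0, (epoch + 1) / 5)
)

lrs = []
for epoch in range(5):
    optimizer.step()                 # parameter update before scheduler.step()
    lr = scheduler.get_last_lr()[0]  # replaces the deprecated verbose printout
    lrs.append(lr)
    print(f"Epoch {epoch}   Lr: {lr:.5f}")
    scheduler.step()                 # no `epoch` argument
```

Logging via `get_last_lr()` keeps the per-epoch `Lr:` output from the log above while silencing both UserWarnings.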
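
Separately, the RuntimeWarning from `train.py:144` indicates the relative-change test divides by `self.last_loss` before it holds a finite, non-zero value. A hedged sketch of a guard (function and variable names are illustrative, not the actual alphadia code):

```python
import math

def loss_converged(val_loss: float, last_loss: float, margin: float) -> bool:
    """Relative-change early-stop test that tolerates an unset last_loss.

    Returns False when last_loss is zero or NaN, so the comparison can never
    hit the invalid scalar divide seen in the log.
    """
    if last_loss == 0.0 or math.isnan(last_loss):
        return False
    return abs(val_loss - last_loss) / last_loss < margin
```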