ray-project / ray_lightning

Pytorch Lightning Distributed Accelerators using Ray
Apache License 2.0

Bump pytorch-lightning from 1.6.4 to 1.7.5 #211

Closed · dependabot[bot] closed this 1 year ago

dependabot[bot] commented 1 year ago

Bumps pytorch-lightning from 1.6.4 to 1.7.5.
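
A quick, generic way to confirm which version is actually installed once the bump lands; this check is illustrative and not part of the PR:

```python
import pytorch_lightning as pl

# After this bump is merged and dependencies reinstalled,
# this should print 1.7.5.
print(pl.__version__)
```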

Release notes

Sourced from pytorch-lightning's releases.

PyTorch Lightning 1.7.5: Standard patch release

[1.7.5] - 2022-09-06

Fixed

  • Squeezed tensor values when logging with LightningModule.log (#14489; see the sketch after this list)
  • Fixed WandbLogger's save_dir not being set after creation (#14326)
  • Fixed Trainer.estimated_stepping_batches when the maximum number of epochs is not set (#14317)
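
For context on the first item, here is a minimal sketch of the kind of logging the squeeze fix affects. The toy module, layer sizes, and loss are illustrative assumptions, not code from this PR:

```python
import torch
from torch import nn
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    """Toy module used only to illustrate the logging fix."""

    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(32, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.mse_loss(self.layer(x), y)
        # The logged value deliberately has shape (1,) rather than being a
        # 0-dim scalar; per the changelog, 1.7.5 squeezes such one-element
        # tensors when they are passed to `self.log`.
        self.log("train_loss", loss.reshape(1))
        return loss

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)
```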

Contributors

@carmocca @dependabot @robertomest @rohitgr7 @tshu-w

If we missed anyone because their commit email didn't match their GitHub account, let us know :)

PyTorch Lightning 1.7.4: Standard patch release

[1.7.4] - 2022-08-31

Added

  • Added an environment variable PL_DISABLE_FORK that can be used to disable all forking in the Trainer (#14319; sketched below)
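
A short sketch of using the new variable; the value "1" and setting it before the Trainer is constructed are assumptions on my part, not prescribed by the changelog:

```python
import os

# Assumed usage: set PL_DISABLE_FORK before Lightning spins anything up,
# so no worker or launcher process is created via fork().
os.environ["PL_DISABLE_FORK"] = "1"

import pytorch_lightning as pl

trainer = pl.Trainer(accelerator="cpu", max_epochs=1)
```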

Fixed

  • Fixed LightningDataModule hparams parsing (#12806)
  • Reset epoch progress with batch size scaler (#13846)
  • Fixed restoring the trainer after using lr_find() so that the correct LR schedule is used for the actual training (#14113; see the sketch after this list)
  • Fixed incorrect values after transferring data to an MPS device (#14368)
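
The lr_find() fix is easiest to see in context. A sketch of the affected workflow, assuming the toy LitModel from the earlier sketch is in scope and using random data (both illustrative):

```python
import torch
import pytorch_lightning as pl
from torch.utils.data import DataLoader, TensorDataset

# Random data shaped for the toy LitModel above; illustrative only.
train_loader = DataLoader(
    TensorDataset(torch.randn(64, 32), torch.randn(64, 1)), batch_size=8
)

model = LitModel()
trainer = pl.Trainer(max_epochs=2)

# Before 1.7.4, state left behind by lr_find() could leak into the
# subsequent fit(), so training could run with the wrong LR schedule;
# the fix restores the trainer before handing control back.
lr_finder = trainer.tuner.lr_find(model, train_dataloaders=train_loader)
print(lr_finder.suggestion())
trainer.fit(model, train_dataloaders=train_loader)
```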

Contributors

@rohitgr7 @tanmoyio @justusschock @cschell @carmocca @Callidior @awaelchli @j0rd1smit @dependabot @Borda @otaj

PyTorch Lightning 1.7.3: Standard patch release

[1.7.3] - 2022-08-25

Fixed

  • Fixed an assertion error when using a ReduceOnPlateau scheduler with the Horovod strategy (#14215)
  • Fixed an AttributeError when accessing LightningModule.logger while the Trainer has multiple loggers (#14234)
  • Fixed incorrect number padding in the RichProgressBar (#14296)
  • Added back support for logging in the configure_gradient_clipping hook after its unintended removal in v1.7.2 (#14298; see the sketch after this list)
  • Fixed an issue so that the sanity check no longer affects reload_dataloaders_every_n_epochs during validation (#13964)
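
For the gradient-clipping item, a minimal sketch of logging from inside the hook under the 1.7-era signature. The subclass, metric name, and clip value are illustrative assumptions, and the toy LitModel from the earlier sketch is assumed to be in scope:

```python
import torch


class ClippingModel(LitModel):
    """Illustrative subclass of the toy LitModel sketched earlier."""

    def configure_gradient_clipping(
        self, optimizer, optimizer_idx,
        gradient_clip_val=None, gradient_clip_algorithm=None,
    ):
        # Log the pre-clipping gradient norm; per the changelog, calling
        # `self.log` here was broken in 1.7.2 and works again in 1.7.3.
        grads = [p.grad.norm() for p in self.parameters() if p.grad is not None]
        if grads:
            self.log("grad_norm", torch.stack(grads).norm())
        # Then clip with Lightning's built-in helper.
        self.clip_gradients(
            optimizer, gradient_clip_val=1.0, gradient_clip_algorithm="norm"
        )
```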

Contributors

@awaelchli @Borda @carmocca @dependabot @kaushikb11 @otaj @rohitgr7

PyTorch Lightning 1.7.2: Standard patch release

... (truncated)


Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:

  • `@dependabot rebase` will rebase this PR
  • `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
  • `@dependabot merge` will merge this PR after your CI passes on it
  • `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
  • `@dependabot cancel merge` will cancel a previously requested merge and block automerging
  • `@dependabot reopen` will reopen this PR if it is closed
  • `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
dependabot[bot] commented 1 year ago

Superseded by #215.