ray-project / ray_lightning

Pytorch Lightning Distributed Accelerators using Ray
Apache License 2.0

Bump pytorch-lightning from 1.6.4 to 1.7.2 #200

Closed · dependabot[bot] closed this 2 years ago

dependabot[bot] commented 2 years ago

Bumps pytorch-lightning from 1.6.4 to 1.7.2.
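
As a quick sanity check after the bump, the installed version can be inspected at runtime (a trivial sketch, not part of this PR):

```python
# Confirm which pytorch-lightning version is actually installed in the environment.
import pytorch_lightning

print(pytorch_lightning.__version__)  # expected to print 1.7.2 once this bump is merged
```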

Release notes

Sourced from pytorch-lightning's releases.

PyTorch Lightning 1.7.2: Standard patch release

[1.7.2] - 2022-08-17

Added

  • Added FullyShardedNativeNativeMixedPrecisionPlugin to handle precision for DDPFullyShardedNativeStrategy (#14092)
  • Added profiling to these hooks: on_before_batch_transfer, transfer_batch_to_device, on_after_batch_transfer, configure_gradient_clipping, clip_gradients (#14069)
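
Since those hooks are now covered by profiling, a minimal sketch of enabling the built-in simple profiler so they show up in its report; `model` and `dm` below are placeholders for your own objects, not anything defined in this PR:

```python
# Enable the simple profiler; with 1.7.2 the batch-transfer and gradient-clipping
# hooks listed above are included in its timing report.
from pytorch_lightning import Trainer

trainer = Trainer(profiler="simple", max_epochs=1)
# trainer.fit(model, datamodule=dm)  # placeholders: supply your own LightningModule / DataModule
```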

Changed

  • Updated compatibility for LightningLite to run with the latest DeepSpeed 0.7.0 (#13967)
  • Raised a MisconfigurationException if batch transfer hooks are overridden with IPUAccelerator (#13961)
  • The default project name in WandbLogger is now "lightning_logs" (#14145)
  • The WandbLogger.name property no longer returns the name of the experiment, and instead returns the project's name (#14145)
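
Given the two WandbLogger changes above, a minimal sketch of pinning the project and run name explicitly so code does not depend on the new defaults; the project and run names are placeholder values:

```python
# Set project/name explicitly: 1.7.2 changes the default project to "lightning_logs"
# and makes WandbLogger.name return the project name rather than the experiment name.
from pytorch_lightning import Trainer
from pytorch_lightning.loggers import WandbLogger

wandb_logger = WandbLogger(project="my-project", name="my-run")  # placeholder names
trainer = Trainer(logger=wandb_logger)
```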

Fixed

  • Fixed a bug that caused spurious AttributeError when multiple DataLoader classes are imported (#14117)
  • Fixed epoch-end logging results not being reset after the end of the epoch (#14061)
  • Fixed saving hyperparameters in a composition where the parent class is not a LightningModule or LightningDataModule (#14151)
  • Fixed the device placement when LightningModule.cuda() gets called without specifying a device index and the current cuda device was not 0 (#14128)
  • Avoided false positive warning about using sync_dist when using torchmetrics (#14143)
  • Avoid metadata.entry_points deprecation warning on Python 3.10 (#14052)
  • Avoid raising the sampler warning if num_replicas=1 (#14097)
  • Fixed resuming from a checkpoint when using Stochastic Weight Averaging (SWA) (#9938)
  • Avoided requiring the FairScale package to use precision with the fsdp native strategy (#14092)
  • Fixed an issue in which the default name for a run in WandbLogger would be set to the project name instead of a randomly generated string (#14145)
  • Fixed not preserving set attributes on DataLoader and BatchSampler when instantiated inside *_dataloader hooks (#14212)
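
For the SWA checkpoint fix (#9938) in the list above, a rough sketch of the scenario it targets: resuming a run that uses the StochasticWeightAveraging callback. The learning rate, checkpoint path, and `model` are placeholders:

```python
# Resume a run that trains with SWA; #9938 fixed restoring SWA state from a checkpoint.
from pytorch_lightning import Trainer
from pytorch_lightning.callbacks import StochasticWeightAveraging

trainer = Trainer(max_epochs=20, callbacks=[StochasticWeightAveraging(swa_lrs=1e-2)])
# trainer.fit(model, ckpt_path="path/to/last.ckpt")  # placeholders: your module and checkpoint
```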

Contributors

@adamreeve @akihironitta @awaelchli @Borda @carmocca @dependabot @otaj @rohitgr7

PyTorch Lightning 1.7.1: Standard patch release

[1.7.1] - 2022-08-09

Fixed

  • Casted only floating point tensors to fp16 with IPUs (#13983)
  • Casted tensors to fp16 before moving them to device with DeepSpeedStrategy (#14000)
  • Fixed the NeptuneLogger dependency being unrecognized (#13988)
  • Fixed an issue where users would be warned about unset max_epochs even when fast_dev_run was set (#13262)
  • Fixed MPS device being unrecognized (#13992)
  • Fixed incorrect precision="mixed" being used with DeepSpeedStrategy and IPUStrategy (#14041)
  • Fixed dtype inference during gradient norm computation (#14051)
  • Fixed a bug that caused ddp_find_unused_parameters to be set False, whereas the intended default is True (#14095)
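
The last fix above restores True as the default for unused-parameter detection under DDP; when every parameter receives a gradient each step, it can be turned off explicitly for speed. A sketch with placeholder device settings:

```python
# With #14095 the default is find_unused_parameters=True again; opt out explicitly
# only when all parameters get gradients every step.
from pytorch_lightning import Trainer
from pytorch_lightning.strategies import DDPStrategy

trainer = Trainer(
    accelerator="gpu",
    devices=2,  # placeholder device count
    strategy=DDPStrategy(find_unused_parameters=False),
)
```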

... (truncated)

Commits
  • 4fae327 Weekly patch release v1.7.2 (#14126)
  • be58159 Fix assert wandb Run when mode="disabled" (#14112)
  • 8cdc867 Fix incorrect precision="mixed" being used with DeepSpeedStrategy and `IP...
  • 12b06ed raise torchvision version to 0.13.0
  • b438fa5 Cast to fp16 before moving to device with deepspeed (#14000)
  • 0bdbf4d Update tqdm requirement from <=4.63.0,>=4.57.0 to >=4.57.0,<4.65.0 in /requir...
  • ba7f89c Update wandb requirement from <0.12.20,>=0.10.22 to >=0.10.22,<0.13.2 in /req...
  • 13c6cf7 Update mlflow requirement from <1.27.0,>=1.0.0 to >=1.0.0,<1.28.0 in /require...
  • ac50b0c Update comet-ml requirement from <3.31.6,>=3.1.12 to >=3.1.12,<3.31.8 in /req...
  • 30e062a Freeze requirements for CI (#14007)
  • Additional commits viewable in compare view


Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • `@dependabot rebase` will rebase this PR
  • `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
  • `@dependabot merge` will merge this PR after your CI passes on it
  • `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
  • `@dependabot cancel merge` will cancel a previously requested merge and block automerging
  • `@dependabot reopen` will reopen this PR if it is closed
  • `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

dependabot[bot] commented 2 years ago

Superseded by #204.