ray-project / ray_lightning

Pytorch Lightning Distributed Accelerators using Ray

Warnings in the CI tests (update the deprecated APIs) #172

Open · JiahaoYao opened this issue 2 years ago

JiahaoYao commented 2 years ago
../../../../home/codespace/.conda/envs/ci/lib/python3.7/site-packages/torch/utils/tensorboard/__init__.py:5
  /home/codespace/.conda/envs/ci/lib/python3.7/site-packages/torch/utils/tensorboard/__init__.py:5: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
    tensorboard.__version__

../../../../home/codespace/.conda/envs/ci/lib/python3.7/site-packages/torch/utils/tensorboard/__init__.py:6
  /home/codespace/.conda/envs/ci/lib/python3.7/site-packages/torch/utils/tensorboard/__init__.py:6: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
    ) < LooseVersion("1.15"):
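
These two `LooseVersion` warnings are raised inside `torch.utils.tensorboard` itself, so they cannot be silenced from ray_lightning code, but the same deprecation applies anywhere version strings are compared. A minimal sketch of the replacement pattern, using `packaging.version` in place of `distutils.version.LooseVersion`:

```python
# Replacement pattern for the deprecated distutils version classes
# (sketch only; the warning above originates inside torch.utils.tensorboard,
# not in ray_lightning code).
import tensorboard
from packaging.version import Version  # instead of distutils.version.LooseVersion

if Version(tensorboard.__version__) < Version("1.15"):
    raise ImportError("tensorboard >= 1.15 is required")
```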

ray_lightning/tests/test_ddp.py::test_actor_creation[1]
  /home/codespace/.conda/envs/ci/lib/python3.7/site-packages/torch/distributed/_sharded_tensor/__init__.py:10: DeprecationWarning: torch.distributed._sharded_tensor will be deprecated, use torch.distributed._shard.sharded_tensor instead
    DeprecationWarning

ray_lightning/tests/test_ddp.py: 25 warnings
  /home/codespace/.conda/envs/ci/lib/python3.7/site-packages/pytorch_lightning/trainer/connectors/callback_connector.py:152: LightningDeprecationWarning: Setting `Trainer(checkpoint_callback=True)` is deprecated in v1.5 and will be removed in v1.7. Please consider using `Trainer(enable_checkpointing=True)`.
    f"Setting `Trainer(checkpoint_callback={checkpoint_callback})` is deprecated in v1.5 and will "

ray_lightning/tests/test_ddp.py::test_predict[1]
ray_lightning/tests/test_ddp.py::test_predict[2]
ray_lightning/tests/test_ddp.py::test_predict_client[1]
ray_lightning/tests/test_ddp.py::test_predict_client[2]
  /home/codespace/.conda/envs/ci/lib/python3.7/site-packages/pytorch_lightning/core/datamodule.py:127: LightningDeprecationWarning: DataModule property `test_transforms` was deprecated in v1.5 and will be removed in v1.7.
    "DataModule property `test_transforms` was deprecated in v1.5 and will be removed in v1.7."

ray_lightning/tests/test_ddp.py::test_early_stop
  /home/codespace/.conda/envs/ci/lib/python3.7/site-packages/pytorch_lightning/trainer/connectors/callback_connector.py:97: LightningDeprecationWarning: Setting `Trainer(progress_bar_refresh_rate=1)` is deprecated in v1.5 and will be removed in v1.7. Please pass `pytorch_lightning.callbacks.progress.TQDMProgressBar` with `refresh_rate` directly to the Trainer's `callbacks` argument instead. Or, to disable the progress bar pass `enable_progress_bar = False` to the Trainer.
    f"Setting `Trainer(progress_bar_refresh_rate={progress_bar_refresh_rate})` is deprecated in v1.5 and"

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
JiahaoYao commented 2 years ago
../../../../home/codespace/.conda/envs/ci/lib/python3.7/site-packages/torch/utils/tensorboard/__init__.py:5
  /home/codespace/.conda/envs/ci/lib/python3.7/site-packages/torch/utils/tensorboard/__init__.py:5: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
    tensorboard.__version__

../../../../home/codespace/.conda/envs/ci/lib/python3.7/site-packages/torch/utils/tensorboard/__init__.py:6
  /home/codespace/.conda/envs/ci/lib/python3.7/site-packages/torch/utils/tensorboard/__init__.py:6: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
    ) < LooseVersion("1.15"):

ray_lightning/tests/test_ddp_sharded.py::test_ddp_sharded_plugin_checkpoint
  /home/codespace/.conda/envs/ci/lib/python3.7/site-packages/torch/distributed/_sharded_tensor/__init__.py:10: DeprecationWarning: torch.distributed._sharded_tensor will be deprecated, use torch.distributed._shard.sharded_tensor instead
    DeprecationWarning

ray_lightning/tests/test_ddp_sharded.py::test_ddp_sharded_plugin_checkpoint
ray_lightning/tests/test_ddp_sharded.py::test_ddp_sharded_plugin_finetune
ray_lightning/tests/test_ddp_sharded.py::test_ddp_sharded_plugin_resume_from_checkpoint
ray_lightning/tests/test_ddp_sharded.py::test_ddp_sharded_plugin_test
ray_lightning/tests/test_ddp_sharded.py::test_ddp_sharded_plugin_resume_from_checkpoint_downsize
  /home/codespace/.conda/envs/ci/lib/python3.7/site-packages/pytorch_lightning/loops/utilities.py:94: PossibleUserWarning: `max_epochs` was not set. Setting it to 1000 epochs. To train without an epoch limit, set `max_epochs=-1`.
    category=PossibleUserWarning,
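
This one is only a `PossibleUserWarning`, but the tests can silence it by making the epoch limit explicit; a sketch (the real Trainer construction in the tests has more arguments):

```python
import pytorch_lightning as pl

# Explicit epoch limit instead of relying on the implicit default of 1000:
trainer = pl.Trainer(max_epochs=1)

# Or, to genuinely train without a limit:
# trainer = pl.Trainer(max_epochs=-1)
```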

ray_lightning/tests/test_ddp_sharded.py::test_ddp_sharded_plugin_finetune
  /home/codespace/.conda/envs/ci/lib/python3.7/site-packages/pytorch_lightning/trainer/connectors/data_connector.py:245: PossibleUserWarning: The dataloader, train_dataloader, does not have many workers which may be a bottleneck. Consider increasing the value of the `num_workers` argument` (try 4 which is the number of cpus on this machine) in the `DataLoader` init to improve performance.
    category=PossibleUserWarning,

ray_lightning/tests/test_ddp_sharded.py::test_ddp_sharded_plugin_finetune
  /home/codespace/.conda/envs/ci/lib/python3.7/site-packages/pytorch_lightning/trainer/trainer.py:1937: PossibleUserWarning: The number of training batches (1) is smaller than the logging interval Trainer(log_every_n_steps=50). Set a lower value for log_every_n_steps if you want to see logs for the training epoch.
    category=PossibleUserWarning,

ray_lightning/tests/test_ddp_sharded.py::test_ddp_sharded_plugin_finetune
  /home/codespace/.conda/envs/ci/lib/python3.7/site-packages/pytorch_lightning/trainer/connectors/data_connector.py:245: PossibleUserWarning: The dataloader, val_dataloader 0, does not have many workers which may be a bottleneck. Consider increasing the value of the `num_workers` argument` (try 4 which is the number of cpus on this machine) in the `DataLoader` init to improve performance.
    category=PossibleUserWarning,
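
The `PossibleUserWarning`s above are artifacts of the tiny test fixtures; if they are worth silencing, a sketch of the knobs involved (`train_dataset`, `val_dataset`, and `model` are placeholders, not the repo's actual fixtures):

```python
import pytorch_lightning as pl
from torch.utils.data import DataLoader

# More dataloader workers, as the warning suggests (4 matches the CI machine's CPUs):
train_loader = DataLoader(train_dataset, batch_size=32, num_workers=4)
val_loader = DataLoader(val_dataset, batch_size=32, num_workers=4)

# Log every step so the interval is never larger than the single training batch:
trainer = pl.Trainer(max_epochs=1, log_every_n_steps=1)
trainer.fit(model, train_loader, val_loader)
```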

ray_lightning/tests/test_ddp_sharded.py::test_ddp_sharded_plugin_resume_from_checkpoint
ray_lightning/tests/test_ddp_sharded.py::test_ddp_sharded_plugin_resume_from_checkpoint_downsize
  /home/codespace/.conda/envs/ci/lib/python3.7/site-packages/pytorch_lightning/trainer/connectors/checkpoint_connector.py:52: LightningDeprecationWarning: Setting `Trainer(resume_from_checkpoint=)` is deprecated in v1.5 and will be removed in v1.7. Please pass `Trainer.fit(ckpt_path=)` directly instead.
    "Setting `Trainer(resume_from_checkpoint=)` is deprecated in v1.5 and"

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html