dara1400 opened 6 months ago
First, import the logging module:
import logging
Then, add the following line to suppress the device-availability messages:
logging.getLogger("lightning.pytorch.utilities.rank_zero").setLevel(logging.WARNING)
This will prevent the following output from being printed:
GPU available: True (cuda), used: True
TPU available: False, using: 0 TPU cores
IPU available: False, using: 0 IPUs
HPU available: False, using: 0 HPUs
Next, add this line to suppress the remaining message from the CUDA accelerator:
logging.getLogger("lightning.pytorch.accelerators.cuda").setLevel(logging.WARNING)
This will stop the following output from appearing:
LOCAL_RANK: 0 - CUDA_VISIBLE_DEVICES: [0]
Good luck!
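Putting the two pieces together, here is a minimal sketch of the whole workaround; `model` and `horizons` below are placeholders for your own fitted forecaster and prediction horizons, not names from this thread:

```python
import logging

# Silence the rank-zero info messages (GPU/TPU/IPU/HPU availability).
logging.getLogger("lightning.pytorch.utilities.rank_zero").setLevel(logging.WARNING)

# Silence the accelerator message (LOCAL_RANK / CUDA_VISIBLE_DEVICES).
logging.getLogger("lightning.pytorch.accelerators.cuda").setLevel(logging.WARNING)

# Placeholder loop: each predict() call no longer re-prints the banners.
for fh in horizons:
    y_pred = model.predict(fh)
```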
Related: https://github.com/sktime/sktime/pull/6891 - should we perhaps address this at the source, and add a verbosity option, @XinyuWuu?
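To illustrate what that could look like (a hypothetical sketch only, not an existing sktime or Lightning API; the helper name and `verbose` parameter are assumptions), the adapter could translate a verbosity flag into logger levels:

```python
import logging

# Loggers that emit the device banners quoted above.
_LIGHTNING_LOGGERS = (
    "lightning.pytorch.utilities.rank_zero",
    "lightning.pytorch.accelerators.cuda",
)

def _set_lightning_verbosity(verbose: bool) -> None:
    """Hypothetical helper: gate Lightning's info output behind a verbosity flag."""
    level = logging.INFO if verbose else logging.WARNING
    for name in _LIGHTNING_LOGGERS:
        logging.getLogger(name).setLevel(level)
```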
Good idea.
I want to use model.predict in a loop. It keeps printing this:
GPU available: True (cuda), used: True
TPU available: False, using: 0 TPU cores
IPU available: False, using: 0 IPUs
HPU available: False, using: 0 HPUs
LOCAL_RANK: 0 - CUDA_VISIBLE_DEVICES: [0]
Is there a way to stop it from printing?
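For context, a minimal sketch of the situation described here; `model` is assumed to be an already-fitted Lightning-backed forecaster and the horizon values are placeholders:

```python
# Each predict() call spins up a Lightning Trainer, which re-prints the banner.
for fh in range(1, 13):
    y_pred = model.predict(fh)
```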