line / open-universe

Open implementation of UNIVERSE and UNIVERSE++ diffusion-based speech enhancement models.
Apache License 2.0

some warnings during training #3

Open MichaelChen147 opened 4 months ago

MichaelChen147 commented 4 months ago

Thanks for your great work! I have some questions. During training I see the following warnings:

/home/usr23/wenyichen/miniconda3/envs/open-universe/lib/python3.10/site-packages/torch/optim/lr_scheduler.py:139: UserWarning: Detected call of lr_scheduler.step() before optimizer.step(). In PyTorch 1.1.0 and later, you should call them in the opposite order: optimizer.step() before lr_scheduler.step(). Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate warnings.warn("Detected call of lr_scheduler.step() before optimizer.step(). "

[W reducer.cpp:1300] Warning: find_unused_parameters=True was specified in DDP constructor, but did not find any unused parameters in the forward pass. This flag results in an extra traversal of the autograd graph every iteration, which can adversely affect performance. If your model indeed never has any unused parameters in the forward pass, consider turning this flag off. Note that this warning may be a false positive if your model has flow control causing later iterations to have unused parameters. (function operator())

/home/usr23/wenyichen/miniconda3/envs/open-universe/lib/python3.10/site-packages/torch/autograd/__init__.py:200: UserWarning: reflection_pad1d_backward_out_cuda does not have a deterministic implementation, but you set 'torch.use_deterministic_algorithms(True, warn_only=True)'. You can file an issue at https://github.com/pytorch/pytorch/issues to help us prioritize adding deterministic support for this operation. (Triggered internally at /opt/conda/conda-bld/pytorch_1682343967769/work/aten/src/ATen/Context.cpp:71.) Variable._execution_engine.run_backward( # Calls into the C++ engine to run the backward pass
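If I read the first warning correctly, it only complains about the per-iteration call order of the scheduler and the optimizer. A minimal sketch of the order PyTorch expects (a generic training loop with hypothetical model/optimizer choices, not this repo's actual training code) would be:

```python
import torch

# Generic illustration only; not the open-universe training loop.
model = torch.nn.Linear(16, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.5)

for step in range(100):
    x = torch.randn(8, 16)
    loss = model(x).pow(2).mean()

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()   # step the optimizer first ...
    scheduler.step()   # ... then the scheduler, which avoids the UserWarning
```

The DDP warning seems to only suggest that find_unused_parameters could be turned off for a small speedup, so I assume these warnings are harmless, but I would like to confirm.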

Does anyone know how to resolve these warnings? Also, when I launch training with "nohup python ./train.py experiment=universepp_vb_16k > VBD.log 2>&1 &", no output appears in the terminal. How can I see the training progress?
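For reference, the command above redirects both stdout and stderr into VBD.log, so nothing is expected to show up in the terminal itself. A generic way to watch the output (plain shell, not specific to this repo; the -u variant of the launch command is only a suggested tweak) is:

```bash
# Follow the redirected training output as it is written to the log file
tail -f VBD.log

# If the log stays empty for a long time, Python's stdout buffering may be the
# reason; -u makes prints appear immediately (same command as above otherwise)
nohup python -u ./train.py experiment=universepp_vb_16k > VBD.log 2>&1 &
```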