Closed kljjiajia closed 1 year ago
This means you pressed stop training:
Detected KeyboardInterrupt, attempting graceful shutdown...
/content/env/envs/fish_diffusion/lib/python3.10/site-packages/torch/functional.py:641: UserWarning: ComplexHalf support is experimental and many operators don't support it yet. (Triggered internally at ../aten/src/ATen/EmptyTensor.cpp:31.)
  return _VF.stft(input, n_fft, hop_length, win_length, window, # type: ignore[attr-defined]
/content/env/envs/fish_diffusion/lib/python3.10/site-packages/torch/optim/lr_scheduler.py:139: UserWarning: Detected call of lr_scheduler.step() before optimizer.step(). In PyTorch 1.1.0 and later, you should call them in the opposite order: optimizer.step() before lr_scheduler.step(). Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
  warnings.warn("Detected call of lr_scheduler.step() before optimizer.step(). "
The first warning can be ignored. Let me look into the second one; as I recall it doesn't affect training. Is your training stuck right now?
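For reference, the second warning is about call order, not correctness of the model: since PyTorch 1.1.0, `optimizer.step()` must run before `lr_scheduler.step()`, or the first value of the LR schedule is skipped. A minimal sketch of the correct order, using a toy model and StepLR (not the actual fish-diffusion training loop, which Lightning manages internally):

```python
import torch

# Toy model just to illustrate the call order the warning refers to.
model = torch.nn.Linear(4, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.5)

for epoch in range(3):
    optimizer.zero_grad()
    loss = model(torch.randn(8, 4)).pow(2).mean()
    loss.backward()
    optimizer.step()   # optimizer first...
    scheduler.step()   # ...then the scheduler, so the first LR value is not skipped

print(optimizer.param_groups[0]["lr"])  # 0.1 decayed three times: 0.0125
```

If the order were reversed, the scheduler would decay the learning rate before the first optimizer update, which is exactly what the warning cautions against.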
There is an Epoch shown, but no progress.
It looks like it just got terminated... Are you sure you still have enough quota?
I only just started using it; I hadn't used Colab before this.
But that is exactly what this error means... uh...
/content/env/envs/fish_diffusion/lib/python3.10/site-packages/pytorch_lightning/trainer/trainer.py:1609: PossibleUserWarning: The number of training batches (6) is smaller than the logging interval Trainer(log_every_n_steps=10). Set a lower value for log_every_n_steps if you want to see logs for the training epoch.
  rank_zero_warn(
Epoch 0: 0% 0/6 [00:00<?, ?it/s]
/content/env/envs/fish_diffusion/lib/python3.10/site-packages/torch/functional.py:641: UserWarning: ComplexHalf support is experimental and many operators don't support it yet. (Triggered internally at ../aten/src/ATen/EmptyTensor.cpp:31.)
  return _VF.stft(input, n_fft, hop_length, win_length, window, # type: ignore[attr-defined]
/content/env/envs/fish_diffusion/lib/python3.10/site-packages/torch/optim/lr_scheduler.py:139: UserWarning: Detected call of lr_scheduler.step() before optimizer.step(). In PyTorch 1.1.0 and later, you should call them in the opposite order: optimizer.step() before lr_scheduler.step(). Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
  warnings.warn("Detected call of lr_scheduler.step() before optimizer.step(). "
Epoch 18: 0% 0/6 [00:00<?, ?it/s, v_num=0, train_loss_disc_step=3.180, train_loss_gen_step=105.0, train_loss_disc_epoch=3.640, train_loss_gen_epoch=107.0]
/content/env/envs/fish_diffusion/lib/python3.10/site-packages/pytorch_lightning/trainer/call.py:48: UserWarning: Detected KeyboardInterrupt, attempting graceful shutdown...
  rank_zero_warn("Detected KeyboardInterrupt, attempting graceful shutdown...")

How can I solve this?