effusiveperiscope / so-vits-svc

MIT License

Huge log mass when training a model #40

Closed by TarvosKorp 1 year ago

TarvosKorp commented 1 year ago

Hello, I am cloud-training 5 models at the same time from one computer, using separate Kaggle accounts in different browsers.

During training, a block of text is printed at every step of every epoch, so the log grows very quickly. Although all computation happens in the cloud, the session page itself becomes very heavy, and if you do not clear the output manually and regularly, the session will most likely freeze and then crash. Would it be possible to change the reporting so that each new block overwrites the previous one, leaving only the most recent step of the most recent epoch visible, i.e. the most up-to-date information?

This is a very significant problem and I really hope it can be fixed. Failing that, the per-step epoch information could simply not be written to the text log at all; I can monitor progress through the file manager instead.
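The overwrite-style reporting asked for above is usually done with a carriage return, so each status line replaces the previous one instead of appending to the log. A minimal sketch (not part of so-vits-svc; `progress_line` is a hypothetical helper and the loop ranges are placeholders):

```python
import sys

def progress_line(epoch, step, loss):
    # Leading '\r' moves the cursor back to the start of the line,
    # so the next write overwrites this one instead of adding a row.
    return f"\repoch {epoch} | step {step} | loss {loss:.4f}"

for epoch in range(1, 3):
    for step in range(1, 4):
        sys.stdout.write(progress_line(epoch, step, 1.0 / (epoch * step)))
        sys.stdout.flush()  # force the terminal to repaint immediately
sys.stdout.write("\n")  # keep the final status visible
```

Whether a notebook front end like Kaggle actually collapses `\r`-terminated writes into one line depends on how it renders output; some notebook UIs still accumulate every write, which may be why the log grows there.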

Log

effusiveperiscope commented 1 year ago

I'm not familiar with how Kaggle handles logging. Do you know of anything else on Kaggle that logs in the manner you are suggesting?

TarvosKorp commented 1 year ago

Eh, no, I'm very new to Kaggle myself. I had expected to fork the repository and change its code, since the Kaggle notebook offers minimal control. However, I found a way out: Kaggle can display blocks either as code or as plain text. When I make the training block display as text, no log windows appear and nothing weighs down the session. So, albeit in a slightly roundabout way, I managed to deal with the problem.

effusiveperiscope commented 1 year ago

OK