dreamquark-ai / tabnet

PyTorch implementation of the TabNet paper: https://arxiv.org/pdf/1908.07442.pdf
https://dreamquark-ai.github.io/tabnet/
MIT License
2.65k stars 488 forks

Stop logging during training #558

Open jonasboh opened 2 months ago

jonasboh commented 2 months ago

Feature request

What is the expected behavior? Train TabNet without logging the epoch, loss, etc.

What is the motivation or use case for adding/changing the behavior? I would like to use it in a pipeline with a progress bar to show the training state. Logging the information during training increases the size of the log file and makes it very confusing.

How should this be implemented in your opinion? Offer logging at different levels, such as Info, Warning, and Error, and let the user decide what gets printed.

Are you willing to work on this yourself? yes
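The log-level option requested above could be sketched with Python's standard `logging` module. This is hypothetical: pytorch_tabnet currently prints directly rather than going through a logger, and the logger name `tabnet.training` is made up for illustration.

```python
import io
import logging

# Sketch: if per-epoch messages went through a logger instead of print(),
# the caller could choose which severity levels reach the output.
stream = io.StringIO()
logger = logging.getLogger("tabnet.training")  # hypothetical logger name
logger.addHandler(logging.StreamHandler(stream))

logger.setLevel(logging.INFO)
logger.info("epoch 0 | loss: 0.693")   # emitted: INFO is enabled

logger.setLevel(logging.WARNING)
logger.info("epoch 1 | loss: 0.402")   # suppressed: below WARNING
```

A user who wants a clean log would simply set the level to `logging.WARNING` or higher before training.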

Optimox commented 2 months ago

I guess you can play with the verbose parameter and set it, for example, to the maximum number of epochs. This would reduce the training logs to the first and last epochs...
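To see why this only reduces rather than eliminates the output, here is a hypothetical sketch of a modulo-style epoch gate. The exact condition in pytorch_tabnet's History callback may differ; the point is that any gate of the form `epoch % verbose == 0` always fires at epoch 0, which matches the behavior reported later in this thread.

```python
def should_print(epoch, verbose):
    """Return True when the per-epoch line would be printed (sketch only)."""
    if verbose <= 0:
        return False  # a gate with this guard would silence everything
    # 0 % n == 0 for any positive n, so epoch 0 always slips through
    return epoch % verbose == 0

# With verbose equal to the number of epochs, only epoch 0 gets printed:
printed = [epoch for epoch in range(10) if should_print(epoch, verbose=10)]
```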

jonasboh commented 2 months ago

Thank you for the suggestion. Unfortunately it still clutters the log, as I am training quite a few models. Is there no option to turn the printing off completely?

Optimox commented 2 months ago

I am not sure actually. What happens if you set the verbosity to 0 or -1?

jonasboh commented 2 months ago

I set the verbosity to 0 and to -1, but I always get the first epoch. Is there no way to stop this?

Optimox commented 2 months ago

This comes from the History callback: https://github.com/dreamquark-ai/tabnet/blob/2c0c4ebd2bb1cb639ea94ab4b11823bc49265588/pytorch_tabnet/callbacks.py#L176

You could change that line to remove the prints. It would mean editing the source code, but it is an easy change.
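If editing the library is not an option, a workaround is to swallow everything the fit call prints by redirecting stdout with the standard library. The `train_model` function below is a hypothetical stand-in for a call like `TabNetClassifier(...).fit(...)`.

```python
import contextlib
import io

def train_model():
    # Stand-in for a training call that prints per-epoch log lines.
    print("epoch 0  | loss: 0.693")
    return "trained"

# Redirect stdout into a buffer for the duration of the call; the return
# value is unaffected, and the log lines never reach the console.
with contextlib.redirect_stdout(io.StringIO()) as captured:
    result = train_model()
```

This keeps the console clean for a progress bar while still letting you inspect `captured.getvalue()` afterwards if you ever need the suppressed lines.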