stared / livelossplot

Live training loss plot in Jupyter Notebook for Keras, PyTorch and others
https://p.migdal.pl/livelossplot
MIT License

Adding `from_step` option to leave out the first entries #130

Closed by stared 3 years ago

stared commented 3 years ago

It intends to solve #124: very often the first epochs have very large (and largely meaningless) values. The idea is that a positive value (say, 5) means start from step 5, a negative value (e.g. -100) means show only the last 100 steps, and zero means the option is not used at all.
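
As a rough illustration of those semantics, here is a minimal sketch (a purely hypothetical helper, not the library's actual code) of how a from_step value could filter the logged history:

# Hypothetical helper, for illustration only: interpret a from_step value
# when selecting which logged (step, value) pairs to show.
def select_steps(log_items, from_step):
    """log_items: a list of (step, value) pairs, ordered by step."""
    if from_step == 0:
        return log_items                        # option not used
    if from_step > 0:
        # positive: keep only entries from step `from_step` onwards
        return [item for item in log_items if item[0] >= from_step]
    # negative: keep only the last |from_step| entries
    return log_items[from_step:]

history = [(i, 1.0 / (i + 1)) for i in range(10)]
print(select_steps(history, 5))     # entries with step >= 5
print(select_steps(history, -3))    # only the last 3 entries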

#125 has an approach for MatplotlibPlot only. I wanted to do it differently: by adding an option to MainLogger, so that it propagates to all other outputs as well (e.g. ExtremaPrinter).
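
To sketch the propagation idea with hypothetical names (SimpleLogger and ConsolePrinter are stand-ins, not the real livelossplot classes): the option sits on the central logger, and each output reads the already-filtered history from it, so no per-output setting is needed.

# Hypothetical sketch of the design, not the actual library code.
class SimpleLogger:
    def __init__(self, from_step=0):
        self.from_step = from_step
        self.history = []                      # list of (step, metrics_dict)

    def filtered_history(self):
        if self.from_step > 0:
            return [p for p in self.history if p[0] >= self.from_step]
        if self.from_step < 0:
            return self.history[self.from_step:]
        return self.history

class ConsolePrinter:
    def send(self, logger):
        # every output sees the same filtered view of the history
        for step, metrics in logger.filtered_history():
            print(step, metrics)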

Right now it "sort of works":

(Screenshot, 2020-11-30 at 18:58:55)

Then later:

(image)

stared commented 3 years ago

Now the code:

import numpy as np
from time import sleep
from livelossplot import PlotLosses

# two metric groups; from_step=2 should hide the first two steps everywhere
groups = {'accuracy': ['acc', 'val_acc'], 'log-loss': ['loss', 'val_loss']}
plotlosses = PlotLosses(['ExtremaPrinter'], groups=groups, from_step=2)

for i in range(10):
    # fake metrics that improve with each step
    plotlosses.update({
        'acc': 1 - np.random.rand() / (i + 2.),
        'val_acc': 1 - np.random.rand() / (i + 0.5),
        'loss': 1. / (i + 2.),
        'val_loss': 1. / (i + 0.5)
    })
    plotlosses.send()
    sleep(0.3)

It runs without errors and generates:

accuracy
    training             (no values!)
    validation           (no values!)
log-loss
    training             (no values!)
    validation           (no values!)
accuracy
    training             (no values!)
    validation           (no values!)
log-loss
    training             (no values!)
    validation           (no values!)
accuracy
    training             (min:    0.778, max:    0.778, cur:    0.778)
    validation           (min:    0.973, max:    0.973, cur:    0.973)
log-loss
    training             (min:    0.250, max:    0.250, cur:    0.250)
    validation           (min:    0.400, max:    0.400, cur:    0.400)
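
For reference, a minimal sketch (extrema_line is a hypothetical function, not the actual ExtremaPrinter code) of how lines like the above could be produced from an already-filtered history; when the filter leaves no points for a metric, the "(no values!)" line appears:

# Hypothetical formatting helper, for illustration only.
def extrema_line(name, values):
    if not values:
        return f"    {name:<20} (no values!)"
    return (f"    {name:<20} (min: {min(values):8.3f}, "
            f"max: {max(values):8.3f}, cur: {values[-1]:8.3f})")

print(extrema_line("training", []))        # first iterations: nothing passes the filter
print(extrema_line("training", [0.25]))    # later: min/max/cur over the kept values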

An auxiliary note: