UKPLab / sentence-transformers

State-of-the-Art Text Embeddings
https://www.sbert.net
Apache License 2.0
15.26k stars 2.47k forks

Example of callback functions? #458

Open zeno17 opened 4 years ago

zeno17 commented 4 years ago

Hi,

I was wondering if and where I could find callback functions to be used during training. For example, I know TensorFlow has EarlyStopping.

nreimers commented 4 years ago

Have a look here: https://github.com/UKPLab/sentence-transformers/pull/327

zeno17 commented 4 years ago

@PhilipMay

  1. I think the docs were updated to https://optuna.readthedocs.io/en/stable/tutorial/007_pruning.html (so your link is dead).
  2. Do you by any chance have an example of code showing how I could use Optuna callbacks to optimize a Sentence Transformer?
PhilipMay commented 4 years ago

I just do this very simple thing:

    def callback(value, a, b):
        # model.fit invokes this after each evaluation as callback(score, epoch, steps)
        print('callback:', value, a, b)
        if math.isnan(value):
            raise optuna.exceptions.TrialPruned()

    model.fit(train_objectives=[(train_dataloader, train_loss)],
              evaluator=dev_evaluator,
              epochs=num_epochs,
              scheduler=trial.suggest_categorical('scheduler', ['WarmupLinear', 'warmupcosine', 'warmupcosinewithhardrestarts']),
              #optimizer_class=optimizer_class,
              evaluation_steps=evaluation_steps,
              warmup_steps=warmup_steps,
              output_path=model_save_path,
              optimizer_params={'lr': lr, 'eps': eps, 'correct_bias': False},
              weight_decay=weight_decay,
              callback=callback,
              )

But since a full training needs many hours, this callback can save a lot of time.
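The callback above can be sketched in a self-contained form. In `model.fit` the callback is invoked after each evaluation with `(score, epoch, steps)`; the `TrialPruned` class and `make_pruning_callback` helper below are stand-ins (the real exception is `optuna.exceptions.TrialPruned`) so the sketch runs without Optuna installed:

```python
import math

class TrialPruned(Exception):
    """Stand-in for optuna.exceptions.TrialPruned, used here so the
    sketch runs without Optuna installed."""

def make_pruning_callback():
    def callback(score, epoch, steps):
        # model.fit() calls this after every evaluation round with the
        # evaluator's score, the current epoch, and the step count.
        print('callback:', score, epoch, steps)
        if math.isnan(score):
            # A NaN score means the trial has diverged; abort it early
            # instead of spending the remaining training time on it.
            raise TrialPruned()
    return callback

cb = make_pruning_callback()
cb(0.83, 0, 1000)  # a healthy score passes through without raising
```

Inside a real Optuna objective, the raised `TrialPruned` tells the study to discard the trial, which is where the time savings come from.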

zeno17 commented 4 years ago

Could you post your objective function as well?

PhilipMay commented 4 years ago

@zeno17 here is just the full code - feedback welcome. :-)

https://gist.github.com/PhilipMay/e4034f92e098aed1b72ce146c104a17f#file-all_nli_de-py

milmin commented 3 years ago

@nreimers @PhilipMay do you have any example of a valid callback for tracking train and validation loss? (Sorry, I'm quite new to PyTorch...)

nreimers commented 3 years ago

Hi @milmin, you can use the evaluator parameter and pass multiple evaluators by wrapping them in an evaluation.SequentialEvaluator(). There you can compute the various validation scores you want to track.

milmin commented 3 years ago

Thanks for your reply. Is there already such an example in the docs? It would definitely be very useful for newcomers, by the way :-)

nreimers commented 3 years ago

The usage is rather simple:

model.fit(evaluator=evaluation.SequentialEvaluator([evaluator1, evaluator2, evaluator3]), ...)

with evaluator1/2/3 being your different evaluators.

Note: by default, the value returned by the last evaluator is used to decide whether a new model checkpoint is saved.
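The reduction step can be sketched as follows. This is a stand-alone illustration, assuming the behavior described above: each wrapped evaluator returns one score, and a `main_score_function` turns the list into the single value used for checkpointing, with the default keeping the last score:

```python
# Sketch of how SequentialEvaluator reduces its sub-scores to one value.
# sequential_score is a stand-in for the library's internal reduction;
# the default described above corresponds to taking the last score.
def sequential_score(scores, main_score_function=lambda s: s[-1]):
    return main_score_function(scores)

scores = [0.71, 0.83]               # e.g. results of [evaluator1, evaluator2]
default = sequential_score(scores)  # 0.83: the last evaluator decides
averaged = sequential_score(scores, lambda s: sum(s) / len(s))  # mean instead
```

If the last-score default is not what you want, passing your own `main_score_function` (the parameter the doc quote below this thread refers to) lets you combine the scores differently, e.g. by averaging.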

milmin commented 3 years ago

What if I want to track train metrics as well?

nreimers commented 3 years ago

This is currently not supported.

You could create a new evaluator with some of your training data and your training loss and then compute the loss on it.
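The suggestion above can be sketched as a small "train loss evaluator": hold out a fixed slice of the training batches and report the loss on them at each evaluation step. `TrainLossEvaluator` is a hypothetical name; in the real library it would subclass `sentence_transformers.evaluation.SentenceEvaluator`, but a plain class and a fake loss function are used here so the sketch runs standalone:

```python
# Hypothetical evaluator that tracks training loss on a held-out slice
# of the training data, per the suggestion above.
class TrainLossEvaluator:
    def __init__(self, batches, loss_fn):
        self.batches = batches  # a fixed slice of the training data
        self.loss_fn = loss_fn  # the same loss object used in model.fit

    def __call__(self, model, output_path=None, epoch=-1, steps=-1):
        avg_loss = sum(self.loss_fn(model, b) for b in self.batches) / len(self.batches)
        # model.fit treats higher scores as better, so return the negative
        # loss if this evaluator's value should drive checkpointing.
        return -avg_loss

# Usage with stand-in objects (a fake loss, no real model):
fake_loss = lambda model, batch: abs(sum(batch)) * 0.1
ev = TrainLossEvaluator(batches=[[1, 2], [3, 4]], loss_fn=fake_loss)
score = ev(model=None)  # approximately -0.5
```

Wrapping such an evaluator together with the dev evaluator in a SequentialEvaluator would then report both values at each evaluation step.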

milmin commented 3 years ago

I tried something like evaluation.SequentialEvaluator([train_evaluator, dev_evaluator]) to track the train and validation scores at least at the evaluation steps; however, the evaluator returns only one value, as described in the doc:

All scores are passed to ‘main_score_function’, which derives one final score value

How is this final score value computed? Can you give some more details? I think it would be very useful to have one value for every evaluator at each evaluation step.

tempdeltavalue commented 9 months ago

Can I access the model being trained inside a custom callback function? Like in Keras: https://stackoverflow.com/questions/65302030/access-loss-and-model-in-a-custom-callback
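The fit() callback only receives `(score, epoch, steps)`, so unlike Keras there is no `self.model` to reach for. A common workaround is to capture the model in a closure; `make_callback` below is a hypothetical helper name, and a string stands in for the real model object so the sketch runs standalone:

```python
# The fit() callback is called as callback(score, epoch, steps) and gets
# no model reference, so capture the model in a closure instead:
def make_callback(model, history):
    def callback(score, epoch, steps):
        history.append((epoch, steps, score))
        # `model` is visible here, e.g. for saving an extra checkpoint:
        # model.save(f"checkpoint-{steps}")
        return score
    return callback

history = []
cb = make_callback(model="stand-in for the real model", history=history)
cb(0.9, 0, 500)  # records (0, 500, 0.9) in history
```

The same pattern gives the callback access to anything else in scope at construction time (optimizer settings, an Optuna trial, a logger, etc.).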