facebookresearch / nevergrad

A Python toolbox for performing gradient-free optimization
https://facebookresearch.github.io/nevergrad/
MIT License

EarlyStopping based no improvement interval #1560

Closed irumata closed 9 months ago

irumata commented 9 months ago

Types of changes

TL;DR

Early stopping when the loss has not improved during the last `tolerance_window` asks.

Example of use:

optimizer.register_callback("ask", ng.callbacks.EarlyStopping.no_improvement_stopper(no_imp_window))
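A minimal standalone sketch of the no-improvement rule this callback applies (the class and method names below are illustrative, not the PR's actual implementation; in nevergrad the check would be wrapped in an `EarlyStopping` callback registered on "ask"):

```python
class NoImprovementStopper:
    """Signal stopping when the best loss seen so far has not improved
    for `tolerance_window` consecutive reported losses."""

    def __init__(self, tolerance_window: int) -> None:
        self.tolerance_window = tolerance_window
        self._best = float("inf")          # best loss observed so far
        self._since_improvement = 0        # losses seen since last improvement

    def should_stop(self, loss: float) -> bool:
        if loss < self._best:
            self._best = loss
            self._since_improvement = 0    # reset the window on improvement
        else:
            self._since_improvement += 1
        return self._since_improvement >= self.tolerance_window


# Example: with a window of 3, stopping triggers after 3 non-improving losses.
stopper = NoImprovementStopper(tolerance_window=3)
for loss in [1.0, 0.9, 0.95, 0.95]:
    assert not stopper.should_stop(loss)
assert stopper.should_stop(0.95)  # third consecutive non-improvement
```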

Tested with `python3 -m pytest test_callbacks.py`.

Motivation and Context / Related issue

Provides early stopping based on no improvement, a frequent need in optimization.

Several open issues are related to this early stopping implementation: https://github.com/facebookresearch/nevergrad/issues/714 https://github.com/facebookresearch/nevergrad/issues/589

How Has This Been Tested (if it applies)

I've added unit tests covering it; run them with `python3 -m pytest test_callbacks.py`.

Checklist