marrlab / DomainLab

modular domain generalization: https://pypi.org/project/domainlab/
https://marrlab.github.io/DomainLab/
MIT License

Erm Hyper Init #843

Closed · MatteoWohlrapp closed this 3 months ago

MatteoWohlrapp commented 4 months ago

Added functionality to use ERM with the hyperparameter scheduling. As an alternative to adding the hyper_init and hyper_update methods to ERM, we could also add them to the a_model superclass, or check in the scheduler whether the methods exist before invoking them.
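A rough sketch of the two alternatives (the `hyper_init` / `hyper_update` names come from the scheduling interface mentioned above; the class, function signatures, and the `trainer=None` argument below are illustrative assumptions, not DomainLab's actual API):

```python
class AModel:
    """Stand-in for the a_model superclass (Option A: default no-op hooks)."""

    def hyper_init(self, functor_scheduler):
        # ERM has no penalty weights to warm up, so the default implementation
        # just builds the scheduler with nothing to schedule.
        return functor_scheduler(trainer=None)

    def hyper_update(self, epoch, fun_scheduler):
        # nothing to update for models without scheduled hyperparameters
        pass


def scheduler_step(model, epoch, fun_scheduler):
    """Option B: keep ERM untouched and guard the call inside the scheduler."""
    if hasattr(model, "hyper_update"):
        model.hyper_update(epoch, fun_scheduler)
```

Option A keeps the scheduler unaware of which models support scheduling; Option B keeps ERM untouched but moves that responsibility into the scheduler.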

codecov-commenter commented 3 months ago

Codecov Report

All modified and coverable lines are covered by tests :white_check_mark:

Project coverage is 90.90%. Comparing base (d2ac388) to head (976c25a).

Additional details and impacted files

```diff
@@              Coverage Diff               @@
##           mhof_dev_merge     #843      +/-   ##
==================================================
+ Coverage           90.77%   90.90%   +0.12%
==================================================
  Files                 137      137
  Lines                5853     5858       +5
==================================================
+ Hits                 5313     5325      +12
+ Misses                540      533       -7
```

| [Flag](https://app.codecov.io/gh/marrlab/DomainLab/pull/843/flags?src=pr&el=flags&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=marrlab) | Coverage Δ | |
|---|---|---|
| [unittests](https://app.codecov.io/gh/marrlab/DomainLab/pull/843/flags?src=pr&el=flag&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=marrlab) | `90.90% <100.00%> (+0.12%)` | :arrow_up: |

Flags with carried forward coverage won't be shown. [Click here](https://docs.codecov.io/docs/carryforward-flags?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=marrlab#carryforward-flags-in-the-pull-request-comment) to find out more.


MatteoWohlrapp commented 3 months ago

You introduced `flag_info` in your mhof_dev branch. Can you give a brief explanation? I don't think I fully understand the naming. I added it here because otherwise training was not possible; in train_fbopt_b.py it is set to `self.flag_setpoint_updated`.
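For reference, a purely hypothetical sketch of the naming question; apart from `flag_info`, `flag_setpoint_updated`, and `train_fbopt_b.py`, every name below is a placeholder rather than DomainLab's actual code:

```python
class DummyScheduler:
    def update(self, epoch, flag_info=False):
        # 'flag_info' is the scheduler-side name for the trainer-side
        # 'flag_setpoint_updated' signal, which is where the naming gets
        # confusing on first read.
        if flag_info:
            print(f"epoch {epoch}: setpoint was updated, adjusting schedule")


class DummyTrainer:
    """Stands in for the trainer defined in train_fbopt_b.py."""

    def __init__(self):
        self.flag_setpoint_updated = False
        self.scheduler = DummyScheduler()

    def after_epoch(self, epoch):
        # the trainer forwards its own flag under the name 'flag_info'
        self.scheduler.update(epoch, flag_info=self.flag_setpoint_updated)
```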