Description
To make Katib hyperparameter tuning easier to use, we want to keep steps_per_eval unchanged when the batch size changes.
Fixes # (issue)
Type of change
[x] Bug fix (non-breaking change which fixes an issue)
List all changes
set steps_per_eval = max(1, num_train_steps / total_eval_steps)
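The PR body does not include the implementation itself, so the intended computation is sketched below. This is an illustrative Python sketch, not the actual DeText code: compute_steps_per_eval is a hypothetical helper name, while num_train_steps, total_eval_steps, and steps_per_eval mirror the parameter names mentioned in this PR.

```python
# Illustrative sketch (not the actual DeText implementation): derive
# steps_per_eval from num_train_steps and a fixed total_eval_steps.

def compute_steps_per_eval(num_train_steps: int, total_eval_steps: int) -> int:
    """Return how many training steps to run between evaluations.

    Keeping total_eval_steps fixed means the number of evaluations (and thus
    the metrics reported to Katib) stays roughly the same even when a batch
    size change alters num_train_steps.
    """
    # Integer division; max(1, ...) guards against a zero interval when
    # num_train_steps < total_eval_steps.
    return max(1, num_train_steps // total_eval_steps)


# Example: a smaller batch size typically increases num_train_steps, but the
# evaluation count stays close to total_eval_steps.
assert compute_steps_per_eval(num_train_steps=1000, total_eval_steps=100) == 10
assert compute_steps_per_eval(num_train_steps=2000, total_eval_steps=100) == 20
assert compute_steps_per_eval(num_train_steps=50, total_eval_steps=100) == 1
```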
Testing
I set total_eval_steps in run_detext.sh and then ran bash run_detext.sh. steps_per_eval was then set to max(1, num_train_steps / total_eval_steps) as expected.
Checklist
[x] My code follows the style guidelines of this project
[x] I have performed a self-review of my own code
[x] I have commented my code, particularly in hard-to-understand areas
[x] I have made corresponding changes to the documentation
[x] My changes generate no new warnings
[x] I have added tests that prove my fix is effective or that my feature works
[x] New and existing unit tests pass locally with my changes
[x] Any dependent changes have been merged and published in downstream modules