Closed: simonrasmu closed this issue 2 years ago
The learning rate is set to 0.0001; I think you meant the beta value there, which is set to 1e-5. I will change the saving to use format(float('1.e-05'), 'f'), which should write it out as a plain decimal.
Regarding repeats: @rosaallesoe could say better whether we can reduce it.
It turns out that format(float('1.e-05'), 'f') saves the number as a string, so that approach can't be used, and I didn't find any other way to suppress scientific notation in this context. However, the beta values did not have any effect, so the value could simply be changed to 0.0001. Alternatively, in the example .yaml files we could just write 0.00001 instead of 1e-05.
You can also specify a float format, e.g. f"{1e-05:.5f}" will give '0.00001'.
@enryH I thought about this approach at the beginning and tried to implement it. However, f"{1e-05:.5f}" also produces a string ('0.00001'), so I think we should just stick with the approach from my last comment.
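For reference, a minimal sketch (plain Python, no YAML library involved) of why both formatting approaches discussed above end up as strings rather than numeric YAML values:

```python
# Both approaches return *strings*, not floats, which is why they can't be
# used directly to control how a float is rendered in a YAML dump.
a = format(float('1.e-05'), 'f')   # 'f' defaults to 6 decimal places
b = f"{1e-05:.5f}"                 # explicit 5-decimal precision

print(type(a).__name__, a)  # str 0.000010
print(type(b).__name__, b)  # str 0.00001
```

If the config files are written with PyYAML, one alternative (not tried in this thread) would be to register a custom float representer via yaml.add_representer so the dumper itself emits plain decimal notation instead of 1e-05.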
Yes, perfect.
When step 02 is complete, add

repeats: 5

to tuning_stability.yaml. This could potentially be decreased to 3 or 4 to save computational time. Could the learning rate also be reduced to only 0.0001, thus removing 1e-5?