yzhao062 / MetaOD

Automating Outlier Detection via Meta-Learning (Code, API, and Contribution Instructions)

About the optimisation learning rates #4

Closed · VConchello closed this issue 2 years ago

VConchello commented 2 years ago

What was the criterion used to choose the learning rates in core.py:118-136? It looks like the rate alternates between increasing and decreasing during the iterations, and that it starts out increasing. Is that intentional?

yzhao062 commented 2 years ago

"In METAOD, we employ two strategies that help stabilize the training. First, we leverage meta-feature based (rather than random) initialization. Second, we use cyclical learning rates that help escape saddle points for better local optima [43]."

[43] L. N. Smith. Cyclical learning rates for training neural networks. In WACV, pages 464–472. IEEE Computer Society, 2017.

We do use this technique for better training :)
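
For context, this is a minimal sketch of the triangular cyclical schedule described in [43], not the actual code in core.py; `base_lr`, `max_lr`, and `step_size` below are illustrative values, not MetaOD's settings. The rate rises linearly from `base_lr` to `max_lr` over `step_size` iterations, then falls back, which produces the alternating increase/decrease pattern you observed:

```python
import numpy as np

def cyclical_lr(iteration, base_lr=0.001, max_lr=0.006, step_size=20):
    """Triangular cyclical learning rate (Smith, 2017).

    Climbs linearly from base_lr to max_lr over step_size iterations,
    then descends back to base_lr, and repeats the cycle.
    """
    cycle = np.floor(1 + iteration / (2 * step_size))
    x = np.abs(iteration / step_size - 2 * cycle + 1)
    return base_lr + (max_lr - base_lr) * max(0.0, 1.0 - x)

# The schedule starts by increasing, peaks at step_size, and then decreases,
# matching the behaviour the question describes.
for it in range(0, 81, 10):
    print(it, round(cyclical_lr(it), 5))
```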

VConchello commented 2 years ago

Mmmh, I see. Thank you for answering so quickly and clearly :)