Gaussian Process Based Optimization
This project demonstrates hyperparameter tuning using Bayesian Global Optimization.
Details:
- Convolutional Neural Network (3 Conv2D + 3 Dense layers)
- Street View House Number Dataset: Link
- Hyperparameters under consideration:
  1. Starting learning rate
  2. Learning rate decay
  3. Mini-batch size
  4. Dropout parameter p1 for the first fully connected layer
  5. Dropout parameter p2 for the second fully connected layer
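The five hyperparameters above can be encoded as a bounded search space for the optimizer. A minimal sketch in Python; the bounds and scales below are illustrative assumptions, not the ranges used in the project:

```python
import math
import random

# Illustrative search space for the five hyperparameters.
# All bounds/scales are assumptions for demonstration only.
SEARCH_SPACE = {
    "learning_rate": (1e-4, 1e-1, "log"),   # starting learning rate
    "lr_decay": (0.8, 1.0, "linear"),       # multiplicative decay per epoch
    "batch_size": (32, 256, "int"),         # mini-batch size
    "dropout_p1": (0.1, 0.7, "linear"),     # first fully connected layer
    "dropout_p2": (0.1, 0.7, "linear"),     # second fully connected layer
}

def sample_config(rng=random):
    # Draw one random configuration, respecting each parameter's scale.
    config = {}
    for name, (lo, hi, scale) in SEARCH_SPACE.items():
        if scale == "log":
            config[name] = math.exp(rng.uniform(math.log(lo), math.log(hi)))
        elif scale == "int":
            config[name] = rng.randint(int(lo), int(hi))
        else:
            config[name] = rng.uniform(lo, hi)
    return config
```

Random samples from this space can serve both as the initial design for the optimizer and as a random-search baseline.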
Tasklist:
- [X] Boilerplate code
- [X] Mid-term Report
- [ ] Implement Bayesian Global Optimization: Link
- [ ] Hyperparameter tuning
- [ ] Performance Analysis
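As a reference point for the "Implement Bayesian Global Optimization" task, the loop described in Snoek et al. can be sketched in a few lines: fit a Gaussian-process surrogate to past evaluations, then evaluate the objective where expected improvement is largest. This is a minimal 1-D sketch on a toy objective, not the project's implementation; the RBF length scale, grid acquisition maximization, and toy loss function are all assumptions:

```python
import math
import numpy as np

def rbf_kernel(A, B, length_scale=0.2):
    # Squared-exponential kernel between the rows of A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior(X, y, X_star, noise=1e-6):
    # Gaussian-process posterior mean/stddev at test points X_star.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    K_s = rbf_kernel(X, X_star)
    K_inv = np.linalg.inv(K)
    mu = K_s.T @ K_inv @ y
    var = np.diag(rbf_kernel(X_star, X_star) - K_s.T @ K_inv @ K_s)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, y_best):
    # EI acquisition for minimization: expected gain over the incumbent.
    z = (y_best - mu) / sigma
    Phi = 0.5 * (1.0 + np.array([math.erf(v / math.sqrt(2.0)) for v in z]))
    phi = np.exp(-0.5 * z**2) / math.sqrt(2.0 * math.pi)
    return (y_best - mu) * Phi + sigma * phi

def bayes_opt(f, lo, hi, n_init=3, n_iter=10, seed=0):
    # Minimize f over [lo, hi]: random initial design, then
    # repeatedly evaluate where expected improvement peaks.
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, size=(n_init, 1))
    y = np.array([f(x[0]) for x in X])
    grid = np.linspace(lo, hi, 200)[:, None]
    for _ in range(n_iter):
        mu, sigma = gp_posterior(X, y, grid)
        x_next = grid[np.argmax(expected_improvement(mu, sigma, y.min()))]
        X = np.vstack([X, x_next])
        y = np.append(y, f(x_next[0]))
    best = np.argmin(y)
    return X[best, 0], y[best]

# Toy stand-in for "validation loss as a function of one hyperparameter".
x_best, loss_best = bayes_opt(lambda x: (x - 0.3)**2, 0.0, 1.0)
```

In the project itself, each objective evaluation would be a full training run of the CNN, and the search space is the five-dimensional one listed above rather than a 1-D interval.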
References:
- J. Snoek, H. Larochelle, and R. P. Adams, “Practical Bayesian optimization of machine learning algorithms,” in Advances in Neural Information Processing Systems, 2012, pp. 2951–2959.
- K. Weinberger, “Bayesian global optimization,” http://www.cs.cornell.edu/courses/cs4780/2018fa/lectures/lecturenote15.html, 2018.
Contributors:
- Ankush Tale
- Shubham Shah
- Amey Athale
- Uday PB