JeisonPham / ECE-285-Project


Test out tunable parameters that we can apply evolutionary algorithm to #5

Open JacobGlennAyers opened 1 year ago

JacobGlennAyers commented 1 year ago

We don't want to settle on a certain kind of evolutionary algorithm without knowing what modifications are allowable given that we are working with a convex network.

JacobGlennAyers commented 1 year ago

Jacob has some intuition about Neural Architecture Search algorithms, but Jason knows more about convex networks, so this will require both heads.

JacobGlennAyers commented 1 year ago

It seems that using Neural Architecture Search that involves adding layers runs the inherent risk of making the final deeper network no longer convex.

However, that leaves hyperparameters such as the learning rate, vNN solver, regularization, and weight initialization open to a basic binary-tree-style evolutionary algorithm!
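A minimal sketch of what such a loop could look like, assuming a caller-supplied `train_and_score(config)` helper that builds and trains one convex model and returns a validation score; the parameter names and value ranges below are placeholders, not the final tuning list:

```python
import random

# Placeholder search space; the real options (vNN solver choices, etc.) come from the project.
SEARCH_SPACE = {
    "learning_rate": [1e-3, 1e-2, 1e-1],
    "solver": ["solver_a", "solver_b"],          # stand-ins for the vNN solver options
    "regularization": [0.0, 1e-4, 1e-2],
    "weight_init": ["xavier", "kaiming", "zeros"],
}

def random_config():
    """Sample one hyperparameter configuration uniformly from the search space."""
    return {k: random.choice(v) for k, v in SEARCH_SPACE.items()}

def mutate(config, p=0.25):
    """Resample each field with probability p; the network itself stays fixed (and convex)."""
    return {k: (random.choice(SEARCH_SPACE[k]) if random.random() < p else v)
            for k, v in config.items()}

def evolve(train_and_score, generations=5, pop_size=8, keep=3):
    """Basic (mu + lambda)-style loop: evaluate, keep the best, refill by mutation.

    In practice, cache scores so surviving configs are not retrained each generation.
    """
    population = [random_config() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=train_and_score, reverse=True)
        parents = scored[:keep]
        children = [mutate(random.choice(parents)) for _ in range(pop_size - keep)]
        population = parents + children
    return max(population, key=train_and_score)
```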

JacobGlennAyers commented 1 year ago

With that being said, we can take a "convex-relaxation" approach where we build out a deep network that is guided by convex methods.

JacobGlennAyers commented 1 year ago

Make use of the Convex Factory Method to build out several possible child models. The search space might be small enough that we can reasonably enumerate it.

I should create several lists, one per tuned parameter, and then take their Cartesian product to cover the full combinatorial search space. These combinations should be collected into a pandas DataFrame; from there we can train each combination and record the important training metrics in separate DataFrame columns, along with the test metrics. A sketch of this is shown below.
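A rough sketch of that bookkeeping, with a placeholder `train_and_evaluate(**params)` standing in for the real routine that builds a model via the Convex Factory Method and trains it; parameter names and values here are illustrative only:

```python
import itertools
import pandas as pd

# One list per tuned parameter (illustrative values only).
learning_rates = [1e-3, 1e-2, 1e-1]
solvers = ["solver_a", "solver_b"]            # stand-ins for the vNN solver options
regularizations = [0.0, 1e-4, 1e-2]
weight_inits = ["xavier", "kaiming"]

def train_and_evaluate(**params):
    """Placeholder: in the real project this would build a model via the Convex
    Factory Method, train it (~3 minutes), and return the metrics of interest."""
    return {"train_loss": float("nan"), "val_acc": float("nan"), "test_acc": float("nan")}

# Cartesian product = every combination we want to train.
combos = list(itertools.product(learning_rates, solvers, regularizations, weight_inits))
results = pd.DataFrame(combos, columns=["learning_rate", "solver", "regularization", "weight_init"])

# Train each combination and collect its metrics.
rows = [train_and_evaluate(**row.to_dict()) for _, row in results.iterrows()]

# Attach the metrics as additional columns next to the hyperparameters.
results = pd.concat([results, pd.DataFrame(rows)], axis=1)
results.to_csv("sweep_results.csv", index=False)
```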

Just keep in mind that each model will take ~3 minutes to train, according to Jason.
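For a rough sense of scale (illustrative numbers only, not from the thread): with the four placeholder parameter lists above, the product space is 3 × 2 × 3 × 2 = 36 combinations, so an exhaustive sweep at ~3 minutes per model is roughly 36 × 3 ≈ 108 minutes of sequential training, which suggests full enumeration is feasible.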