amplab / SparkNet

Distributed Neural Networks for Spark
MIT License

Hyperparameter optimization #111

Closed: bhack closed this issue 4 years ago

bhack commented 8 years ago

Do you plan to add something similar to https://github.com/maxpumperla/hyperas?

robertnishihara commented 8 years ago

Hyperparameter optimization is a sensible way to use a cluster, so it's worth supporting well. Currently, if people load a model in SparkNet from a Caffe .caffemodel or a TensorFlow .pb file, then the net architecture will already be specified in the model file. Hyperparameter optimization becomes easier if the users specify the models in Scala, but that seems like a less common use case.
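The idea above — each hyperparameter configuration trains independently, so a cluster can evaluate many in parallel — can be sketched roughly as follows. This is not the SparkNet API; a thread pool stands in for Spark workers, and a hypothetical `train_and_eval` toy loss stands in for actual model training.

```python
# Illustrative sketch only: parallel hyperparameter grid search.
# In SparkNet the per-configuration work would run as Spark tasks;
# here a local thread pool plays that role.
from concurrent.futures import ThreadPoolExecutor
from itertools import product

def train_and_eval(config):
    # Hypothetical stand-in for training a net and returning its
    # validation loss; a real version would build and train a model.
    lr, hidden = config
    loss = (lr - 0.01) ** 2 + (hidden - 128) ** 2 / 1e6
    return config, loss

def grid_search(learning_rates, hidden_sizes):
    # Every (lr, hidden) pair is independent, so all of them can be
    # evaluated concurrently and the best one selected at the end.
    grid = list(product(learning_rates, hidden_sizes))
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(train_and_eval, grid))
    return min(results, key=lambda r: r[1])

if __name__ == "__main__":
    best_config, best_loss = grid_search([0.1, 0.01, 0.001], [64, 128, 256])
    print(best_config)
```

Because the configurations never communicate, this pattern scales out trivially, which is why hyperparameter search is such a natural fit for a cluster.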

bhack commented 8 years ago

Yes, but there are also some emerging approaches that go beyond the traditional boundaries of hyperparameter search, like what we started to discuss at https://github.com/maxpumperla/hyperas/issues/10