prolearner / hypertorch


Hyperparameter Optimization on MLP #7

Closed andeskyl closed 2 years ago

andeskyl commented 2 years ago

Hello, I am an undergraduate student trying to perform hyperparameter optimization on a simple MLP with the MNIST dataset. I started building from the code in logistic_regression.ipynb, but I found that the conversion is not trivial. Would you mind giving me some hints on how to do that? Thanks a lot!

prolearner commented 2 years ago

If you want to use an MLP, or any other neural network, as the inner model, you should probably look at the iMAML.py file or directly at higher, which is used to get stateless versions of torch nn.Module-s. Unfortunately, PyTorch does not provide a functional/stateless version of its modules, so higher is used to achieve this for any nn.Module.
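Roughly, the idea is something like the sketch below (a minimal example I am writing here, not code from the repo; the MLP layer sizes and the batch are just illustrative): higher.monkeypatch gives you a stateless copy of the module, so the parameters become an explicit list of tensors that you can treat as the inner variables of the bilevel problem.

```python
import torch
import torch.nn as nn
import higher  # pip install higher

# A small MLP for MNIST (28*28 inputs, 10 classes); sizes are illustrative.
mlp = nn.Sequential(
    nn.Flatten(),
    nn.Linear(28 * 28, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Stateless ("functional") copy of the module: parameters are no longer stored
# inside the module but passed explicitly at each forward call.
fmodel = higher.monkeypatch(mlp, copy_initial_weights=True)

# Inner parameters as an explicit list of tensors.
params = [p.detach().clone().requires_grad_(True) for p in mlp.parameters()]

x = torch.randn(32, 1, 28, 28)       # dummy MNIST-like batch
logits = fmodel(x, params=params)     # forward pass with explicit parameters
```

From there you can plug `fmodel` and `params` into the inner/outer loops in the same way the iMAML.py example does with its meta-model.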

The alternative is to redefine the neural network in functional form using torch.nn.Parameter tensors and PyTorch's built-in functional operations, but this can be time-consuming for complex networks.
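For completeness, this is roughly what the manual functional form looks like (again just a sketch with illustrative sizes, not code from the repo):

```python
import torch
import torch.nn.functional as F

# Manually defined parameters for a one-hidden-layer MLP (illustrative sizes).
w1 = torch.nn.Parameter(torch.randn(256, 28 * 28) * 0.01)
b1 = torch.nn.Parameter(torch.zeros(256))
w2 = torch.nn.Parameter(torch.randn(10, 256) * 0.01)
b2 = torch.nn.Parameter(torch.zeros(10))
params = [w1, b1, w2, b2]

def mlp_forward(x, params):
    """Functional MLP: the parameter list is an explicit argument."""
    w1, b1, w2, b2 = params
    h = F.relu(F.linear(x.view(x.size(0), -1), w1, b1))
    return F.linear(h, w2, b2)

x = torch.randn(32, 1, 28, 28)   # dummy MNIST-like batch
logits = mlp_forward(x, params)
```

As you can see, you have to write out every layer by hand, which is why higher is usually the more convenient route for anything beyond a small MLP.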