gbaydin / hypergradient-descent

Hypergradient descent

Make hypergrad pip-installable #3

Closed neighthan closed 5 years ago

neighthan commented 5 years ago

I was looking at your paper today and wanted to try it on a model of my own. Thanks for providing the code! However, I think it would be more accessible if it were pip-installable, and this PR adds just that. Note that you don't even need to upload anything to PyPI: users can install with pip directly from your GitHub repo, as long as you provide the necessary `setup.py` and structure things accordingly.

All I've done here is create a basic `setup.py` and move the `adam_hd` and `sgd_hd` files into the folder that will be installed. I import both of them in the `__init__` to make imports as easy as possible for users. I've also edited the README to show how to install the package (referencing your GitHub repo, not mine) and updated the two places I found where the HD optimizers are imported or referenced: the README and `train.py` (the imports now match what you would do after pip-installing the package).
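For concreteness, a minimal `setup.py` along these lines is all that's needed; the package name `hypergrad`, the version, and the metadata below are illustrative assumptions rather than the exact contents of this PR:

```python
# setup.py -- minimal sketch; names and metadata are assumptions, not
# necessarily what this PR contains.
from setuptools import setup, find_packages

setup(
    name="hypergrad",
    version="0.1.0",
    description="Hypergradient descent optimizers (SGD-HD, Adam-HD) for PyTorch",
    url="https://github.com/gbaydin/hypergradient-descent",
    packages=find_packages(),        # picks up the installable folder with its __init__.py
    install_requires=["torch"],      # assumed runtime dependency
)
```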

I didn't put the vgg or train code into the installable package, since those aren't what users will need to import in their own projects, but you could easily move things around if you prefer. The install-and-import flow the updated README describes looks roughly like the sketch below.
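This is only a sketch of the intended usage; the optimizer class name `SGDHD` and the `hypergrad_lr` keyword are assumptions for illustration, so check the package's `__init__` for the actual exports:

```python
# Install directly from the GitHub repo (no PyPI upload needed):
#   pip install git+https://github.com/gbaydin/hypergradient-descent.git
#
# SGDHD and hypergrad_lr below are illustrative assumptions.
import torch
from hypergrad import SGDHD

model = torch.nn.Linear(10, 1)
optimizer = SGDHD(model.parameters(), lr=1e-3, hypergrad_lr=1e-4)

# From here it behaves like any other PyTorch optimizer:
x, y = torch.randn(32, 10), torch.randn(32, 1)
loss = torch.nn.functional.mse_loss(model(x), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```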

calclavia commented 5 years ago

Would be nice if this was merged!

gbaydin commented 5 years ago

This is great, @neighthan! Thank you very much, I'm merging now. Sorry for the delay in noticing this pull request.