zadaianchuk / HypergradientDescent

TensorFlow implementation of the Hypergradient Descent modification of the Adam algorithm

"tfobs" package #1

Open · filipre opened this issue 6 years ago

filipre commented 6 years ago

Hey Andrii,

I tried to get your project running, but I do not have the "tfobs" package and I cannot find any information about it either. Is it open source? It looks very useful! I tried conda and pip without any success.

Best regards, René

zadaianchuk commented 6 years ago

Hi René, unfortunately it is still in development, so it is not possible to install it yet. I expect it will be open sourced once it is ready for external use. The run file in this repo is given as an example of how to use the HyperGradient optimizer. I can add a full MNIST model and a run script that does not depend on the tfobs package as another usage example.
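For reference, the update rule behind an Adam-HD style optimizer (hypergradient descent applied to Adam's learning rate, as described in the hypergradient descent paper) can be sketched in plain NumPy roughly as follows. The function and variable names here are illustrative, not the package's API:

```python
# Minimal NumPy sketch of an Adam step whose learning rate is itself adapted
# by gradient descent on the hypergradient (grad . previous update direction).
# Names are my own, not the repo's API.
import numpy as np

def adam_hd_step(theta, grad, state, alpha, beta=1e-7,
                 beta1=0.9, beta2=0.999, eps=1e-8):
    m, v, u_prev, t = state
    t += 1

    # Hypergradient: derivative of the loss w.r.t. the learning rate,
    # approximated as grad . u_prev (last step's update direction).
    h = np.dot(grad, u_prev)
    alpha = alpha - beta * h

    # Standard Adam moment estimates with bias correction.
    m = beta1 * m + (1.0 - beta1) * grad
    v = beta2 * v + (1.0 - beta2) * grad ** 2
    m_hat = m / (1.0 - beta1 ** t)
    v_hat = v / (1.0 - beta2 ** t)

    # Update direction and parameter step with the adapted learning rate.
    u = -m_hat / (np.sqrt(v_hat) + eps)
    theta = theta + alpha * u
    return theta, (m, v, u, t), alpha

# Tiny usage check on the quadratic f(theta) = 0.5 * ||theta||^2 (grad = theta).
theta = np.ones(3)
state = (np.zeros(3), np.zeros(3), np.zeros(3), 0)
alpha = 1e-3
for _ in range(100):
    theta, state, alpha = adam_hd_step(theta, theta, state, alpha)
```

When consecutive update directions stay aligned with the gradient, the hypergradient term increases alpha; when they start to oppose each other, it shrinks alpha.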

filipre commented 6 years ago

Oh, that would be very nice. I am not exactly sure what format the training and test data come in. I tried to reproduce the results of that paper myself, but your implementation already looks quite sophisticated.

It looks like the tfobs package tries to unify different test problems so that the user can switch between them quickly, right? Is there a planned release date?

zadaianchuk commented 6 years ago

I added an mnist folder that uses the TensorFlow tutorial model with the Adam-HD optimizer. It should be easy to run; if not, look at the MNIST tutorial for details.

Yes, you are right. For now there is no planned release date, but I will write to you if that changes.
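For anyone else landing here, the rough shape of such a run, based on the TensorFlow 1.x MNIST softmax tutorial, is sketched below. The `AdamHDOptimizer` import, class name, and its `alpha_0`/`beta` arguments are assumptions for illustration only, not the repo's actual API; check the mnist folder for the real names. The commented line shows where it would replace the standard Adam optimizer:

```python
# Sketch of plugging an Adam-HD style optimizer into the TF 1.x MNIST tutorial.
import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data
# from hypergradient_descent import AdamHDOptimizer  # hypothetical import

mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)

# Simple softmax-regression model from the tutorial.
x = tf.placeholder(tf.float32, [None, 784])
y_ = tf.placeholder(tf.float32, [None, 10])
W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
y = tf.matmul(x, W) + b

cross_entropy = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits_v2(labels=y_, logits=y))

# Swap the standard optimizer for the Adam-HD one; alpha_0 would be the initial
# learning rate and beta the hypergradient step size (argument names assumed).
# train_step = AdamHDOptimizer(alpha_0=1e-3, beta=1e-7).minimize(cross_entropy)
train_step = tf.train.AdamOptimizer(1e-3).minimize(cross_entropy)  # baseline

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(1000):
        batch_xs, batch_ys = mnist.train.next_batch(100)
        sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})
```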

filipre commented 6 years ago

Nice, looking forward to it. 🙂