-
## Description
I built MXNet from source with the C++ package enabled, but the static library libmxnet.a is not available.
## To Reproduce
```
root@5e2873bd79c3:/mxnet2# cp config/linux.cmake config.cmake…
```
nazbn updated
2 years ago
-
May I ask what the best available optimizer here is for training Generative Adversarial Networks?
Is such a benchmark available anywhere? Only AdaBelief explicitly mentions GANs in…
-
## 🚀 Feature
Add Adabelief optimizer to the C++ API
## Motivation
I have been using this new optimizer recently with the Python API and found out it worked extremely well (for my use cases at lea…
-
Hi!
I had some trouble using AdaBelief for simple LSTM training.
What could be the reason for this?
CODE:
```
import tensorflow as tf
from adabelief_tf import AdaBeliefOptimizer
tf.keras.backend.clear_session()
multivari…
```
-
Thank you very much for your work on this project! It really is an excellent contribution to provide an up-to-date version of AdamW that allows layer-dependent learning rates. I'm wondering what you…
-
Hi.
The eps hyper-parameter is set to 1e-3 by default in the [current implementation](https://github.com/jettify/pytorch-optimizer/blob/1b505fac39446dbedf3266016e6a2c8090106670/torch_optimizer/ada…
sidml updated
3 years ago
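For context, here is a minimal pure-Python sketch of where `eps` enters the AdaBelief update, following the rule in the AdaBelief paper for a single scalar parameter (the function name and defaults are illustrative, not the library's API). Note that `eps` is added both inside the second-moment estimate `s` and to the denominator, which is why its default value directly shifts the effective step size:

```python
import math

def adabelief_step(p, g, m, s, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One AdaBelief update for a scalar parameter p with gradient g.

    Hypothetical helper for illustration; m and s are the running first
    moment and the running "belief" (deviation of g from m), t >= 1.
    """
    m = beta1 * m + (1 - beta1) * g                   # EMA of gradients
    s = beta2 * s + (1 - beta2) * (g - m) ** 2 + eps  # EMA of (g - m)^2; eps added here too
    m_hat = m / (1 - beta1 ** t)                      # bias correction
    s_hat = s / (1 - beta2 ** t)
    p = p - lr * m_hat / (math.sqrt(s_hat) + eps)     # eps also appears in the denominator
    return p, m, s
```

For example, minimizing f(p) = p^2 (gradient 2p) from p = 1.0 drives p toward zero over a few hundred steps; raising `eps` from 1e-8 toward 1e-3 noticeably damps the adaptive step.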
-
Have you ever tested AdaBelief for fine-tuning BERT models? And what are the recommended hyper-parameters?
-
Hey @juntang-zhuang
I'm just trying out the optimisers and noticed that `ranger-adabelief` has a couple of [debug prints in](https://github.com/juntang-zhuang/Adabelief-Optimizer/blob/master/pypi…
-
Try switching to AdaBelief or similar.
*****************************************
Experiments showed that Adam performs best when a scheduler is used, so we will stick with Adam.
-