ebetica/autogradpp: Direct C++ Interface to PyTorch
Adagrad optimizer #28 (Closed)
eugene-kharitonov closed this 6 years ago

eugene-kharitonov commented 6 years ago:
Doesn't support sparse tensors (just like, e.g., the Adam implementation).
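For context, here is a minimal sketch of the dense Adagrad update rule this adds; the names (`param`, `grad`, `state_sum`, `lr`, `eps`) are illustrative and not taken from the actual implementation:

```cpp
#include <cmath>
#include <vector>

// Illustrative dense Adagrad step: accumulate squared gradients per
// parameter and scale the learning rate by the accumulated magnitude.
// Names are illustrative only, not the library's API.
void adagrad_step(std::vector<double>& param,
                  const std::vector<double>& grad,
                  std::vector<double>& state_sum,
                  double lr, double eps = 1e-10) {
  for (size_t i = 0; i < param.size(); ++i) {
    state_sum[i] += grad[i] * grad[i];  // G_i += g_i^2
    param[i] -= lr * grad[i] / (std::sqrt(state_sum[i]) + eps);
  }
}
```

A sparse-aware version would update only the accumulator entries whose gradients are nonzero, which is the extra handling left out here (as in the Adam implementation).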
Generalized the XOR unit test to accept other optimizers/models, so that Adagrad and SGD can be tested with shared code (not sure what the best unit test would be; feel free to suggest another way to test).
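One possible shape for the shared test, sketched with hypothetical `Model`/`Optimizer` interfaces and factory callbacks; the real test would construct the XOR model and optimizers through autogradpp's own types, which are not reproduced here:

```cpp
#include <functional>
#include <memory>
#include <vector>

// Hypothetical stand-ins for the library's model and optimizer types.
struct Model {
  virtual std::vector<double> forward(const std::vector<double>& x) = 0;
  virtual ~Model() = default;
};
struct Optimizer {
  virtual void step() = 0;
  virtual ~Optimizer() = default;
};

// Shared XOR test body, parameterized on how the model and optimizer are
// constructed, so Adagrad and SGD exercise identical training code.
void runXorTest(const std::function<std::unique_ptr<Model>()>& makeModel,
                const std::function<std::unique_ptr<Optimizer>(Model&)>& makeOptim) {
  auto model = makeModel();
  auto optim = makeOptim(*model);
  // ... train on the four XOR points, then assert the final loss is small ...
}

// Usage (factories are hypothetical):
//   runXorTest(makeMlp, makeAdagrad);
//   runXorTest(makeMlp, makeSgd);
```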