-
### Have you completed your first issue?
- [x] I have completed my first issue
### Guidelines
- [x] I have read the guidelines
- [x] I have the link to my latest merged PR
### Latest Merged PR Lin…
-
I have a question about the B partitions to use. From what I understood, these data partitions must be recomputed at every training epoch; in that sense, do the data that are in these …
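If I follow the question, the usual practice is indeed to reshuffle and re-split the data into the B partitions at the start of every epoch, so each partition sees different samples each time. A minimal NumPy sketch of that loop (the toy data, `B`, and epoch count are illustrative, not from any specific codebase):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))   # toy dataset: 100 samples, 3 features
B = 5                           # number of partitions (mini-batches)

for epoch in range(3):
    # reshuffle once per epoch, so the B partitions are recomputed each time
    perm = rng.permutation(len(X))
    batches = np.array_split(perm, B)
    for batch_idx in batches:
        batch = X[batch_idx]
        # ... compute the gradient on `batch` and update parameters here ...
```

Every sample still appears exactly once per epoch; only its partition assignment changes between epochs.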
-
A very helpful optimization technique that is time-efficient and even reduces space complexity. It comes in handy when gradient descent has a relatively fast learni…
-
Currently, we support batch gradient descent (e.g., LogisticRegression), and stochastic gradient descent (e.g., ``SGDClassifier/SGDRegressor``), but we do not support Mini-Batch gradient descent (``SG…
-
-
In the PDF, it looks like the "Mini-batch Stochastic Gradient Descent" explanation is repeated from the SGD section.
sri9s updated 5 years ago
-
We should definitely add Stein variational gradient descent ([paper](https://proceedings.neurips.cc/paper/2016/file/b3ba8f1bee1238a2f37603d90b58898d-Paper.pdf), [code](https://github.com/blackjax-devs…
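For reference, the core SVGD update from the paper can be sketched in a few lines of NumPy; the RBF bandwidth `h`, step size `eps`, and the standard-normal target are illustrative choices, not taken from the linked code:

```python
import numpy as np

def rbf_kernel(particles, h):
    # pairwise differences, squared distances, and the RBF kernel matrix
    diff = particles[:, None, :] - particles[None, :, :]   # (n, n, d)
    sq = (diff ** 2).sum(-1)
    K = np.exp(-sq / (2 * h ** 2))
    # grad_K[j, i] = grad_{x_j} k(x_j, x_i) = -(x_j - x_i) / h^2 * k(x_j, x_i)
    grad_K = -diff * (K / h ** 2)[:, :, None]
    return K, grad_K

def svgd_step(particles, grad_log_p, h=1.0, eps=0.1):
    # phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ]
    n = len(particles)
    K, grad_K = rbf_kernel(particles, h)
    phi = (K.T @ grad_log_p(particles) + grad_K.sum(axis=0)) / n
    return particles + eps * phi

rng = np.random.default_rng(0)
particles = rng.normal(loc=5.0, size=(50, 1))      # start far from the target
for _ in range(500):
    particles = svgd_step(particles, lambda x: -x)  # target: standard normal
```

The first term pulls particles toward high-density regions; the kernel-gradient term repels them from each other, which is what keeps SVGD from collapsing to the mode.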
-
I would like to suggest adding optimization functions and machine learning / deep learning algorithms to this repository. These could go in a separate folder called `machine_learning`. Furthe…
-
Hi! I don't know if I got it right from reading the documentation and examples. My question is: in order to train a neural network in full-batch mode (that is, using all the available instances), is …
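As far as I can tell, full-batch training just means every parameter update uses the gradient averaged over the entire training set, rather than over a sampled subset. A minimal sketch with plain NumPy logistic regression (the toy data, learning rate, and iteration count are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = (X[:, 0] > 0).astype(float)       # toy separable labels

w = np.zeros(2)
lr = 0.5
for _ in range(200):
    # full-batch: every update uses ALL instances at once
    p = 1 / (1 + np.exp(-X @ w))      # predictions on the whole training set
    grad = X.T @ (p - y) / len(X)     # gradient averaged over all samples
    w -= lr * grad

preds = 1 / (1 + np.exp(-X @ w)) > 0.5
acc = (preds == y.astype(bool)).mean()
```

In frameworks that expose a batch-size parameter, the equivalent is typically setting the batch size equal to the size of the training set.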
-
Add different gradient descent techniques, such as stochastic gradient descent and mini-batch gradient descent.
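All three variants differ only in how many samples each update sees, so one implementation with a `batch_size` knob can cover them. A sketch on a linear-regression toy problem (the data, learning rate, and epoch count are illustrative):

```python
import numpy as np

def gradient_descent(X, y, batch_size, lr=0.1, epochs=50, seed=0):
    """Linear regression fit by gradient descent.

    batch_size == 1        -> stochastic gradient descent
    1 < batch_size < len(X) -> mini-batch gradient descent
    batch_size == len(X)   -> (full) batch gradient descent
    """
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        perm = rng.permutation(len(X))            # reshuffle each epoch
        for start in range(0, len(X), batch_size):
            idx = perm[start:start + batch_size]
            err = X[idx] @ w - y[idx]
            w -= lr * X[idx].T @ err / len(idx)   # average gradient on the batch
    return w

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
true_w = np.array([2.0, -3.0])
y = X @ true_w                                    # noiseless toy targets

w_sgd  = gradient_descent(X, y, batch_size=1)
w_mini = gradient_descent(X, y, batch_size=32)
w_full = gradient_descent(X, y, batch_size=len(X))
```

On this noiseless problem all three recover `true_w`; the trade-off in practice is updates-per-epoch versus gradient noise.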