-
### Have you completed your first issue?
- [x] I have completed my first issue
### Guidelines
- [x] I have read the guidelines
- [x] I have the link to my latest merged PR
### Latest Merged PR Lin…
-
I have a question about the B partitions to use. From what I understood, these data partitions must be recomputed at every training epoch; in that sense, will the data in these …
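A common convention is indeed to reshuffle the data and re-split it into the B partitions at the start of every epoch, so each epoch trains on a different partitioning of the same examples. A minimal sketch of that pattern (the function name and the toy data are illustrative, not from any particular library):

```python
import numpy as np

def epoch_partitions(X, y, B, rng):
    """Reshuffle the data and split it into B mini-batches.

    Calling this at the start of each epoch means every epoch
    trains on a fresh partition of the same data.
    """
    order = rng.permutation(len(X))
    return [(X[part], y[part]) for part in np.array_split(order, B)]

rng = np.random.default_rng(0)
X = np.arange(20.0).reshape(10, 2)
y = np.arange(10)

for epoch in range(3):                     # the split differs per epoch
    for X_b, y_b in epoch_partitions(X, y, B=4, rng=rng):
        pass                               # one gradient step per batch
```

Every example still appears exactly once per epoch; only the grouping into batches changes.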
-
Currently, we support batch gradient descent (e.g., ``LogisticRegression``) and stochastic gradient descent (e.g., ``SGDClassifier``/``SGDRegressor``), but we do not support mini-batch gradient descent (``SG…
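Until such an estimator exists, a mini-batch-style loop can be approximated with ``partial_fit``, which runs SGD updates on whatever chunk it receives. A rough sketch (the synthetic data and batch size are illustrative):

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # simple separable labels

clf = SGDClassifier(random_state=0)
batch_size = 100
for start in range(0, len(X), batch_size):
    stop = start + batch_size
    # `classes` is required on the first call to partial_fit
    clf.partial_fit(X[start:stop], y[start:stop], classes=np.array([0, 1]))
```

Note this is per-sample SGD applied chunk by chunk, not a true averaged mini-batch update, which is presumably the gap the issue is about.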
-
In the PDF, it looks like the "Mini batch Stochastic Gradient Descent" explanation is repeated from the SGD section.
sri9s updated 5 years ago
-
Hi! I don't know if I got it right from reading the documentation and examples. However, my question is: in order to train a neural network in full batch mode (that is, using all the available instances), is …
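For context, full-batch mode simply means every gradient update is computed over all available instances (equivalently, a batch size equal to the dataset size). A minimal sketch with a single logistic unit (the data, step size, and iteration count are illustrative):

```python
import numpy as np

def full_batch_step(w, X, y, lr):
    """One gradient step computed over every instance (full-batch mode)."""
    p = 1.0 / (1.0 + np.exp(-(X @ w)))   # predictions on the whole dataset
    grad = X.T @ (p - y) / len(y)        # average gradient over all instances
    return w - lr * grad

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = (X @ np.array([1.0, -2.0, 0.5]) > 0).astype(float)

w = np.zeros(3)
for _ in range(500):                     # one update per pass over all data
    w = full_batch_step(w, X, y, lr=0.5)
```

In frameworks that take a batch-size argument, setting it to the number of training instances gives the same behavior.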
-
Add different gradient descent techniques, such as stochastic gradient descent and mini-batch gradient descent.
-
I would like to suggest adding optimization functions and machine learning and deep learning algorithms to this repository. These could go in a separate `machine_learning` folder. Furthe…
-
The [example in the docs](https://nemos.readthedocs.io/en/latest/generated/api_guide/plot_05_batch_glm/) currently uses a custom loop to implement stochastic gradient descent.
An alternative would …
-
In chap8, the code for batch gradient descent is confusing.
```python
for j in range(iterations):
    error, correct_cnt = (0.0, 0)
    for i in range(int(len(images) / batch_size)):
        batc…