-
We are planning a new module to combat spam messages in Odoo. The current plan is to implement a basic backpropagation algorithm [similar to this](http://radimrehurek.com/data_science_…
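For concreteness, a minimal sketch of what such a basic backpropagation loop could look like; the bag-of-words features, network sizes, and hyperparameters below are placeholder assumptions, not the module's actual design.

```python
import numpy as np

# Minimal single-hidden-layer network trained with backpropagation.
# X: (n_samples, n_features) bag-of-words vectors; y: (n_samples, 1) spam labels.
# All data here is random filler just to make the sketch runnable.
rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

X = rng.random((100, 20))
y = (rng.random((100, 1)) > 0.5).astype(float)

W1 = rng.normal(0, 0.1, (20, 8))   # input -> hidden
W2 = rng.normal(0, 0.1, (8, 1))    # hidden -> output
lr = 0.5

for epoch in range(1000):
    # forward pass
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    # backward pass (mean squared error loss)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out / len(X)
    W1 -= lr * X.T @ d_h / len(X)
```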
-
The licence for R6RS-ad seems to be GPL. Is your implementation a clean-room implementation of the papers, or have you directly ported the code? Sadly, the EPL and GPL are incompatible and the Clojure c…
-
We should discuss how the user will build a neural network in Syft and how it will be implemented internally.
Do we want each Layer (Dense, Convolution, ...) to be its own class containing weights and…
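As a starting point for the discussion, one sketch of the layer-as-class option in plain NumPy; the class name and the forward/backward interface are illustrative assumptions, not Syft's actual API.

```python
import numpy as np

# One possible shape for the "each layer is a class" design:
# the layer owns its weights and knows its own forward/backward rules.
class Dense:
    def __init__(self, n_in, n_out):
        self.weights = np.random.randn(n_in, n_out) * 0.1
        self.bias = np.zeros(n_out)

    def forward(self, x):
        self.input = x  # cache for the backward pass
        return x @ self.weights + self.bias

    def backward(self, grad_out, lr):
        grad_in = grad_out @ self.weights.T
        self.weights -= lr * self.input.T @ grad_out
        self.bias -= lr * grad_out.sum(axis=0)
        return grad_in
```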
-
I wanted to make a neural network learn how to count using the numbers 1, 2, 3, 4, 5, 6, 7, 8, 9.
I normalized my inputs to 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9.
Then I wrote a really simp…
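The post is cut off before the code, but here is a guess at the setup being described, assuming the task is "predict the next normalized digit"; the input/target pairing is an assumption.

```python
import numpy as np

# Assumed task: learn the mapping 0.1 -> 0.2, 0.2 -> 0.3, ...
X = np.array([[0.1], [0.2], [0.3], [0.4], [0.5], [0.6], [0.7], [0.8]])
y = np.array([[0.2], [0.3], [0.4], [0.5], [0.6], [0.7], [0.8], [0.9]])

# The mapping is exactly linear (y = x + 0.1), so a single linear unit
# suffices; forcing the output through a saturating sigmoid is a common
# reason such a tiny task fails to converge.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(5000):
    pred = X * w + b
    grad = pred - y                       # gradient of MSE (factor folded into lr)
    w -= lr * float((X * grad).mean())
    b -= lr * float(grad.mean())
print(w, b)  # approaches w=1.0, b=0.1
```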
-
The [CrossEntropyLoss](http://pytorch.org/docs/nn.html?highlight=crossentropyloss#torch.nn.CrossEntropyLoss) class and function use inputs (unscaled probabilities, i.e. logits), targets, and class weights to calcu…
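Typical usage, for reference; the weight values and shapes below are just example data:

```python
import torch
import torch.nn as nn

# Inputs are raw, unnormalized scores (logits) of shape (batch, n_classes);
# targets are class indices of shape (batch,), not one-hot vectors.
loss_fn = nn.CrossEntropyLoss(weight=torch.tensor([1.0, 2.0, 0.5]))

logits = torch.randn(4, 3, requires_grad=True)  # unscaled probabilities
targets = torch.tensor([0, 2, 1, 0])            # class indices

loss = loss_fn(logits, targets)
loss.backward()
```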
-
In the dcgan example, while training the discriminator, why is backward called twice? First it's called on the real images, then on the fake images.
Instead, shouldn't doing something like:
`totalErr…
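A toy check of the two patterns; `netD`, the batches, and the labels below are simplified stand-ins for the dcgan example's variables. Since gradients accumulate in `.grad` across `backward()` calls, the two variants produce identical gradients:

```python
import torch
import torch.nn as nn

netD = nn.Linear(8, 1)
criterion = nn.BCEWithLogitsLoss()

real = torch.randn(4, 8)
fake = torch.randn(4, 8)
ones, zeros = torch.ones(4, 1), torch.zeros(4, 1)

# Pattern from the example: two separate backward calls, gradients accumulate.
netD.zero_grad()
criterion(netD(real), ones).backward()
criterion(netD(fake), zeros).backward()
grads_two_calls = netD.weight.grad.clone()

# Alternative: one backward call on the summed loss.
netD.zero_grad()
total_err = criterion(netD(real), ones) + criterion(netD(fake), zeros)
total_err.backward()

print(torch.allclose(grads_two_calls, netD.weight.grad))  # True
```

One practical reason often given for the two-call form is memory: each `backward()` frees its graph before the next forward pass, so both graphs are never held alive at once.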
-
I have a sequence of Caffe layers, with no loss layer, in a Caffe network. In my Python code, I want to repeatedly take the following steps:
1. Do a forward pass through the network.
2. Compute my own …
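The list is cut off, but for the first two steps the usual pycaffe pattern is to write your own gradient into the top blob's `diff` and then call `backward`; the prototxt, weights file, and blob name below are placeholders:

```python
import caffe
import numpy as np

net = caffe.Net('deploy.prototxt', 'weights.caffemodel', caffe.TRAIN)

# 1. Forward pass through the network.
net.forward()
top = net.blobs['final'].data  # output of the last layer

# 2. Compute your own loss and its gradient w.r.t. the top blob
#    (a dummy MSE against a zero target here, purely illustrative).
target = np.zeros_like(top)
grad = top - target

# 3. Write the gradient into the top blob's diff and backpropagate.
net.blobs['final'].diff[...] = grad
net.backward()
```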
-
I have almost finished converting the backpropagation learner to a multi-class algorithm. As of now, it is a binary learner. Notes had been left in `learning.py` about the issue (approximately lines 545…
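For context, the standard move from a binary to a multi-class learner is one output unit per class with a softmax over one-hot targets; the helper names below are illustrative, not the actual `learning.py` code.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def one_hot(labels, n_classes):
    out = np.zeros((len(labels), n_classes))
    out[np.arange(len(labels)), labels] = 1.0
    return out

# With softmax + cross-entropy, the output-layer gradient keeps the same
# simple (prediction - target) form used in the binary case.
logits = np.array([[2.0, 0.5, -1.0]])
targets = one_hot([0], 3)
grad_output = softmax(logits) - targets
```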
-
I'd started reviewing the MLP class, and there are a couple of things that I need clarification/advice on:
- I was thinking of adding the proper parameters to the back-propagation algorithms (for inst…
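The item is truncated, but presumably the parameters in question are the usual training hyperparameters. A possible shape for such an interface; every name here is illustrative, and `network.gradients` is a hypothetical helper:

```python
def backprop(network, X, y, learning_rate=0.01, momentum=0.9, epochs=100):
    """Train `network` in place, exposing the usual hyperparameters."""
    velocity = [0.0 for _ in network.weights]
    for _ in range(epochs):
        grads = network.gradients(X, y)  # hypothetical helper
        for i, g in enumerate(grads):
            velocity[i] = momentum * velocity[i] - learning_rate * g
            network.weights[i] += velocity[i]
```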
-
Thanks, Joao, for this awesome repository :)
Autonn saves all variables during network evaluation, including those in layers that do not need the forward-pass values for derivative computation, like reshape, r…
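To illustrate why such values need not be stored: for a reshape, the backward pass only needs the input's shape, not its contents (a minimal sketch, not autonn code):

```python
import numpy as np

# Reshape's derivative depends only on the original shape, so the full
# input activation never needs to be kept around.
def reshape_forward(x, new_shape):
    return x.reshape(new_shape), x.shape  # save just the shape

def reshape_backward(grad_out, saved_shape):
    return grad_out.reshape(saved_shape)

y, saved = reshape_forward(np.ones((2, 3)), (3, 2))
grad_in = reshape_backward(np.ones((3, 2)), saved)  # shape (2, 3)
```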