-
This is really a nit-picky detail, but there's no shebang (`#!/usr/bin/env python3`) at the beginning of ode_demo.py. Also, for some reason the graphics are not being generated. I removed the savefig a…
-
-
- [ ] AI types
- [ ] Heuristics
- [ ] Search algorithm
-
Thank you for your work and for generously releasing the code.
In the Google Colab sample, the accuracy of the model was about 60%. I was wondering whether we could continue to train the predicted model and ac…
-
I noticed your threshold version:
trelu(x, theta=1) = x > theta ? x : 0
and I certainly like the simplicity of it and of ReLU.
I'm sure you're open to new functions (I could provide a PR), but …
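The thresholded variant above maps directly to a one-liner; here is a minimal NumPy sketch (the function name `trelu` and default `theta=1` are taken from the pseudocode above, everything else is my own illustration):

```python
import numpy as np

def trelu(x, theta=1.0):
    # thresholded ReLU: pass x through unchanged where x > theta, else 0
    return np.where(x > theta, x, 0.0)

x = np.array([-2.0, 0.5, 1.0, 3.0])
print(trelu(x))  # [0. 0. 0. 3.]  (1.0 is zeroed because the comparison is strict)
```

Note that with `theta = 0` this reduces to the ordinary ReLU, which is where the shared simplicity comes from.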
-
@cbattista, I ran the image data through a heavily simplified Boltzmann NN model (not a deep network), unsupervised. The accuracy is around 60%. I then recreated the images from the output data.
Below is …
-
1. How many nodes do we need to create a neural network?
2. What kind of activation function do we have to use (e.g. sigmoid, tanh, ReLU)?
3. What is the exact process of backpropagation?
4. What is regre…
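Questions 2 and 3 can be illustrated with a minimal sketch: one sigmoid neuron trained on a single example with a squared-error loss, updated by hand via the chain rule (all names and values here are my own illustrative choices, not from this thread):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

w, b = 0.5, 0.0      # initial weight and bias (illustrative values)
x, y = 1.0, 1.0      # a single training example
lr = 0.1             # learning rate

# forward pass
z = w * x + b
a = sigmoid(z)
loss_before = (a - y) ** 2

# backward pass (chain rule), then one gradient-descent step
dL_da = 2 * (a - y)
da_dz = a * (1 - a)          # derivative of the sigmoid
dL_dz = dL_da * da_dz
w -= lr * dL_dz * x
b -= lr * dL_dz

loss_after = (sigmoid(w * x + b) - y) ** 2
print(loss_before, loss_after)  # the loss decreases after the update
```

Backpropagation in a real network is this same chain-rule bookkeeping applied layer by layer, with the activation's derivative (sigmoid, tanh, or ReLU) appearing at each layer.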
-
Hello!
Thank you for releasing this amazing codebase!
I wanted to try enforcing a surface-normal loss on an RGB-D dataset. To do this, I did the following:
1. I generated normals from the Omnidata model,…
-
### **Optimization 1**
**Searched combinations:**
- `n_hidden_layers` = 2 or 3
- `neurons_in_layer` = 32 or 64
- `dropout` = 0 or 0.2
- `activation_function` = _relu_, _sigmoid_, or _selu_
- `model_optimizer` = …
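The combinations above can be enumerated with a small grid-search sketch (the training/scoring code is omitted, and `model_optimizer` is left out because its values are truncated here):

```python
from itertools import product

# search space copied from the list above (model_optimizer omitted)
search_space = {
    "n_hidden_layers": [2, 3],
    "neurons_in_layer": [32, 64],
    "dropout": [0.0, 0.2],
    "activation_function": ["relu", "sigmoid", "selu"],
}

# Cartesian product of all option lists -> one dict per configuration
configs = [dict(zip(search_space, values))
           for values in product(*search_space.values())]
print(len(configs))  # 2 * 2 * 2 * 3 = 24 combinations
```

Each dict in `configs` would then be passed to whatever builds and evaluates one model, with the best validation score kept.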
-
One low-hanging fruit in our project is hotspot classification, i.e. taking the hotspot detections produced by their algorithm and deciding which ones actually correspond to a seal and which do not. …
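As a starting point, the seal / not-seal decision is just binary classification over per-hotspot features. A minimal sketch using plain logistic regression trained by gradient descent (every feature and label below is a synthetic placeholder, not output from the actual detection pipeline):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))                     # 4 made-up features per hotspot
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)   # synthetic seal / not-seal labels

w = np.zeros(4)
b = 0.0
lr = 0.1
for _ in range(500):
    # logistic regression via batch gradient descent
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= lr * (X.T @ (p - y)) / len(y)
    b -= lr * np.mean(p - y)

p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
acc = np.mean((p > 0.5) == y)
print(f"training accuracy: {acc:.2f}")
```

In practice the features would come from the hotspot detector's output (crops, intensity statistics, detection scores), and a held-out split would be needed to judge the classifier honestly.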