-
```
def train():
    """Train CIFAR-10 for a number of steps."""
    with tf.Graph().as_default(), tf.device('/cpu:0'):
        # Create a variable to count the number of train() calls. This equals the
        # …
```
-
At the moment we use plain ("vanilla") gradient descent on the error surface given by the respective error function.
The question now is: which method is best to use?
A candidate is **Stochastic Gradient …
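For concreteness, here is a minimal NumPy sketch (toy data and learning rate are made up) contrasting full-batch gradient descent, which uses the gradient over all samples per update, with stochastic gradient descent, which updates on one sample at a time:

```
import numpy as np

# Toy linear-regression objective: L(w) = mean((X @ w - y)**2)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
true_w = rng.normal(size=5)
y = X @ true_w + 0.01 * rng.normal(size=200)

def grad(w, Xb, yb):
    # Gradient of the mean squared error on the batch (Xb, yb)
    return 2.0 * Xb.T @ (Xb @ w - yb) / len(yb)

lr = 0.02
w_batch = np.zeros(5)   # full-batch gradient descent
w_sgd = np.zeros(5)     # stochastic gradient descent

for epoch in range(100):
    # Full batch: one update per pass over the whole data set
    w_batch -= lr * grad(w_batch, X, y)

    # Stochastic: one update per (shuffled) sample
    for i in rng.permutation(len(y)):
        w_sgd -= lr * grad(w_sgd, X[i:i+1], y[i:i+1])

print("batch GD error:", np.linalg.norm(w_batch - true_w))
print("SGD error:     ", np.linalg.norm(w_sgd - true_w))
```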
-
For model training and execution, several design patterns are effective in managing workflows, code structure, and flexibility. Here are a few common ones used in machine learning and data processing …
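One common example is the Strategy pattern: the training loop is written against a small interface so that components such as the update rule can be swapped without touching the loop itself. A minimal sketch, with all names made up for illustration:

```
from dataclasses import dataclass
from typing import Callable, Protocol

class UpdateRule(Protocol):
    """Strategy interface: any update rule the training loop can call."""
    def step(self, params: list, grads: list) -> list: ...

@dataclass
class PlainSGD:
    lr: float = 0.01
    def step(self, params, grads):
        return [p - self.lr * g for p, g in zip(params, grads)]

def train(params, grad_fn: Callable, rule: UpdateRule, steps: int = 100):
    # The loop depends only on the UpdateRule interface, not on a concrete optimizer.
    for _ in range(steps):
        params = rule.step(params, grad_fn(params))
    return params

# Usage: minimize f(x) = (x - 3)^2 with the plain-SGD strategy.
print(train([0.0], lambda p: [2 * (p[0] - 3)], PlainSGD(lr=0.1)))  # ~[3.0]
```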
-
## 🚀 Feature
It would be great to have a library of optimization routines for the deterministic setting (a la `scipy.optimize`) using PyTorch autograd mechanics. I have written a [prototype library](…
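A rough sketch of the general idea (not the linked prototype, whose internals I have not reproduced): let torch autograd supply values and gradients to a `scipy.optimize`-style routine.

```
import numpy as np
import torch
from scipy.optimize import minimize

def rosenbrock(x: torch.Tensor) -> torch.Tensor:
    # Classic deterministic test function
    return torch.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

def objective(x_np: np.ndarray):
    # Return (value, gradient) as NumPy, with the gradient from autograd
    x = torch.tensor(x_np, dtype=torch.float64, requires_grad=True)
    value = rosenbrock(x)
    value.backward()
    return value.item(), x.grad.numpy()

result = minimize(objective, np.zeros(5), jac=True, method="L-BFGS-B")
print(result.x)  # approaches the all-ones minimizer
```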
-
Hi! I'm a student learning CS285 online. Thank you for your great and generous work!
While doing homework1 and running the same code on two different machines, one Linux and one Windows, I got t…
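If the two machines give different results, fixing every random seed is usually the first thing to try (assuming the PyTorch-based homework code); a minimal sketch, noting that even this does not guarantee bit-identical numbers across operating systems:

```
import os
import random

import numpy as np
import torch

def set_global_seeds(seed: int = 0) -> None:
    """Fix the common sources of randomness; names and scope are illustrative."""
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    # PYTHONHASHSEED only takes effect if set before the interpreter starts
    os.environ["PYTHONHASHSEED"] = str(seed)
    # Deterministic cuDNN kernels (slower, still not a cross-OS guarantee)
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False

set_global_seeds(0)
```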
-
Hi, I am getting the following error when running the mnist example via fpm
```
$ fpm run --example mnist --profile debug --compiler ifort
mod_constants.f90 done.
mod_random…
```
-
Hi @mari-linhares , thanks for the repo!
We are building on your code to implement a bit more general version of MAML that includes a batch of tasks within the inner loop and several steps of gradien…
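For reference, here is a minimal PyTorch-style sketch of that generalization (illustrative names only, not the actual code in this repo, and it assumes a `loss_fn(weights, data)` that runs the model with explicit weights): each task in the batch gets several inner gradient steps on fast weights, and the meta-update differentiates through them.

```
import torch

def inner_adapt(params, loss_fn, task_data, inner_lr=0.01, inner_steps=5):
    # Several gradient steps on task-specific fast weights
    fast = [p.clone() for p in params]
    for _ in range(inner_steps):
        loss = loss_fn(fast, task_data["train"])
        # create_graph=True keeps the graph so the outer update can
        # differentiate through the inner steps (second-order MAML)
        grads = torch.autograd.grad(loss, fast, create_graph=True)
        fast = [w - inner_lr * g for w, g in zip(fast, grads)]
    return fast

def meta_step(params, loss_fn, task_batch, meta_opt):
    # meta_opt is assumed to be an optimizer over `params`
    meta_opt.zero_grad()
    meta_loss = 0.0
    for task_data in task_batch:
        fast = inner_adapt(params, loss_fn, task_data)
        meta_loss = meta_loss + loss_fn(fast, task_data["test"])
    meta_loss = meta_loss / len(task_batch)
    meta_loss.backward()
    meta_opt.step()
    return meta_loss.item()
```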
-
### Feature details
Would the dev team be interested in implementing [Quantum gradient descent](https://arxiv.org/pdf/1612.01789.pdf) optimization for the TensorFlow backend?
### Implementation
An impl…
-
I've pretty much copy-pasted the code [here](http://tflearn.org/tutorials/quickstart.html#source-code) and I'm still getting an error. The log is listed below:
```
IndexError …
```