-
Hi, just a suggestion from a passer-by: I am interested in understanding the advantages of this library over raw mxnet and would love to see more discussion of them. Auto-differentiation definitely see…
-
Our attention module is essentially standard multi-head attention, just with arbitrarily many bias tensors. It’s approximately as follows:
```python
a = torch.matmul(q, k)
for b in biases:
    a += …
```
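A self-contained NumPy sketch of the same idea — attention scores accumulated with an arbitrary list of additive bias tensors before the softmax. The scaling, softmax placement, and function names here are illustrative assumptions, not the module's exact code:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_with_biases(q, k, v, biases):
    # q, k, v: (..., L, d). Scores have shape (..., L_q, L_k).
    a = q @ k.swapaxes(-1, -2) / np.sqrt(q.shape[-1])
    for b in biases:
        # Each bias tensor just needs to broadcast against the score tensor,
        # so any number of biases of compatible shape can be added.
        a = a + b
    return softmax(a) @ v
```

The key point is that the biases enter additively before the softmax, so relative-position terms, masks, or pairwise features can all be folded in through the same loop.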
-
I'm a newbie with the autodiff library, and with automatic differentiation more generally. I was trying to implement gradient computation for a simple Gauss-Seidel-type solver. It runs fine …
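For context, this is a minimal NumPy sketch of the kind of Gauss-Seidel sweep being differentiated — a generic textbook version, not the exact code in question:

```python
import numpy as np

def gauss_seidel(A, b, x0=None, iters=50):
    # Solve A x = b by Gauss-Seidel sweeps: each component x[i] is updated
    # in place using the most recent values of the other components.
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    for _ in range(iters):
        for i in range(n):
            s = A[i] @ x - A[i, i] * x[i]  # sum over j != i
            x[i] = (b[i] - s) / A[i, i]
    return x
```

Differentiating such a solver means propagating derivatives through every in-place update, which is exactly where iterative solvers tend to trip up autodiff tools.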
-
A snippet of code to test auto-differentiation across devices fails on my machine and I don't understand why. The code is copied from the test_cross_device_autograd function in [test_operator_gpu.py]…
-
### Version
5.0.0-beta.3
### Environment info
```
Environment Info:
System:
OS: macOS 11.5.2
CPU: (8) x64 Intel(R) Core(TM) i5-8279U CPU @ 2.40GHz
Binaries:
No…
-
So to run moving mesh problems with perfect Jacobians in MOOSE I've developed methods like [`Assembly::computeSinglePointMapAD`](https://github.com/idaholab/moose/blob/next/framework/src/base/Assembly…
-
## Proposal
HoloPy version 4 should rely on [`pymc`](https://www.pymc.io/welcome.html) for inference.
## Rationale
When we first included Bayesian inference in HoloPy, the options for efficient …
-
**Flexibility and performance improvements**
The current implementation of projection optimization consists of two layers.
The optimization layer (`\optimizers\*.py`) is implemented with numpy on CP…
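The two-layer split described above can be sketched as a NumPy optimization loop driving a swappable projection routine. The function names and the choice of a simplex projection are illustrative assumptions, not the library's actual API:

```python
import numpy as np

def project_simplex(x):
    # Euclidean projection onto the probability simplex (hypothetical
    # stand-in for the projection layer).
    u = np.sort(x)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(x) + 1) > (css - 1))[0][-1]
    theta = (css[rho] - 1) / (rho + 1)
    return np.maximum(x - theta, 0.0)

def optimize(grad, x0, project, lr=0.1, steps=100):
    # Optimization layer: plain projected gradient descent in NumPy.
    # The projection is injected, so it could be swapped for a GPU backend.
    x = x0.copy()
    for _ in range(steps):
        x = project(x - lr * grad(x))
    return x
```

Separating the loop from the projection is what makes it possible to replace either layer (e.g., moving the projection to a GPU backend) without touching the other.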
-
**Minimal example**: Compile the following code with Clang++ with the clad plugin:
```c++
#include
#include
#include "clad/Differentiator/Differentiator.h"
double f(double x){
double y;
…
-
For the direct usage of `Fun`, stiff methods can't be used because the Jacobians cannot be calculated.
```julia
@time sol=solve(prob,Rosenbrock23(),dt=1/40, adaptive=false, tstops=tstops)
Met…