-
https://github.com/lanl-ansi/inverse_ising/blob/42291444bd9ac935c958de7a3e6a356fa9886464/Inverse_Ising.jl#L84
High risk of underflow/overflow from chaining logs, sums and exponentials. I don't know…
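For what it's worth, the usual guard against this is the log-sum-exp trick: shift by the maximum before exponentiating, so no term can overflow. A minimal sketch (not tied to the linked code; the function name is illustrative):

```python
import math

def logsumexp(xs):
    """Numerically stable log(sum(exp(x) for x in xs)).

    Shifting by max(xs) keeps every exponent <= 0, so math.exp cannot
    overflow, and the largest term contributes exactly 1.0 to the sum.
    """
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

# The naive math.log(sum(math.exp(x) for x in xs)) raises OverflowError here:
print(logsumexp([1000.0, 1000.0]))  # ≈ 1000.6931 (= 1000 + ln 2)
```

SciPy ships the same idea as `scipy.special.logsumexp` if a dependency is acceptable.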
-
There are a lot of extra tools that `DifferentialEquations.jl` provides that I want to be able to use (e.g. sensitivity analysis and optimal control, which I would prefer to do directly, instead of ha…
-
Hello,
I am getting this error when I run the `butterfly_multiply` module:
```
ModuleNotFoundError  Traceback (most recent call last)
in
      4 from torch import nn
      5
---…
```
-
I'm interested in trying to solve some nonlinear BVPs with undetermined parameters, but as of now I can't even set up the system using ApproxFun. The equations are
0 = u'' + c u' + f(u,v)
0 = …
-
Hi all,
Could you also release the following tutorials?
```
Denoising Diffusion Probabilistic Models
Coding Autodifferentiation (From Scratch)
Introduction to JAX, Neural Differential Equat…
```
-
In this [comment](https://github.com/google-deepmind/optax/issues/977#issuecomment-2441884034), optimistix is mentioned as improving performance by ensuring the objective function is compiled only onc…
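I don't know optimistix's internals, but the compile-once idea itself is easy to sketch: cache the expensive trace/compile step keyed by the objective, so repeated solver iterations reuse it instead of recompiling. A toy pure-Python illustration (the names, the "tracing" stand-in, and the finite-difference gradient are all purely for demonstration, not optimistix's API):

```python
from functools import lru_cache

trace_count = 0  # counts how many times the objective gets "traced"

@lru_cache(maxsize=None)          # cache keyed by the objective function's identity
def compiled(objective):
    """Stand-in for tracing/compilation: expensive, so do it once per objective."""
    global trace_count
    trace_count += 1
    return objective              # a real framework would return compiled code

def solver_step(objective, x, lr=0.1):
    f = compiled(objective)       # cache hit on every call after the first
    g = (f(x + 1e-6) - f(x - 1e-6)) / 2e-6   # central-difference gradient
    return x - lr * g

def quadratic(x):
    return (x - 3.0) ** 2

x = 0.0
for _ in range(100):              # 100 solver iterations, but only one "trace"
    x = solver_step(quadratic, x)
print(round(x, 3), trace_count)   # x converges to 3.0; trace_count stays 1
```

The point is only the caching pattern: the per-iteration work never pays the compilation cost again.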
-
Recently, there has been growing interest in automatic differentiation tools for adjoint modelling. Some popular projects in the Python and Julia worlds are [JAX](https://github.com) and
[Jul…
-
It would be very nice to have support for AD in `Base.expm`. I would like to contribute to it, but since I am unfamiliar with the AD architecture in Julia I would appreciate some advice.
For Freche…
tpapp updated 2 years ago
-
Hello, thank you for your code. I am working on brain segmentation, which has 4 classes: WM, GM, CSF, and background. You are using Dice as the loss function to maximize. That is a very good idea, but it may…
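The repository's loss isn't shown here, but for the 4-class case a common extension is a soft Dice score computed per class and macro-averaged; `1 - mean_dice` then serves as the loss. A minimal pure-Python sketch (the class numbering is illustrative):

```python
def dice_per_class(pred, target, cls, eps=1e-7):
    """Dice coefficient for one class from flat lists of integer labels.

    `cls` selects the class, e.g. 0=background, 1=WM, 2=GM, 3=CSF;
    eps keeps the ratio defined when a class is absent from both masks.
    """
    p = [1.0 if v == cls else 0.0 for v in pred]
    t = [1.0 if v == cls else 0.0 for v in target]
    intersection = sum(a * b for a, b in zip(p, t))
    return (2.0 * intersection + eps) / (sum(p) + sum(t) + eps)

def mean_dice(pred, target, num_classes=4):
    """Macro-averaged Dice over all classes; 1 - mean_dice is a usable loss."""
    return sum(dice_per_class(pred, target, c)
               for c in range(num_classes)) / num_classes

labels = [0, 1, 2, 3, 1, 2]
print(mean_dice(labels, labels))   # perfect prediction → 1.0
```

Averaging per class keeps the small foreground classes from being swamped by the large background class, which is the usual concern with Dice on this kind of data.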
-
### Feature details
For good reason, the autograd features of PennyLane don't work with circuits that conclude with `qml.sample()`. However, there are many circumstances wherein one would want to d…