-
```
What steps will reproduce the problem?
1. How to differentiate a numeric log expression in symja?
2. Do you support equations like [log(base10)[x]] and [log(base a)[x]]?
3.
What is the expected ou…
```
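The mathematics behind the question is the change-of-base rule, log_a(x) = ln(x)/ln(a), which gives d/dx log_a(x) = 1/(x·ln a) for any base a. A minimal, library-independent sketch (plain Python, not symja) checking that identity with a central finite difference:

```python
import math

def log_base(a, x):
    # Change-of-base rule: log_a(x) = ln(x) / ln(a)
    return math.log(x) / math.log(a)

def numeric_derivative(f, x, h=1e-6):
    # Second-order central finite difference
    return (f(x + h) - f(x - h)) / (2 * h)

# Analytically, d/dx log_a(x) = 1 / (x * ln(a))
a, x = 10.0, 3.0
approx = numeric_derivative(lambda t: log_base(a, t), x)
exact = 1.0 / (x * math.log(a))
print(abs(approx - exact))  # should be close to zero
```

The same rule means a symbolic engine only needs the natural-log derivative plus a constant factor 1/ln(a) to cover arbitrary bases.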
-
- [ ] Allow `map` and `gradient` arguments to be specified as `purrr`-style formulas.
- [x] Allow the user to flag the `gradient` as "linear", in which case its (constant) value will be computed by c…
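The "linear" flag in the second item rests on the observation that a linear map has a constant gradient, so it can be evaluated once and cached. A hedged sketch of that idea (all names here are hypothetical, not the package's actual API):

```python
# Sketch (hypothetical names): when the gradient is flagged "linear",
# the Jacobian is constant, so compute it once and reuse it everywhere.

def finite_diff_gradient(f, x, h=1e-6):
    # Central-difference gradient of scalar f at point x (list of floats).
    grad = []
    for i in range(len(x)):
        up = list(x); up[i] += h
        dn = list(x); dn[i] -= h
        grad.append((f(up) - f(dn)) / (2 * h))
    return grad

def make_gradient(f, n, linear=False):
    if linear:
        # Constant gradient: evaluate once (here at the origin) and reuse.
        const = finite_diff_gradient(f, [0.0] * n)
        return lambda x: const
    return lambda x: finite_diff_gradient(f, x)

f = lambda x: 2.0 * x[0] - 3.0 * x[1]   # linear map, gradient is (2, -3)
g = make_gradient(f, 2, linear=True)
print(g([5.0, 7.0]))  # same constant gradient at any point
```

Caching the constant gradient trades one up-front evaluation for skipping every later finite-difference pass.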
-
Most of the module still requires unit tests. At the moment I am using this to list features that need tests and that I might otherwise forget about:
1. The numerical differentiation routines in the default lit…
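For the numerical differentiation routines mentioned above, a typical unit test compares a finite-difference result against a known closed-form derivative. A minimal sketch with Python's standard `unittest` (the routine here is a stand-in, not the module's actual implementation):

```python
import math
import unittest

def central_diff(f, x, h=1e-5):
    # Second-order accurate central difference.
    return (f(x + h) - f(x - h)) / (2 * h)

class TestNumericalDifferentiation(unittest.TestCase):
    def test_sin_derivative(self):
        # d/dx sin(x) = cos(x)
        self.assertAlmostEqual(central_diff(math.sin, 1.0), math.cos(1.0), places=8)

    def test_exp_derivative(self):
        # d/dx exp(x) = exp(x)
        self.assertAlmostEqual(central_diff(math.exp, 0.5), math.exp(0.5), places=8)

# run with: python -m unittest <this_file>
```

With h = 1e-5 the truncation error of the central difference is on the order of h², so `places=8` is a safe tolerance.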
-
Complex backpropagation algorithms and formulas:
https://giggleliu.github.io/complex_bp/ (link 1)
see https://github.com/GiggleLiu/poorman_nn for a Python implementation.
- [ ] complex num…
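One common convention in complex backpropagation (a Wirtinger-calculus convention; this sketch is an assumption about what the links above describe, not taken from them): for a real-valued loss L(z) of a complex parameter z = x + iy, the real gradient pair (∂L/∂x, ∂L/∂y) packs into the single complex number 2·∂L/∂z̄. A finite-difference check for L(z) = |z|², where ∂L/∂z̄ = z:

```python
def loss(z):
    # L(z) = |z|^2 = z * conj(z); here dL/dz_bar = z, so the packed gradient is 2z.
    return (z * z.conjugate()).real

def numeric_grad(L, z, h=1e-6):
    # Finite-difference derivatives along the real and imaginary axes,
    # packed back into one complex number (Re -> d/dx, Im -> d/dy).
    dx = (L(z + h) - L(z - h)) / (2 * h)
    dy = (L(z + 1j * h) - L(z - 1j * h)) / (2 * h)
    return complex(dx, dy)

z = 1.5 - 0.5j
print(abs(numeric_grad(loss, z) - 2 * z))  # should be close to zero
```

This is the identity a complex-number gradient check in a test suite would typically verify.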
-
For example, I want to compare the performance of neural networks with different numbers of hidden layers. Instead of defining neural networks at compile time, can I create neural networks at run time…
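The general pattern behind the question, independent of any particular framework, is to build the layer stack from a run-time list of sizes rather than a fixed definition. A minimal stdlib-only sketch (the framework being asked about is not shown here):

```python
import math
import random

def build_mlp(layer_sizes, seed=0):
    # Build weight matrices at run time from a list of layer sizes,
    # e.g. [4, 8, 8, 2] gives two hidden layers of width 8.
    rng = random.Random(seed)
    weights = []
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        weights.append([[rng.uniform(-0.1, 0.1) for _ in range(n_in)]
                        for _ in range(n_out)])
    return weights

def forward(weights, x):
    # Plain forward pass with tanh activations.
    for layer in weights:
        x = [math.tanh(sum(w * xi for w, xi in zip(row, x))) for row in layer]
    return x

# Depth is chosen at run time, not compile time:
for hidden in ([16], [16, 16], [16, 16, 16]):
    net = build_mlp([4] + hidden + [2])
    print(len(net), len(forward(net, [0.1, 0.2, 0.3, 0.4])))
```

Sweeping over hidden-layer counts then becomes an ordinary loop over configurations.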