Closed: HugoGranstrom closed this 1 year ago
I've (finally) finished the drafts of 4 articles now. All feedback is appreciated :D
On the ODE article, I think I will remove the comparisons between all the methods and instead just pick a few. They slow down the build time without adding much value.
Nice, I'll see if I can take a look tomorrow or Wednesday
Yup, I will also give it a read soon!
Hi, nice work. Started reading too!
- Remarks on curve fitting:
- For ODE:
- For optimization: `tol`
- On interpolation:
All in all, very nice work. I think the examples are all very good at showing off the libraries. Great job!
> Hi, nice work. Started reading too!
Many thanks for the feedback :smile: Most of them have been fixed now.
> noted that `randomTensor` actually is not Gaussian noise but uniform from 0 to max
Good catch! I opted for just using uniform noise instead, since I couldn't find any builtin Gaussian noise constructor in Arraymancer.
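If Gaussian noise is ever needed anyway, a rough sketch (my own, not from the article) is to sample with `gauss` from `std/random` and convert the samples to a tensor; the sample count and sigma below are made up for illustration:

```nim
# Sketch: Gaussian noise via std/random's gauss, converted to an Arraymancer Tensor.
import std/[random, sequtils]
import arraymancer

randomize(1337)  # fixed seed so the noise is reproducible

let n = 100      # arbitrary number of samples
let noise = newSeqWith(n, gauss(mu = 0.0, sigma = 0.1)).toTensor
echo noise.shape # -> [100]
```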
> in the last benchy run the headers are missing, not sure why...
I think `benchy` only puts the header on the first call in the file; otherwise it would insert the header before every benchmark. It is controlled by a non-exported variable, so I don't think there is much we can do right now, sadly.
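For anyone not familiar with benchy, this is the behaviour in a plain Nim file as well; a tiny sketch with made-up workloads:

```nim
# Sketch: benchy prints the result-table header once, before the first timeIt,
# and later timeIt calls in the same run only append rows.
import benchy
import std/math

timeIt "sin":   # header is printed here
  keep sin(0.5)

timeIt "cos":   # only a new row, no second header
  keep cos(0.5)
```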
> you apply 3 methods, are all the methods available in numericalnim? A link to their Wikipedia pages could be useful (at least to me, I did not know about L-BFGS)
I have added links to most of the methods (except ODE, because that list is autogenerated). It's a good point that readers might want to read up on things themselves as well.
> on the analytical gradient you mention it improves the timing, but you do not check the time when applying it to L-BFGS; you could run benchy for that too
The problem is so small that the difference between the numerical and analytical gradients isn't too big. It's basically 4 calls to `f` for the numerical one. I ran a benchmark now and got 3.3 ms vs 3.1 ms. It's when we start adding many more parameters that the difference should start to show. And I'm not tempted to add such an example here just to prove a point :sweat_smile:
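To make the "4 calls to `f`" point concrete, here is a standalone sketch (not the article's code, and not using numericalnim's API): a central-difference gradient of a made-up 2-parameter function costs 4 evaluations of `f` per gradient, timed against its analytical gradient with benchy.

```nim
# Sketch: cost of a numerical vs analytical gradient for a tiny 2-parameter function.
import benchy

proc f(a, b: float): float =
  (a - 1.0) * (a - 1.0) + 10.0 * b * b

proc numericalGrad(a, b: float, h = 1e-6): (float, float) =
  # Central differences: 2 evaluations of f per parameter, i.e. 4 calls here.
  ((f(a + h, b) - f(a - h, b)) / (2.0 * h),
   (f(a, b + h) - f(a, b - h)) / (2.0 * h))

proc analyticGrad(a, b: float): (float, float) =
  (2.0 * (a - 1.0), 20.0 * b)

timeIt "numerical gradient":
  keep numericalGrad(1.5, -0.5)

timeIt "analytic gradient":
  keep analyticGrad(1.5, -0.5)
```

On a function this cheap the two rows end up close; the gap only grows once each call to `f` is expensive or the parameter count is large.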
> is there a reference API page in numericalnim that could be linked? I guess this could be useful also for the other pages
Not currently, but I should really get `nim doc` documentation added to the CI. I tried it locally now and it's so simple :o
> to check the difference on 2D functions, maybe you could plot a heat map of the error? Comparing two heat maps is hard. Also, it could be nice to plot the grid points
Very good point, and the heatmap turned out rather pretty as well :) You can really see the higher-error areas between the grid points.
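For reference, a rough sketch (my own, with dummy error values rather than the article's actual interpolation error) of how such a heatmap can be drawn with ggplotnim:

```nim
# Sketch: heatmap of an absolute error over a 2D grid with ggplotnim.
# xs/ys are the evaluation grid and errs are placeholder error values.
import ggplotnim
import std/math

var xs, ys, errs: seq[float]
for i in 0 ..< 50:
  for j in 0 ..< 50:
    let x = i.float / 49.0
    let y = j.float / 49.0
    xs.add x
    ys.add y
    errs.add abs(sin(10.0 * x) * cos(10.0 * y)) * 1e-3  # dummy error surface

let df = toDf({"x": xs, "y": ys, "error": errs})
ggplot(df, aes("x", "y", fill = "error")) +
  geom_raster() +
  ggsave("error_heatmap.png")
# An extra point layer with the interpolation grid points could be drawn on top.
```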
@pietroppeter
Missed this question:
> not sure though if it [interpolation] can be used in optimization, does it extrapolate also?
It does not extrapolate, and I'm not sure it would have helped even if it did. Extrapolations are often either constant or linear, and neither is really great for optimization: the first has zero derivative and the second goes on forever. What could make interpolation useful in optimization, though, is if we could add a bounding box to the optimization routine so that it stops when it reaches the edge and only walks along it instead of crossing it.
Also, I have finally set up docs for numericalnim and have added links to them in Further reading.
I think I have taken almost all feedback into consideration now. So I'll read through it a few times tomorrow and merge this in the evening unless I or someone else finds any important errors.
Here we go, fingers crossed for no build errors! :crossed_fingers:
It's about time I at least write some tutorials for `numericalnim`'s functionalities. Plus, it would allow me to clean up my massive README and just link it here :sunglasses: