JuliaManifolds / ManifoldDiff.jl

Differentiation on manifolds
https://juliamanifolds.github.io/ManifoldDiff.jl/
MIT License

Some consistency changes to subgrad_distance #25

Closed · hajg-ijk closed this 1 year ago

hajg-ijk commented 1 year ago

Changes the default exponent used by `subgrad_distance` to 2, to mimic the behavior of `grad_distance`, and changes the default `atol` to 0. Finally, for `p == q` the function now returns a random tangent vector that is normalized and then rescaled by `rand()`, since any tangent vector of norm at most one is a subgradient of the distance at `p == q`. This should bring the result closer to a sample from the normal cone.
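To illustrate the `p == q` case described above, here is a minimal sketch of the intended behavior, assuming a Manifolds.jl-style API (`distance`, `log`, `norm`, `rand(M; vector_at=p)`). The function name `subgrad_distance_sketch` and the sphere example are illustrative only, not the package's actual implementation.

```julia
using Manifolds

# Sketch: subgradient of f(p) = d(p, q)^c / c.
# At p == q (within atol) any tangent vector of norm ≤ 1 is a subgradient,
# so a random tangent vector is normalized and rescaled by rand().
function subgrad_distance_sketch(M, q, p, c::Int=2; atol=0)
    if distance(M, p, q) <= atol
        X = rand(M; vector_at=p)                # random tangent vector at p
        nX = norm(M, p, X)
        return nX > 0 ? (rand() / nX) * X : X   # resulting norm is rand() ≤ 1
    end
    # Away from q the function is smooth with gradient -d(p, q)^(c-2) * log_p(q).
    return -distance(M, p, q)^(c - 2) * log(M, p, q)
end

# Example on the 2-sphere: at p == q the result is a tangent vector of norm ≤ 1.
M = Sphere(2)
p = [1.0, 0.0, 0.0]
X = subgrad_distance_sketch(M, p, p)
```

Rescaling by `rand()` (rather than always returning a unit vector) means the returned vector can land anywhere in the unit ball of the tangent space, which is what makes it behave more like a sample from the subdifferential at `p == q`.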

codecov[bot] commented 1 year ago

Codecov Report

Merging #25 (2e2db8c) into main (bea7996) will increase coverage by 0.05%. The diff coverage is 100.00%.

:exclamation: Current head 2e2db8c differs from pull request most recent head b2e09e5. Consider uploading reports for the commit b2e09e5 to get more accurate results

@@            Coverage Diff             @@
##             main      #25      +/-   ##
==========================================
+ Coverage   95.02%   95.08%   +0.05%     
==========================================
  Files          20       15       -5     
  Lines         342      305      -37     
==========================================
- Hits          325      290      -35     
+ Misses         17       15       -2     
| Impacted Files | Coverage Δ |
| --- | --- |
| src/subgradients.jl | 100.00% <100.00%> (ø) |

... and 6 files with indirect coverage changes
