ITensor / NDTensors.jl

A Julia package for n-dimensional sparse tensors.
Apache License 2.0

setstorage functions #72

Closed · emstoudenmire closed this 3 years ago

emstoudenmire commented 3 years ago

Fixes #71

emstoudenmire commented 3 years ago

When I use this updated code with ITensors.jl, calling real(T) and imag(T) on an ITensor T now works correctly for the dense, diag, and block sparse cases. Before this PR, the block sparse case ended up with extra zero-valued blocks being allocated and added to the storage.

mtfishman commented 3 years ago

Thanks, I think this pattern of using setstorage and setdata will clean up the code a lot and make it easier to write code that is generic across storage types.
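The pattern discussed above can be sketched with a hypothetical simplified storage type (these are illustrative names, not NDTensors.jl's actual definitions): the storage keeps its structural metadata, and `setdata` swaps out only the flat data vector, so structure-preserving maps never allocate new blocks.

```julia
# Hypothetical simplified block sparse storage (not NDTensors.jl's real type):
# structural metadata lives alongside a flat vector of stored elements.
struct SketchBlockSparse{T}
    data::Vector{T}
    blockoffsets::Vector{Int}  # stand-in for the real block bookkeeping
end

data(S::SketchBlockSparse) = S.data

# Rebuild the storage with new data but the same block structure:
setdata(S::SketchBlockSparse, d::Vector) = SketchBlockSparse(d, S.blockoffsets)

# Maps like real/imag then only touch the stored elements, so no additional
# zero-valued blocks are ever allocated:
Base.real(S::SketchBlockSparse) = setdata(S, real.(data(S)))
Base.imag(S::SketchBlockSparse) = setdata(S, imag.(data(S)))
```

In this sketch, `real(S)` returns storage whose data is `real.(S.data)` while `S.blockoffsets` is carried over unchanged.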

Could you add DiagBlockSparse as well? That has been a corner case for these types of functions (https://github.com/ITensor/ITensors.jl/issues/485).

emstoudenmire commented 3 years ago

Sure thing. I just implemented setdata for DiagBlockSparse as well. Hopefully all the checks will still pass.

I also checked that dag of a DiagBlockSparse now seems to work correctly, apparently fixing ITensor/ITensors.jl#485.

codecov-io commented 3 years ago

Codecov Report

Merging #72 (4460981) into master (2dc95f3) will increase coverage by 2.58%. The diff coverage is 37.77%.


```diff
@@            Coverage Diff             @@
##           master      #72      +/-   ##
==========================================
+ Coverage   32.03%   34.61%   +2.58%
==========================================
  Files          24       24
  Lines        2819     2886      +67
==========================================
+ Hits          903      999      +96
+ Misses       1916     1887      -29
```
| Impacted Files | Coverage Δ |
|---|---|
| src/blocksparse/blocksparsetensor.jl | 43.78% `<ø>` (+2.22%) :arrow_up: |
| src/blocksparse/diagblocksparse.jl | 10.90% `<0.00%>` (+0.57%) :arrow_up: |
| src/empty.jl | 0.00% `<0.00%>` (ø) |
| src/diag.jl | 14.49% `<33.33%>` (+3.21%) :arrow_up: |
| src/blocksparse/blocksparse.jl | 13.92% `<42.85%>` (+0.73%) :arrow_up: |
| src/dense.jl | 46.93% `<50.00%>` (+4.11%) :arrow_up: |
| src/tensor.jl | 61.11% `<54.54%>` (+10.62%) :arrow_up: |
| src/tensorstorage.jl | 77.14% `<100.00%>` (+10.47%) :arrow_up: |
| src/linearalgebra.jl | 32.06% `<0.00%>` (-0.37%) :arrow_down: |
| src/svd.jl | 0.00% `<0.00%>` (ø) |
| ... and 13 more | |


mtfishman commented 3 years ago

This looks good to me. Is this ready to merge?

Looking it over, the only obvious remaining improvement is to use the new functionality to reduce code duplication. For example, per-storage definitions like:

```julia
Base.:*(D::BlockSparse, x::Number) = setdata(D, x * data(D))
```

should now work generically across all storage types, so they could be replaced by a single definition:

```julia
Base.:*(S::TensorStorage, x::Number) = setdata(S, x * data(S))
```

This could be left for future work, and there will be a long tail of changes like that. I'll list some of the ones I see in a separate issue.
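The consolidation suggested above can be sketched with a hypothetical minimal type hierarchy (illustrative names only, not NDTensors.jl's actual abstract type or constructors): one generic method on the abstract supertype replaces a separate `*` method per storage type.

```julia
# Hypothetical minimal storage hierarchy (not NDTensors.jl's real types):
abstract type SketchStorage end

struct SketchDense{T} <: SketchStorage
    data::Vector{T}
end

data(S::SketchDense) = S.data
setdata(S::SketchDense, d::Vector) = SketchDense(d)

# One generic definition covers every storage type that implements the
# data/setdata interface, instead of one method per concrete type:
Base.:*(S::SketchStorage, x::Number) = setdata(S, x * data(S))
```

With this, `SketchDense([1.0, 2.0]) * 3.0` produces a new `SketchDense` holding `[3.0, 6.0]`, and any future storage subtype gets scalar multiplication for free by implementing `data` and `setdata`.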