Jutho / TensorOperations.jl

Julia package for tensor contractions and related operations
https://jutho.github.io/TensorOperations.jl/stable/

Support to specify index costs together by tuple #85

Closed ho-oto closed 4 years ago

ho-oto commented 4 years ago

This pull request makes it possible to write code using the @tensoropt macro, such as

@tensoropt (a=>χ,b=>χ^2,c=>2*χ,d=>χ,e=>5,f=>2*χ) D1[a,b,c,d] := A[a,e,c,f]*B[g,d,e]*C[g,f,b]

more concisely and clearly, by grouping indices that share a cost into a tuple:

@tensoropt ((a,d)=>χ,b=>χ^2,(c,f)=>2*χ,e=>5) D3[a,b,c,d] := A[a,e,c,f]*B[g,d,e]*C[g,f,b]

It also enables us to use = instead of =>:

@tensoropt (a=χ,b=χ^2,c=2*χ,d=χ,e=5,f=2*χ) D2[a,b,c,d] := A[a,e,c,f]*B[g,d,e]*C[g,f,b]
@tensoropt ((a,d)=χ,b=χ^2,(c,f)=2*χ,e=5) D4[a,b,c,d] := A[a,e,c,f]*B[g,d,e]*C[g,f,b]
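The grouping semantics above can be illustrated outside Julia as well. The following is a minimal Python sketch (hypothetical helper name `flatten_costs`, not part of TensorOperations.jl) of how tuple-grouped cost specifications expand into one cost per index:

```python
# Hedged sketch, NOT the package's actual implementation: expand cost
# specifications whose keys are either a single index or a tuple of
# indices sharing one cost, mirroring ((a,d)=>χ, b=>χ^2, (c,f)=>2χ, e=>5).

def flatten_costs(specs):
    """Expand tuple-grouped cost entries into a per-index cost mapping."""
    costs = {}
    for key, cost in specs.items():
        indices = key if isinstance(key, tuple) else (key,)
        for index in indices:
            if index in costs:
                raise ValueError(f"index {index!r} specified twice")
            costs[index] = cost
    return costs

chi = 10
grouped = {("a", "d"): chi, "b": chi**2, ("c", "f"): 2 * chi, "e": 5}
print(flatten_costs(grouped))
# → {'a': 10, 'd': 10, 'b': 100, 'c': 20, 'f': 20, 'e': 5}
```

After flattening, both the grouped and the fully spelled-out forms yield identical per-index costs, which is why D1 through D4 in the examples above are equivalent.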
Jutho commented 4 years ago

Thanks, this seems like a useful addition. I'll try to review and merge soon.

Jutho commented 4 years ago

By the way, I notice you are a PhD student working on tensor networks and related topics. What are your research interests? We are always looking for people interested in tensor networks and with good coding skills, both as collaborators and as potential team members.

coveralls commented 4 years ago

Coverage Status

Coverage increased (+1.5%) to 72.181% when pulling 1bbce09598c89c6778bf6d5c8e3c6647e566764d on ho-oto:master into c9a951944c591600f530a59836dc7ea2eb661697 on Jutho:master.

coveralls commented 4 years ago

Coverage Status

Coverage increased (+1.5%) to 72.144% when pulling 5592056b5a0af8ba0783e540580980953407c586 on ho-oto:master into c9a951944c591600f530a59836dc7ea2eb661697 on Jutho:master.

codecov[bot] commented 4 years ago

Codecov Report

Merging #85 into master will increase coverage by 1.26%. The diff coverage is 100.00%.


@@            Coverage Diff             @@
##           master      #85      +/-   ##
==========================================
+ Coverage   72.75%   74.01%   +1.26%     
==========================================
  Files          22       22              
  Lines        1824     1774      -50     
==========================================
- Hits         1327     1313      -14     
+ Misses        497      461      -36     
Impacted Files                        Coverage Δ
src/indexnotation/optdata.jl          96.36% <100.00%> (+53.50%) ↑
src/implementation/indices.jl         91.30% <0.00%> (-6.57%) ↓
src/implementation/stridedarray.jl    82.32% <0.00%> (-1.52%) ↓
src/indexnotation/verifiers.jl        82.41% <0.00%> (-0.38%) ↓
src/functions/ncon.jl                 100.00% <0.00%> (ø)
src/functions/simple.jl               100.00% <0.00%> (ø)
src/indexnotation/optimaltree.jl      90.60% <0.00%> (+0.13%) ↑
src/implementation/diagonal.jl        70.47% <0.00%> (+1.17%) ↑
src/indexnotation/preprocessors.jl    60.00% <0.00%> (+2.62%) ↑
src/indexnotation/tensormacros.jl     68.33% <0.00%> (+6.66%) ↑
... and 1 more


ho-oto commented 4 years ago

Thank you for your interest in my profile!

My recent research interests include quantum thermalization, quantum many-body chaos, and quantum many-body scars. I'm very interested in the relationship between these topics and tensor networks.

I'm interested in the development of tensor-network algorithms as well. I think the application of differentiable-programming techniques to tensor-network algorithms is very interesting. I'm also interested in isometric tensor networks.

In my recent research with my collaborators (arXiv:2003.01705), I calculated the ground state of a spinful soft-core bosonic system with VUMPS. I believe my computation is correct, but I feel I should improve the algorithm for this kind of problem...

Jutho commented 4 years ago

I somewhat forgot about this and have been quite busy. I'll try to review and merge soon, but feel free to ping me if you don't see any change in the next few days.

ho-oto commented 4 years ago

@Jutho What is the status of this?

Jutho commented 4 years ago

In general this looks good. There is a bit of code duplication between the => and = ways of specifying costs, up to small differences caused by the fact that => yields an Expr(:call, :(=>), ...) whereas = yields an Expr(:(=), ...). Is the = way of specifying costs really important?

I am not necessarily opposed to it, but I think I initially chose something different (i.e. =>) because e.g. `a=5` reads strangely; it's rather `a=1:5`, i.e. `a` does not take this value but ranges up to this value. I am just wondering whether having slight syntax variations for the same thing is a good or maintainable practice.

ho-oto commented 4 years ago

I think your concerns about maintainability are valid. I've rewritten it to make the logic easier to understand.

I feel the = syntax is not so strange, since it is similar to Julia's NamedTuple syntax (https://docs.julialang.org/en/v1/base/base/#Core.NamedTuple)...

ho-oto commented 4 years ago

@Jutho Could you re-review this PR, if you have time?

Jutho commented 4 years ago

Thanks, this looks good. I am still not entirely convinced about the two different ways of specifying costs (i.e. => and =). One should then probably also wonder what happens if someone mixes the two (a=>4, b=5, c=>8, d=9). Currently this will error.

But I'll merge it anyway, it's not a big deal. For now, it is an undocumented feature.