Closed: ho-oto closed this pull request 4 years ago.
Thanks, this seems like a useful addition. I'll try to review and merge soon.
By the way, I notice you are a PhD student working on tensor networks etc. What are your research interests? We are always looking for people interested in tensor networks and with good coding skills, both as collaborators and as potential team members.
Merging #85 into master will increase coverage by 1.26%. The diff coverage is 100.00%.
Coverage Diff (master vs. #85):

| | master | #85 | +/- |
|---|---|---|---|
| Coverage | 72.75% | 74.01% | +1.26% |
| Files | 22 | 22 | |
| Lines | 1824 | 1774 | -50 |
| Hits | 1327 | 1313 | -14 |
| Misses | 497 | 461 | -36 |
| Impacted Files | Coverage Δ | |
|---|---|---|
| src/indexnotation/optdata.jl | 96.36% <100.00%> (+53.50%) | :arrow_up: |
| src/implementation/indices.jl | 91.30% <0.00%> (-6.57%) | :arrow_down: |
| src/implementation/stridedarray.jl | 82.32% <0.00%> (-1.52%) | :arrow_down: |
| src/indexnotation/verifiers.jl | 82.41% <0.00%> (-0.38%) | :arrow_down: |
| src/functions/ncon.jl | 100.00% <0.00%> (ø) | |
| src/functions/simple.jl | 100.00% <0.00%> (ø) | |
| src/indexnotation/optimaltree.jl | 90.60% <0.00%> (+0.13%) | :arrow_up: |
| src/implementation/diagonal.jl | 70.47% <0.00%> (+1.17%) | :arrow_up: |
| src/indexnotation/preprocessors.jl | 60.00% <0.00%> (+2.62%) | :arrow_up: |
| src/indexnotation/tensormacros.jl | 68.33% <0.00%> (+6.66%) | :arrow_up: |
| ... and 1 more | | |
Continue to review the full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data

Powered by Codecov. Last update c9a9519...1bbce09.
Thank you for your interest in my profile!
My recent research interests include quantum thermalization, quantum many-body chaos, and quantum many-body scars. I'm very interested in the relationship between these topics and tensor networks.
I'm interested in the development of tensor-network algorithms as well. I think the application of differentiable-programming techniques to tensor-network algorithms is very interesting, and I'm also interested in isometric tensor networks.
In my recent research with my collaborators (arXiv:2003.01705), I calculated the ground state of a spinful soft-core bosonic system with VUMPS. I believe my computation is correct, but I feel I should improve the algorithm for this kind of problem...
I somewhat forgot about this and have been quite busy. I'll try to review and merge soon, but feel free to ping me if you don't see any change in the next few days.
@Jutho What is the status of this?
In general this looks good. There is a bit of code duplication between the `=>` and `=` ways of specifying costs, up to small differences caused by the fact that `=>` is an `Expr(:call, :(=>), ...)` whereas `=` yields an `Expr(:(=), ...)`. Is the `=` way of specifying costs really important?
I am not necessarily opposed to it, but I think I initially chose something different (i.e. `=>`) because e.g. `a=5` reads strangely; the intended meaning is rather `a=1:5`, i.e. `a` does not take this value but ranges up to this value. I am just wondering whether having slightly different syntax variations for the same thing is a very good or maintainable practice.
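For reference, the parsing difference mentioned above can be checked in plain Julia; nothing in this sketch is specific to TensorOperations.jl:

```julia
# Both spellings parse to an Expr, but with different heads:
ex_pair   = :(a => 5)   # Expr(:call, :(=>), :a, 5)
ex_assign = :(a = 5)    # Expr(:(=), :a, 5)

@show ex_pair.head       # :call
@show ex_pair.args[1]    # :(=>)
@show ex_assign.head     # :(=)
```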
I think your concerns about maintainability are valid. I've rewritten it to make the logic easier to understand.
I feel the `=` syntax is not so strange, since it is similar to Julia's NamedTuple syntax (https://docs.julialang.org/en/v1/base/base/#Core.NamedTuple)...
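For comparison, this is the NamedTuple notation the analogy refers to; the values below are illustrative, not taken from the PR:

```julia
# NamedTuples are written with `=` inside a tuple, which is the look
# the `=` cost specification mimics:
nt = (a = 5, b = 3)
nt.a          # 5
typeof(nt)    # NamedTuple{(:a, :b), Tuple{Int64, Int64}}
```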
@Jutho Could you re-review this PR when you have time?
Thanks, this looks good. I am still not super convinced about the two different ways of specifying costs (i.e. `=>` and `=`). One should then probably also wonder what happens if someone mixes the two (`a=>4, b=5, c=>8, d=9`); currently this will error.
But I'll merge it anyway; it's not a big deal. For now, it is an undocumented feature.
This pull request makes it possible to write code that uses the `@tensoropt` macro in a shorter and clearer way, by grouping the index costs in a tuple. It also enables us to use `=` instead of `=>`; an illustrative sketch of both spellings is shown below.
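A minimal sketch of the two spellings, assuming the cost-specification form shown in the TensorOperations.jl documentation; the index names, costs, and array sizes here are illustrative and not taken from the PR:

```julia
using TensorOperations

A = randn(5, 5, 5, 5)
B = randn(5, 5, 5, 5)

# costs written as pairs (`=>`) and grouped in a tuple
@tensoropt (a => χ, b => χ^2, c => 2 * χ, e => 5) C1[a, b, c, d] := A[a, e, c, f] * B[f, d, e, b]

# the same costs written with `=`, the NamedTuple-like spelling added by this PR
@tensoropt (a = χ, b = χ^2, c = 2 * χ, e = 5) C2[a, b, c, d] := A[a, e, c, f] * B[f, d, e, b]
```

If I read the documentation correctly, the cost annotations only steer the contraction-order search and `χ` is treated symbolically by the macro, so it does not need to be defined at runtime or to match the actual array sizes.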