Closed chris-n-self closed 3 years ago
Hello @chris-n-self! Thanks for updating this PR. We checked the lines you've touched for PEP 8 issues, and found:
There are currently no PEP 8 issues detected in this Pull Request. Cheers! :beers:
Merging #85 (ccc3e67) into develop (f54f53c) will increase coverage by 0.07%. The diff coverage is 90.00%.
```
@@            Coverage Diff             @@
##           develop      #85     +/-  ##
===========================================
+ Coverage    81.71%   81.79%   +0.07%
===========================================
  Files           34       34
  Lines        13013    13126     +113
===========================================
+ Hits         10634    10736     +102
- Misses        2379     2390      +11
```
Impacted Files | Coverage Δ |
---|---|
quimb/tensor/optimize.py | 69.87% <87.71%> (-3.96%) :arrow_down: |
quimb/tensor/tensor_core.py | 77.43% <94.44%> (+0.07%) :arrow_up: |
quimb/tensor/drawing.py | 77.88% <100.00%> (+0.22%) :arrow_up: |
quimb/tensor/tensor_2d_tebd.py | 82.93% <100.00%> (ø) |
quimb/tensor/array_ops.py | 84.93% <0.00%> (-1.21%) :arrow_down: |
quimb/core.py | 95.07% <0.00%> (-0.15%) :arrow_down: |
quimb/__init__.py | 100.00% <0.00%> (ø) |
quimb/gen/operators.py | 98.29% <0.00%> (+0.06%) :arrow_up: |
... and 1 more | |
Continue to review full report at Codecov.
Legend - Click here to learn more
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update ee67688...ccc3e67. Read the comment docs.
I have refactored things so that the parse_network_to_backend function internally calls either an opt-in parser or an opt-out parser. Since it's more complicated now, I added an explanatory docstring to parse_network_to_backend.
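To make the two modes concrete, here is a minimal, self-contained sketch of the selection logic being described. This is an illustration only, not quimb's actual implementation: the function names, the dict-based tensor representation, and the argument names are all made up for the example.

```python
# Hypothetical sketch of opt-in vs opt-out tensor selection, of the kind
# parse_network_to_backend could dispatch between. Tensors are modelled
# as dicts with a "name" and a set of "tags".

def select_opt_in(tensors, opt_tags, constant_tags=()):
    """Opt-in: only tensors carrying one of `opt_tags` are optimised,
    unless they also carry a constant tag."""
    variables, constants = [], []
    for t in tensors:
        if t["tags"] & set(opt_tags) and not t["tags"] & set(constant_tags):
            variables.append(t)
        else:
            constants.append(t)
    return variables, constants


def select_opt_out(tensors, constant_tags):
    """Opt-out: every tensor is optimised unless it carries a constant tag."""
    variables, constants = [], []
    for t in tensors:
        (constants if t["tags"] & set(constant_tags) else variables).append(t)
    return variables, constants


tensors = [
    {"name": "A", "tags": {"OPT"}},
    {"name": "B", "tags": {"OPT", "CONST"}},
    {"name": "C", "tags": set()},
]
v, c = select_opt_in(tensors, {"OPT"}, {"CONST"})
print([t["name"] for t in v])  # ['A']
v, c = select_opt_out(tensors, {"CONST"})
print([t["name"] for t in v])  # ['A', 'C']
```

Note how the two modes differ only in the default: opt-in starts with nothing optimised, opt-out starts with everything optimised.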
@chris-n-self I think in both cases (especially in opt_in) constant_tags should not be ignored. They can still play an important role in keeping things simple for users.
Thanks for the changes @chris-n-self, will hopefully take a look today.
@rezah Do you mean you'd like to specify tags marking a sub-set of tensors to optimize, then a sub-sub-set of those tensors not to optimize? I think the logic here was that you can just untag the constant tensors instead.
@jcmgray Yes, I agree. Maybe in the way you are pointing out (untagging the constant tensors) it is clearer and less confusing, so users can be sure what they are optimizing, etc.
I agree with Johnnie. I think if there are too many overlapping subsets then it could get confusing which rules take precedence, i.e. if something has both an optimisation tag and a constant_tag, then you would have to remember that constant_tag overrules the optimisation tag, rather than the other way around.
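As a toy illustration of the precedence rule under discussion (the tag names and function here are invented for the example, not quimb code): a tensor carrying both an optimisation tag and a constant tag ends up constant.

```python
# Toy illustration: constant tags overrule optimisation tags.

def is_optimised(tensor_tags, opt_tags, constant_tags):
    """A tensor is optimised iff it matches an optimisation tag
    and does NOT match any constant tag (constant wins)."""
    tensor_tags = set(tensor_tags)
    return bool(tensor_tags & set(opt_tags)) and not tensor_tags & set(constant_tags)

# Tagged both 'VAR' and 'FIXED': the constant tag wins.
print(is_optimised({"VAR", "FIXED"}, {"VAR"}, {"FIXED"}))  # False
print(is_optimised({"VAR"}, {"VAR"}, {"FIXED"}))           # True
```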
All looks good to me, thanks for a great PR @chris-n-self! I'm going to start an issue collating various ways the docs might be improved, but a page on TNOptimizer is definitely overdue, if either of you have suggestions on what might be useful to include.
Added a 'shared_tags' kwarg to the TNOptimizer class; tensors with a shared_tags tag will be optimised together. Added a test to tests/test_tensor/test_optimizers.py to cover shared_tags.
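A minimal sketch of the shared_tags idea, under the assumption (stated here, not taken from the PR diff) that tensors carrying the same shared tag are backed by one parameter group, so an optimiser update moves them together. The grouping function and tag names below are hypothetical, not quimb's actual code.

```python
# Sketch of shared-parameter grouping: tensors with the same shared tag
# map to one parameter-group key; all other tensors get their own key.

def group_shared(tensors, shared_tags):
    """Map tensor names to parameter-group keys. `tensors` is a dict of
    name -> set of tags; tensors sharing a tag in `shared_tags` end up
    in the same group (and would share optimised parameters)."""
    groups = {}
    for name, tags in tensors.items():
        shared = sorted(set(tags) & set(shared_tags))
        key = ("shared", shared[0]) if shared else ("own", name)
        groups.setdefault(key, []).append(name)
    return groups


tensors = {
    "T1": {"KET", "G1"},
    "T2": {"KET", "G1"},  # shares 'G1' with T1 -> same parameter group
    "T3": {"KET"},
}
print(group_shared(tensors, {"G1"}))
# {('shared', 'G1'): ['T1', 'T2'], ('own', 'T3'): ['T3']}
```

This mirrors the described behaviour: T1 and T2 would be optimised together, while T3 keeps its own parameters.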