Closed carrascomj closed 4 years ago
Merging #682 into develop will decrease coverage by 0.01%. The diff coverage is 100%.
```diff
@@             Coverage Diff             @@
##           develop     #682      +/-  ##
===========================================
- Coverage    82.96%   82.94%   -0.02%
===========================================
  Files           49       49
  Lines         2630     2639       +9
  Branches       430      430
===========================================
+ Hits          2182     2189       +7
- Misses         356      358       +2
  Partials        92       92
```
| Impacted Files | Coverage Δ | |
|---|---|---|
| src/memote/suite/cli/reports.py | 75.55% <100%> (-1.89%) | :arrow_down: |
| src/memote/suite/api.py | 67.79% <100%> (+0.55%) | :arrow_up: |
| src/memote/suite/cli/config.py | 100% <100%> (ø) | :arrow_up: |
| src/memote/suite/cli/runner.py | 51.93% <100%> (+0.24%) | :arrow_up: |
| src/memote/support/consistency.py | 96.89% <100%> (+0.04%) | :arrow_up: |
| src/memote/experimental/checks.py | 60% <0%> (-5%) | :arrow_down: |
| src/memote/suite/collect.py | 84.21% <0%> (+2.63%) | :arrow_up: |
Continue to review full report at Codecov.
Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update b78de4c...d074a1c. Read the comment docs.
It would be really neat to make this an attribute on the `cobrapy.Configuration` class; we could then set it here through the command line option.
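Such an attribute does not exist in cobrapy yet. As a rough sketch of the idea, a `timeout` field on a singleton configuration (the pattern `cobrapy.Configuration` follows) could look like this; all names here are hypothetical:

```python
class Configuration:
    """Hypothetical singleton config, mimicking the cobrapy.Configuration pattern."""

    _instance = None

    def __new__(cls):
        # Always hand back the same instance so a timeout set anywhere
        # (e.g. from a CLI option) is visible everywhere.
        if cls._instance is None:
            cls._instance = super().__new__(cls)
            cls._instance.timeout = None  # seconds; None means no limit
        return cls._instance


# A CLI option handler could set it once...
Configuration().timeout = 10
# ...and model-building code would read the same shared value later.
print(Configuration().timeout)  # prints 10
```

The singleton means the CLI only has to write the value once, instead of threading a timeout argument through every call site.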
On the cobrapy side, it would also be really helpful to slightly modify FVA to handle the timeout case. Right now, for every reaction, the implementation only checks whether the value is `None`, and that is not the case here: `model.solver.status` is set to `time_out`, but the value of the solution is still a float.
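A minimal sketch of the distinction (the solver object and status strings below are stand-ins for illustration, not cobrapy's actual FVA code):

```python
# Stand-in for a `model.solver`-like object; names are assumptions.
class FakeSolver:
    def __init__(self, status, objective_value):
        self.status = status
        self.objective_value = objective_value

    def optimize(self):
        pass  # pretend to solve


def fva_bound(solver):
    """Return the objective value, or None if it cannot be trusted."""
    solver.optimize()
    value = solver.objective_value
    # Checking only `value is None` misses the time-out case: the solver
    # can stop early with status "time_out" while still reporting a float.
    if value is None or solver.status == "time_out":
        return None
    return value


print(fva_bound(FakeSolver("optimal", 1.5)))   # 1.5
print(fva_bound(FakeSolver("time_out", 1.5)))  # None
```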
This is probably better discussed in an issue on cobrapy, but I don't fully see your point. In the line above that one,

```python
value = _model.solver.objective.value
```

if the solver experienced a time-out, this should still be a valid value, shouldn't it?
Yes, it is a valid value, but I am not sure whether it should be considered a valid solution in the FVA loop or be discarded.
I think the `test_model` test with a big model, as you suggested, would be problematic: the same test could pass on Travis but fail on a faster local machine.
I believe a better test would use a model that is known to get stuck on FVA. Another option is to set the timeout to 0, so no additional model would be needed.
I have been trying some things with the test, but in the end I have just marked it as xfail. Maybe the only option is to work with a very large model, like you said @Midnighter.
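For reference, marking a test as an expected failure with pytest's `xfail` marker looks like this (a generic sketch; the test body and reason string are illustrative, not the actual memote test):

```python
import pytest


# Generic sketch: an xfail marker lets the suite record the failure
# without blocking CI, instead of the test hanging or hard-failing.
@pytest.mark.xfail(reason="FVA can get stuck due to numerical instabilities")
def test_find_reactions_unbounded_flux_default_condition():
    raise RuntimeError("placeholder for the flaky FVA check")
```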
- [x] Workaround for #680: `test_find_reactions_unbounded_flux_default_condition` in `test_consistency` can get stuck due to numerical instabilities while performing FVA.
- [x] Description of feature: add a parameter on the CLI to allow the user to provide a solver timeout value. With this timeout, the test doesn't get stuck and continues. Whenever the solver reaches the timeout, it outputs a `UserWarning`. The problem is that timed-out reactions won't be caught as Infeasible by the FVA implementation in cobrapy.
- [x] Decide whether to set a default timeout value or to temporarily set a value only for `test_find_reactions_unbounded_flux_default_condition` if none was provided.
- [x] Tests added/passed
- [ ] Add an entry to the next release
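The CLI parameter from the feature description can be sketched as follows. Memote's real CLI is built on click, but stdlib `argparse` is enough to show the shape of the option; the flag name is an assumption, not memote's actual interface:

```python
import argparse

# Hypothetical flag name; memote's real option may differ.
parser = argparse.ArgumentParser(
    description="Run the suite with an optional solver timeout."
)
parser.add_argument(
    "--solver-timeout",
    type=int,
    default=None,  # None means no limit, so existing behaviour is unchanged
    help="Timeout in seconds for each solver call.",
)

# With the flag: the solver gets a hard limit.
args = parser.parse_args(["--solver-timeout", "30"])
print(args.solver_timeout)  # 30

# Without the flag: the default keeps the solver unbounded.
assert parser.parse_args([]).solver_timeout is None
```

Keeping the default at `None` satisfies the "only set a value if none was provided" item: the test can supply its own temporary timeout when the user gives none.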