JuliaStochOpt / StructDualDynProg.jl

Implementation of SDDP (Stochastic Dual Dynamic Programming) using the StructJuMP modeling interface

Creation of AbstractStochasticProgram interface #14

Closed · blegat closed this 6 years ago

blegat commented 6 years ago

This big refactor improves the decoupling between the SDDP algorithm and the problem representation used by this package.

There are now five separate parts.
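As a rough sketch of what this decoupling looks like (only the AbstractStochasticProgram name and the probability(sp, edge) query appear elsewhere in this thread; every other name below is made up for illustration), the algorithm side only calls generic functions, and each problem representation specializes them:

```julia
# Sketch of the decoupling: the SDDP algorithm only calls generic functions on
# an AbstractStochasticProgram; concrete representations specialize them.
abstract type AbstractStochasticProgram end

function probability end      # probability(sp, edge): transition probability of `edge`
function outgoing_edges end   # outgoing_edges(sp, node): edges leaving `node` (hypothetical name)

# A toy concrete representation: an explicit graph storing one probability per edge.
struct ExplicitGraphProgram <: AbstractStochasticProgram
    children::Dict{Int, Vector{Int}}        # node => child nodes
    proba::Dict{Tuple{Int, Int}, Float64}   # (node, child) => transition probability
end

outgoing_edges(sp::ExplicitGraphProgram, node) = [(node, c) for c in sp.children[node]]
probability(sp::ExplicitGraphProgram, edge) = sp.proba[edge]

# The algorithm never touches `children` or `proba` directly, so swapping in a
# different representation (e.g. stagewise independent) does not affect it.
function expected_child_value(sp::AbstractStochasticProgram, node, value)
    return sum(probability(sp, e) * value(e[2]) for e in outgoing_edges(sp, node))
end
```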

codecov[bot] commented 6 years ago

Codecov Report

Merging #14 into master will increase coverage by 0.21%. The diff coverage is 91.66%.

@@            Coverage Diff             @@
##           master      #14      +/-   ##
==========================================
+ Coverage   87.93%   88.14%   +0.21%     
==========================================
  Files          14       14              
  Lines         870      869       -1     
==========================================
+ Hits          765      766       +1     
+ Misses        105      103       -2
Impacted Files       Coverage Δ
src/cutstore.jl      72.72% <100%> (ø)
src/waitandsee.jl    91.07% <100%> (+0.16%)
src/stochprog.jl     100% <100%> (ø)
src/interface.jl     100% <100%> (ø)
src/sampler.jl       89.13% <85.71%> (+0.24%)
src/nlds.jl          87.53% <87.5%> (-0.87%)
src/graph.jl         90.76% <90.76%> (-0.14%)
src/sddp.jl          83.87% <90.9%> (+0.53%)
src/cutgen.jl        89.28% <92.85%> (+1.05%)
src/path.jl          96.96% <93.75%> (-0.05%)
... and 1 more

odow commented 6 years ago

I'm going to need some time to go through this and understand the internals of your implementation a bit better.

coveralls commented 6 years ago

Coverage Status

Coverage increased (+0.08%) to 88.014% when pulling c5462baeebb4a5b1e17b917a9ca14ebd6d6f202c on stochprog into 9cfc4a8d69c6a9230e34484955eef7037e743078 on master.

coveralls commented 6 years ago

Coverage Status

Coverage increased (+0.1%) to 88.042% when pulling fc99a5a868e62ecc2991e5dac77b3e9a5f12ee2b on stochprog into 9cfc4a8d69c6a9230e34484955eef7037e743078 on master.

coveralls commented 6 years ago

Coverage Status

Coverage increased (+0.02%) to 87.953% when pulling 9dafe369c3fba09a74ebcfef9f75452094821506 on stochprog into 9cfc4a8d69c6a9230e34484955eef7037e743078 on master.

coveralls commented 6 years ago

Coverage Status

Coverage increased (+0.2%) to 88.147% when pulling 8dff241698dec7caa4b6b84b5c2a4c6f9d177429 on stochprog into 9cfc4a8d69c6a9230e34484955eef7037e743078 on master.

coveralls commented 6 years ago

Coverage Status

Coverage increased (+0.2%) to 88.147% when pulling c9b60d1b4e83c120858506eae157c330b1d15efd on stochprog into 9cfc4a8d69c6a9230e34484955eef7037e743078 on master.

odow commented 6 years ago

> A main issue is the way we model uncertainty: in this implementation we assign a probability to each edge outgoing from a given node, if I understood well. I do not know whether it is well suited if we have stagewise independence, that is, if we have only a single node per time step.

Stagewise independent noises live inside a node. Nodes are connected by edges which have probabilities associated with them. The edge and the noise inside a node are sampled independently.
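A minimal sketch of that sampling scheme (hypothetical types, not this package's API): pick an outgoing edge according to the edge probabilities, then independently pick a noise inside the destination node.

```julia
using Random

struct Node
    noises::Vector{Float64}        # stagewise-independent noise realizations inside the node
    noise_proba::Vector{Float64}   # their probabilities
end

struct Edge
    target::Int                    # index of the destination node
    proba::Float64                 # probability of taking this edge
end

# Draw an index according to a probability vector.
function sample_index(rng, proba)
    r, cum = rand(rng), 0.0
    for (i, p) in enumerate(proba)
        cum += p
        r <= cum && return i
    end
    return length(proba)
end

# Forward step: the edge and the noise are sampled independently.
function sample_transition(rng, edges::Vector{Edge}, nodes::Vector{Node})
    edge = edges[sample_index(rng, [e.proba for e in edges])]
    node = nodes[edge.target]
    return edge.target, node.noises[sample_index(rng, node.noise_proba)]
end

rng = Random.MersenneTwister(1234)
nodes = [Node([-1.0, 1.0], [0.5, 0.5]), Node([0.0], [1.0])]
edges = [Edge(1, 0.3), Edge(2, 0.7)]
target, noise = sample_transition(rng, edges, nodes)
```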

odow commented 6 years ago

> I think that the power of a generic implementation is that it is not specific to SDDP. We may also want to implement a proper Dynamic Programming solver, or other methods (such as the particle methods described by Dallagi, or spatial decomposition algorithms).

+100

coveralls commented 6 years ago

Coverage Status

Coverage increased (+0.2%) to 88.147% when pulling 2f2d3f2e6cc9fb4a8cb07a8629be1d731d32b5f8 on stochprog into 9cfc4a8d69c6a9230e34484955eef7037e743078 on master.

blegat commented 6 years ago

> A main issue is the way we model uncertainty: in this implementation we assign a probability to each edge outgoing from a given node, if I understood well. I do not know whether it is well suited if we have stagewise independence, that is, if we have only a single node per time step. Maybe we should implement uncertainty inside the node rather than in the edges, and point to the following node directly inside the definition of the uncertainty model. What do you think?

The interface tries to be agnostic to the particular representation of the stochastic problem. The probability(sp, edge) method has the advantage that it still makes sense in the Markovian case. If you have stagewise independence, you will clearly not store a Dict{Edge, Float64} internally but rather do what you do in StochDynamicProgramming, because that would be a lot more efficient. From the point of view of the SDDP algorithm, the way the lattice/graph is represented does not really matter, and probability(sp, edge) is as convenient as it could be, right? What advantage would we gain by constraining the interface to be less agnostic of the representation and closer to a simple stagewise-independent structure?
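For instance, a stagewise-independent representation could still answer probability(sp, edge) without storing a Dict{Edge, Float64}; in the hypothetical sketch below, the probability only depends on the stage and the child index.

```julia
# Hypothetical stagewise-independent representation: one probability vector per
# stage, shared by every node of that stage.
struct StagewiseIndependentProgram
    stage_proba::Vector{Vector{Float64}}
end

# An "edge" from the single node of stage `t` to its `child`-th node at stage `t + 1`.
struct StageEdge
    stage::Int
    child::Int
end

# Same query as in the graph case, but no per-edge storage is needed.
probability(sp::StagewiseIndependentProgram, edge::StageEdge) =
    sp.stage_proba[edge.stage][edge.child]
```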

blegat commented 6 years ago

@frapac Thinking a bit more about it, there is indeed an important issue here with cut sharing. If you generate an averaged optimality cut, you can share it with all the nodes that have the same children with the same probabilities as you. We could use the noise nodes of @odow's framework to model this: even in the case of a graph, if two nodes share the same noise node, they will share the same cuts. In the algorithm, when we generate an averaged cut, instead of sending it to a specific node, we can send it to the noise node.
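A sketch of that idea with made-up types (not the package's data structures): parents that point to the same noise node automatically see the cuts attached to it.

```julia
struct Cut
    slope::Vector{Float64}
    intercept::Float64
end

struct NoiseNode
    cuts::Vector{Cut}      # cuts shared by every parent pointing to this noise node
end

struct GraphNode
    noise::NoiseNode       # nodes with the same children point to the same noise node
end

# Send the averaged cut to the noise node instead of to one specific parent.
add_averaged_cut!(noise::NoiseNode, cut::Cut) = push!(noise.cuts, cut)

cuts_for(node::GraphNode) = node.noise.cuts

# Two parents sharing one noise node literally share the same vector of cuts.
shared = NoiseNode(Cut[])
a, b = GraphNode(shared), GraphNode(shared)
add_averaged_cut!(shared, Cut([1.0, -2.0], 3.0))
@assert cuts_for(a) === cuts_for(b)
```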

odow commented 6 years ago

I've moved away from explicit noise nodes to a hazard-decision and a decision-hazard thing https://www.sharelatex.com/project/59556d23c8f18fc31bc955d5

frapac commented 6 years ago

@blegat I agree with you on this particular issue. I think that your solution, combined with @odow's combination of nodes and noise nodes, would solve it. But I hope that this implementation would not add a new layer of complexity ...

odow commented 6 years ago

> If you generate an averaged optimality cut, you can share it with all the nodes that have the same children with the same probabilities as you.

Better to solve all the children; then, if another node has the same children, you can recompute the averaged cut (using possibly different probabilities) without re-solving.
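A hypothetical sketch of that scheme: the expensive child solves produce single-child cuts once, and each parent then averages them with its own probabilities.

```julia
struct ChildCut
    slope::Vector{Float64}
    intercept::Float64
end

# Average pre-computed child cuts with a parent-specific probability vector.
function averaged_cut(child_cuts::Vector{ChildCut}, proba::Vector{Float64})
    slope = sum(p .* c.slope for (p, c) in zip(proba, child_cuts))
    intercept = sum(p * c.intercept for (p, c) in zip(proba, child_cuts))
    return ChildCut(slope, intercept)
end

# Solve every child once (placeholder results here), then reuse the cuts:
child_cuts = [ChildCut([1.0, 0.0], 2.0), ChildCut([0.0, 1.0], 4.0)]

# Two parents with the same children but different transition probabilities
# each get an averaged cut without re-solving any child.
cut_parent1 = averaged_cut(child_cuts, [0.5, 0.5])
cut_parent2 = averaged_cut(child_cuts, [0.9, 0.1])
```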

blegat commented 6 years ago

> Better to solve all the children; then, if another node has the same children, you can recompute the averaged cut (using possibly different probabilities) without re-solving.

What would the interface be? Something like the following?

1. The SDDP algorithm informs the stochastic program that it has solved a set of nodes and gives it the corresponding cuts.
2. Internally, the stochastic program hashes the set of children to determine the nodes for which it can compute the averaged cut and gives them the new cuts (or does something faster in the case of more particular structure, such as stagewise independence without any graph structure).
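A hedged sketch of what those two steps could look like (all names below are invented to make the idea concrete; a specialized representation would replace the generic hashing with something faster):

```julia
struct NodeCut
    slope::Vector{Float64}
    intercept::Float64
end

struct StochasticProgramState
    children::Dict{Int, Vector{Int}}     # node => its children
    cuts::Dict{Int, Vector{NodeCut}}     # node => cuts attached to it
end

# Step 1: the SDDP algorithm reports the parent node whose children it solved,
# together with the resulting cuts. Step 2: the stochastic program hashes the
# set of children and forwards the cuts to every node with the same children.
function addcuts!(sp::StochasticProgramState, solved_parent::Int, newcuts::Vector{NodeCut})
    key = hash(Set(sp.children[solved_parent]))
    for (node, kids) in sp.children
        if hash(Set(kids)) == key
            append!(get!(sp.cuts, node, NodeCut[]), newcuts)
        end
    end
    return sp
end
```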

odow commented 6 years ago

Different backward pass sampling schemes would do it differently.

One approach is to recurse back up the tree. After adding a cut to a node, check the other children of its parent to see if it is possible to add a cut. That will work best for lattice structures.

There are probably a number of optimizations and different heuristics you can use for deciding which nodes (and when) to add a cut.
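One possible reading of that recursion, as a hypothetical sketch (not the package's backward pass): after a node is solved, walk up to its parent and, once every sibling has been handled in this pass, add a cut to the parent and keep recursing.

```julia
struct Tree
    parent::Dict{Int, Int}              # node => parent (absent for the root)
    children::Dict{Int, Vector{Int}}    # node => children
end

# `addcut!(p)` is a callback that computes and attaches the averaged cut at `p`.
function propagate_up!(addcut!, solved::Set{Int}, tree::Tree, node::Int)
    push!(solved, node)
    haskey(tree.parent, node) || return nothing
    p = tree.parent[node]
    # Only cut the parent once all of its children have been handled this pass.
    if all(c -> c in solved, tree.children[p])
        addcut!(p)
        propagate_up!(addcut!, solved, tree, p)
    end
    return nothing
end
```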

coveralls commented 6 years ago

Coverage Status

Coverage increased (+0.2%) to 88.147% when pulling 9931b198603243ddf3147eedf45d67f1ce575448 on stochprog into bece242d2b362e0a47515b0c69f25644ab8d0877 on master.

blegat commented 6 years ago

I see, so you mean the interface should remain more flexible so that the SDDP algorithm can make the choice.

I am going to merge this huge PR now; that does not mean I consider the discussion closed, but rather that we should transition the discussion to a StochasticProgram.jl package :)