moves-rwth / stormpy

Python Bindings for the Probabilistic Model Checker Storm
https://moves-rwth.github.io/stormpy/
GNU General Public License v3.0

Example for sampling from an MDP + Scheduler #14

Closed sjunges closed 2 years ago

sjunges commented 4 years ago

As pointed out in https://github.com/moves-rwth/storm/issues/66, it would be nice to have an example that shows how to create an MDP, compute an optimal policy, create the induced DTMC, and then sample paths from this DTMC. The paths should contain the variable valuations from the original PRISM model.

I will work on this.

sjunges commented 4 years ago

There is some progress:

One can sample from DTMCs or MDPs using our new simulator; see DTMCs and MDPs for two usage examples. The first also shows how to obtain the PRISM variable valuations.
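The simulator examples linked above show the actual stormpy API. As a minimal pure-Python sketch of what path sampling amounts to conceptually (the toy DTMC, the `valuations` map, and all function names here are made up for illustration and are not part of stormpy):

```python
import random

# Hypothetical two-state DTMC: row s maps successor state -> probability.
transitions = {
    0: {0: 0.5, 1: 0.5},
    1: {1: 1.0},
}
# For each state, the variable valuation it corresponds to in the
# original (imagined) PRISM model.
valuations = {
    0: {"coin": 0},
    1: {"coin": 1},
}

def sample_path(transitions, start, length, rng):
    """Sample one path of `length` steps, starting in `start`."""
    path = [start]
    state = start
    for _ in range(length):
        # Draw the successor according to the transition probabilities.
        states, probs = zip(*transitions[state].items())
        state = rng.choices(states, weights=probs, k=1)[0]
        path.append(state)
    return path

rng = random.Random(42)
path = sample_path(transitions, 0, 5, rng)
# Translate states back to variable valuations of the original model.
valued_path = [valuations[s] for s in path]
```

The real simulator does the state-to-valuation translation for you when asked for program-level observations.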

To simulate an MDP under an optimal scheduler, first compute the scheduler and apply it, as shown in Compute and Apply Scheduler. Then sample as above. (Notice that although all nondeterminism is gone, Storm still reports the resulting model as an MDP; this is part of the issue above.)
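Applying a memoryless deterministic scheduler just means keeping, in each state, only the distribution of the chosen action. A pure-Python sketch of that induction step (the toy MDP and all names are illustrative, not the stormpy API):

```python
# Hypothetical MDP: state -> action label -> {successor: probability}.
mdp = {
    0: {"a": {0: 0.5, 1: 0.5}, "b": {1: 1.0}},
    1: {"a": {1: 1.0}},
}
# A memoryless deterministic scheduler: state -> chosen action.
scheduler = {0: "b", 1: "a"}

def induce_dtmc(mdp, scheduler):
    """Keep, per state, only the distribution of the scheduler's action."""
    return {s: dict(actions[scheduler[s]]) for s, actions in mdp.items()}

dtmc = induce_dtmc(mdp, scheduler)
# dtmc == {0: {1: 1.0}, 1: {1: 1.0}}
```

Sampling from `dtmc` then works exactly as in the DTMC case, which is why the remaining MDP label on the induced model is only a reporting issue.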

sjunges commented 2 years ago

Compute-and-apply-scheduler now results in a DTMC if the scheduler is fully observed. If not, one can still use the scheduler during simulation. The latter is a bit rough around the edges, but it goes beyond what was asked for here.
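Using a (possibly partial) scheduler during simulation means querying it at every step and falling back to some other resolution of the nondeterminism where it is undefined. A pure-Python sketch of that loop, with a uniformly random fallback (again, the toy MDP and every name here are made up for illustration, not stormpy calls):

```python
import random

# Hypothetical MDP: state -> action label -> {successor: probability}.
mdp = {
    0: {"a": {0: 0.5, 1: 0.5}, "b": {1: 1.0}},
    1: {"a": {1: 1.0}},
}
# Partial scheduler: defined in state 0 only.
scheduler = {0: "b"}

def simulate(mdp, scheduler, start, steps, rng):
    """Simulate, resolving nondeterminism via the scheduler at each step;
    pick a uniformly random action where the scheduler is undefined."""
    state = start
    path = [state]
    for _ in range(steps):
        action = scheduler.get(state)
        if action is None:
            action = rng.choice(sorted(mdp[state]))
        succs, probs = zip(*mdp[state][action].items())
        state = rng.choices(succs, weights=probs, k=1)[0]
        path.append(state)
    return path

path = simulate(mdp, scheduler, 0, 4, random.Random(1))
# With this toy model the run is deterministic: 0 -> 1 -> 1 -> 1 -> 1.
```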