Closed EHadoux closed 10 years ago
At the moment policies are still linked via inheritance. This is mostly because each policy class is very similar to the others and they reuse lots of functionality, so there's little reason not to do that. However I'm not a true C++ expert myself, so maybe one day that will also change.
There isn't all that much Boost, is there? What are the parts that trouble you most? If you have any problems contributing just send me an email or write something here, I can help, or maybe write more docs where they are needed =)
Thanks for the comments!
Thanks for the clarification. It makes sense.
Actually you're right, it's not Boost but stuff like `std::is_same<decltype(test<M>(0)),std::true_type>::value && is_generative_model<M>::value`
that frightens me; there are more templates and namespaces there than I can handle for now :). However, I'll need to dive into it one day to use the new version of the lib (I think this version is no longer compatible with the one I use).
Yes, unfortunately I do tend to break things, but I'm not that experienced so I'm learning things as I go, and I hope that each change makes the whole better for it.
Those parts are there in general to simplify compiler errors for users (look up SFINAE); they're not really meant to be touched all that much! They let you specify template requirements in a simple way (so I can say, for example, that in an algorithm that uses an MDP, the MDP class must provide certain methods - kind of an abstract class). But sure, take your time, there's still much that I have to do anyway =)
I agree that hard inheritance can be restrictive, i.e. if POMDP > MDP, it will be hard to add a class in between. However, if for some reason I want to solve an MDP with POMCP, I cannot (due to the sampleSOR call in simulate, which doesn't exist in MDP). Either way, there is a downside. I don't know whether some kind of Java-style interfaces, like partially observable, completely observable, mixed-observable, etc., would work in order to specify more of an authorized behaviour than a strict model.
if for some reason I want to solve an MDP with POMCP, I cannot
Except by transforming the MDP to a POMDP of course.
Well POMCP is an algorithm to apply Monte Carlo Tree Search to POMDPs, so it makes little sense to apply it to MDPs. If you need Monte Carlo for MDPs, might as well use MCTS (coming soon btw). Edit: MCTS is now online for you to try out =)
Hi,
I have been following this work from the beginning and it keeps getting more interesting. Well done. However, I have two questions:
I will sadly refrain from contributing, as there is too much Boost for me to be fully efficient and sure not to break anything. Thank you and keep up the good work!
Emmanuel