JuliaStats / Klara.jl

MCMC inference in Julia

Add references to papers that describe the samplers #130

Open · davidanthoff opened this issue 8 years ago

davidanthoff commented 8 years ago

It would be great if the documentation just had 1-2 references to papers for each sampler. A second step would probably be to describe each sampler in the documentation, but that seems more work, and just a few references without any explanation would already help a lot. For example, I kind of assume the RAM sampler refers to this paper, but I'm not sure. Just a short link would clarify these things.

papamarkou commented 8 years ago

Thanks @davidanthoff. Firstly, I agree with you that the documentation has fallen behind. I have pretty much finished coding Gibbs sampling too (in master), so the next planned step is to update the documentation and port the rest of the samplers; it will be taken care of.

About how to write up the documentation, I am open to suggestions. Basically, I just signed a Springer contract last week to write a book on Monte Carlo methods with Julia, which means that Lora's samplers will be more than well documented in the book. I certainly don't want to end up writing the documentation from scratch, reiterating the book, so it may be sensible to just provide a reference to the paper used for each sampler, as you suggested, now that I am about to start updating the docs. What do you think from the user's point of view: would that suffice?

davidanthoff commented 8 years ago

Wow, that is great news (the book)! Very much looking forward to this.

I think in that case the optimal amount of documentation, in stages, would be:

Stage 1: Simply references to papers for each sampler/method, without any further explanation. That is important for us if we want to use this for work we want to publish, to make sure we know exactly what we are using.

Stage 2: API documentation, i.e. what options there are etc., but again without any explanation of the underlying theory/math/approach. Think of the target audience as someone who knows MCMC methods really well and just wants to map his/her knowledge onto your API.

And I think then the book would probably go in-depth into the various methods, explain them etc., right? If there is such a book, I don't think you need to have that material in the online documentation.

papamarkou commented 8 years ago

Yes, I agree with you; in principle the documentation to some extent becomes redundant. There are only three considerations: i) it would be nice for the community to be able to use Lora even if they don't wish to buy the book (since the goal is to offer free software to the community); ii) the book has a deadline of end of September 2017, so we want to make sure that users can use Lora before then...!; iii) the options and flexibility of Lora have grown to the point that even I sometimes forget all that has been done and need some sort of quick reference.

Your plan sounds good; I will stick to it and try to finish, ideally by the end of this month, a very quick update of the documentation without going into depth. At least stage 1 should be fast. The API documentation may take me a bit longer, as I balance it with teaching and other academic duties...

davidanthoff commented 8 years ago

Great, much appreciated!

davidanthoff commented 8 years ago

Oh, and is the RAM algorithm the one described in the paper I linked to at the top?

papamarkou commented 8 years ago

Yep, Vihola introduced the RAM algorithm; that's the right reference :+1:
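For readers who land on this issue: Vihola's RAM (Robust Adaptive Metropolis, Statistics and Computing, 2012) is a random-walk Metropolis sampler that adapts the Cholesky factor of the proposal covariance using the realized acceptance probability, steering the chain toward a target acceptance rate. The following is a minimal Python sketch of that core update, written from the paper, not Lora's implementation; the function and parameter names are my own.

```python
import numpy as np

def ram_sample(logpdf, x0, n_iter=5000, target_accept=0.234,
               gamma=0.66, seed=0):
    """Sketch of Robust Adaptive Metropolis (Vihola, 2012).

    logpdf: log target density (up to a constant), taking a 1-D array.
    Returns an (n_iter, d) array of samples.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    d = x.size
    S = np.eye(d)                       # factor of the proposal covariance
    logp = logpdf(x)
    chain = np.empty((n_iter, d))
    for n in range(1, n_iter + 1):
        u = rng.standard_normal(d)
        y = x + S @ u                   # proposal: Y = X + S U, U ~ N(0, I)
        logp_y = logpdf(y)
        alpha = min(1.0, np.exp(logp_y - logp))
        if rng.random() < alpha:
            x, logp = y, logp_y
        # Adapt S so that S S' shrinks/grows depending on whether the
        # realized acceptance probability is below/above the target:
        eta = min(1.0, d * n ** (-gamma))       # step size, decays to 0
        uu = np.outer(u, u) / (u @ u)
        M = S @ (np.eye(d) + eta * (alpha - target_accept) * uu) @ S.T
        S = np.linalg.cholesky(M)
        chain[n - 1] = x
    return chain
```

For example, running it on a standard normal target (`logpdf = lambda x: -0.5 * float(x @ x)`) recovers mean ≈ 0 and standard deviation ≈ 1 after burn-in. The point of the adaptation is that the proposal shape tracks the local geometry of the target without requiring the user to tune a step size by hand, which is presumably why it was picked up as one of Lora's samplers.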

davidanthoff commented 8 years ago

Thanks!