Closed nsmryan closed 7 years ago
Thanks for this. I will take a look over the next few days.
I have just noticed the same thing. From source diving, I think the description is wrong for both anyway: neither weightedCategorical nor categorical actually seems to assume that the weights sum to 1. (For both, during sampling, a random number between 0 and the sum of all the weights is chosen uniformly, whatever that sum may be.) The difference between the two appears to be that one removes zero-probability events and the other doesn't. The re-weighting so that weights sum to 1 may be worth mentioning, but it should only matter if the underlying probability type's uniform distribution doesn't respect scaling.
Hi, Dominic, and happy new year!
I just stumbled upon this documentation issue, too (and probably not for the first time...)
Could you fix it up?
I am looking now...
Ok I pushed a documentation change.
BTW, I am not totally convinced about performance here. I haven't investigated the existing code, but it can be quicker to use the inverse CDF with a uniform sampler, and it's not obvious to me that this is what the code is doing.
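For reference, the inverse-CDF approach mentioned above amounts to precomputing the cumulative weights once and then answering each sample with a single uniform draw plus a binary search. A hedged Python sketch (again, not the library's code):

```python
import bisect
import itertools
import random

def make_categorical_sampler(weighted_events):
    """Build a sampler from (weight, event) pairs using the inverse CDF.

    The cumulative weights are computed once up front; each sample is then
    one uniform draw and an O(log n) binary search, instead of an O(n)
    linear walk over the weights.
    """
    events = [e for _, e in weighted_events]
    cumulative = list(itertools.accumulate(w for w, _ in weighted_events))
    total = cumulative[-1]

    def sample():
        r = random.uniform(0, total)
        # bisect_right finds the first cumulative weight exceeding r.
        i = bisect.bisect_right(cumulative, r)
        return events[min(i, len(events) - 1)]

    return sample
```

Whether this pays off depends on how many samples are drawn from the same distribution; the setup cost is only amortized across repeated draws.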
The documentation string for weightedCategorical says that the weights must sum to 1. I believe this is a copy-paste issue, since weightedCategorical seems to normalize the weights and would otherwise be identical to categorical. This applies to weightedCategoricalT as well.