google-deepmind / torch-distributions

Probability distributions, wrapped for Torch.

Missing support for cat.rnd(p) with p a matrix whose rows define category probabilities #23

Closed: gdesjardins closed this issue 10 years ago

gdesjardins commented 10 years ago

This would be a convenience method. If p is an NxM tensor, the result would be N independent samples, one from each of N categorical distributions; the i-th sample would be drawn with the probabilities given by the row p[i].
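A minimal sketch of the requested behaviour, using torch.multinomial (which already accepts a matrix of row-wise probabilities) as a stand-in rather than the proposed cat.rnd API; the sizes and values below are illustrative only:

```lua
-- Sketch only: one categorical sample per row of p, where p is N-by-M
-- and row i holds the category probabilities for the i-th sample.
require 'torch'

local p = torch.Tensor{{0.1, 0.9},
                       {0.7, 0.3},
                       {0.5, 0.5}}   -- N = 3 distributions over M = 2 categories

-- torch.multinomial with a 2D input draws independently from each row;
-- the result is an N-by-1 LongTensor of category indices.
local samples = torch.multinomial(p, 1, true)
print(samples)
```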

Use case: sampling from the output of a neural network with a softmax output layer. Here N corresponds to the minibatch size. When used in conjunction with torch.nn, this would allow for the following:

```lua
-- x: torch.Tensor
sampledOutputs = distributions.cat(module:forward(x))
```
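As a hedged illustration of that use case, the following uses torch.multinomial in place of the proposed distributions.cat call; the network sizes are arbitrary:

```lua
-- Minibatch use case: sample one category per row of a softmax output.
require 'torch'
require 'nn'

local module = nn.Sequential()
module:add(nn.Linear(10, 5))
module:add(nn.SoftMax())                  -- each output row sums to 1

local x = torch.randn(4, 10)              -- minibatch of N = 4 examples
local probs = module:forward(x)           -- 4-by-5 matrix of row-wise probabilities
local sampledOutputs = torch.multinomial(probs, 1, true)  -- one index per row
print(sampledOutputs)
```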

jucor commented 10 years ago

Doable :) Should the result be an N-by-1 matrix (which generalizes to N-by-K if you want K i.i.d. samples from each of the N distributions), or do you prefer a vector of length N? Or do you want it to depend on whether you specify K?
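For what it's worth, torch.multinomial already follows the N-by-K convention when K samples per row are requested, which may be a useful precedent (sizes below are illustrative):

```lua
-- N = 2 distributions, K = 3 samples each: result is a 2-by-3 LongTensor.
local p = torch.Tensor{{0.2, 0.8},
                       {0.6, 0.4}}
print(torch.multinomial(p, 3, true))
```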

gdesjardins commented 10 years ago

I like the idea of returning N-by-K when a K is specified. As for N-by-1 vs. a vector of length N when K=1, I'm ambivalent; it depends on whether there's a standard or convention in place.

Alternatively, having a one-hot option for neural nets (only valid with K=1) would be nice. The return value would be an NxM tensor with a single 1 per row, corresponding to the sampled category. This might be a whole other function, though...
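A rough sketch of how such a one-hot option could look, built on torch.multinomial; the helper name to_one_hot is hypothetical and not part of the library:

```lua
-- Hypothetical helper: turn an N-by-1 tensor of category indices into an
-- N-by-M tensor with a single 1 per row.
local function to_one_hot(indices, M)
   local N = indices:size(1)
   local out = torch.zeros(N, M)
   for i = 1, N do
      out[i][indices[i][1]] = 1
   end
   return out
end

local p = torch.Tensor{{0.1, 0.9}, {0.7, 0.3}}
local idx = torch.multinomial(p, 1, true)   -- N-by-1 sampled indices
print(to_one_hot(idx, p:size(2)))           -- N-by-M, one-hot rows
```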

jucor commented 10 years ago

And voilà :) Check out the doc. I've named the vectorized categorical sampling mvcat, but I'm open to better names: any suggestions?
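A possible call pattern, assuming mvcat is exposed under the distributions table and takes the probability matrix directly; this signature is an assumption, so consult the linked doc for the actual one:

```lua
-- Assumed usage only; the real signature is in the torch-distributions doc.
local distributions = require 'distributions'
local p = torch.Tensor{{0.1, 0.9}, {0.7, 0.3}}
local samples = distributions.mvcat(p)   -- hypothetical: one sample per row of p
```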