pytorch / pytorch

Tensors and Dynamic neural networks in Python with strong GPU acceleration
https://pytorch.org

Add Plackett-Luce distribution #38684

Open agadetsky opened 4 years ago

agadetsky commented 4 years ago

🚀 Feature

Add PlackettLuce and RelaxedPlackettLuce distributions. The Plackett-Luce distribution is a simple distribution over permutations.
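For context, the Plackett-Luce model assigns a score (logit) to each item and generates a ranking by repeatedly sampling one item from a softmax over the items that remain, without replacement. A minimal sketch of that generative process (the helper name is mine, not a proposed API):

```python
import torch

def sample_pl_sequential(logits):
    """Draw one permutation from a Plackett-Luce model via its generative
    process: pick items one at a time from a softmax over the remainder."""
    remaining = torch.arange(logits.shape[-1])
    scores = logits.clone()
    perm = []
    for _ in range(logits.shape[-1]):
        # Sample one index from the softmax over the still-available items.
        idx = torch.multinomial(torch.softmax(scores, dim=-1), 1).item()
        perm.append(int(remaining[idx]))
        keep = torch.arange(scores.shape[-1]) != idx
        remaining, scores = remaining[keep], scores[keep]
    return perm
```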

Motivation

For optimization over categorical/binary variables (e.g. variational inference with discrete variational approximate posteriors, RL, etc.) we can already use RelaxedBernoulli/RelaxedOneHotCategorical or Bernoulli/OneHotCategorical, depending on whether the method is relaxation-based or not.

Also, the same Gumbel-Softmax trick applies to the distribution over the set of permutations, the Plackett-Luce distribution. You can see successful applications in https://arxiv.org/abs/1911.10036 and https://arxiv.org/abs/1903.08850.
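Concretely, the Gumbel trick for Plackett-Luce samples a permutation by perturbing the logits with i.i.d. Gumbel(0, 1) noise and sorting in descending order; a relaxed version would replace the hard argsort with a differentiable relaxation. A quick sketch (the function name is mine, not a proposed API):

```python
import torch

def sample_plackett_luce(logits, num_samples):
    """Sample permutations from a Plackett-Luce model: perturb the logits
    with Gumbel(0, 1) noise and argsort in descending order."""
    # -log(Exp(1)) is distributed Gumbel(0, 1).
    gumbels = -torch.empty(num_samples, logits.shape[-1]).exponential_().log()
    return torch.argsort(logits + gumbels, dim=-1, descending=True)
```

Sampling this way is O(n log n) per draw and batches trivially.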

Pitch

TensorFlow has tfp.distributions.PlackettLuce. I think it would be very useful to have it in PyTorch as well.

Additional context

An efficient implementation will need a numerically stable LogCumSumExp function, which is almost finished in PR https://github.com/pytorch/pytorch/pull/36308.
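To illustrate why LogCumSumExp matters here: the Plackett-Luce log-probability of a permutation is a sum of terms `logits[perm[i]] - logsumexp(logits[perm[i:]])`, and all the denominators can be computed stably in a single pass with a reversed logcumsumexp. A sketch, assuming `torch.logcumsumexp` is available:

```python
import torch

def plackett_luce_log_prob(logits, perm):
    """log P(perm) = sum_i (logits[perm[i]] - logsumexp(logits[perm[i:]])).
    A reversed logcumsumexp yields every denominator in one pass."""
    ordered = logits[perm]
    denoms = torch.logcumsumexp(ordered.flip(-1), dim=-1).flip(-1)
    return (ordered - denoms).sum(-1)
```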

EvanZ commented 3 years ago

I was looking to see whether this had been implemented, but apparently not. I did find a project that implements the PL distribution in PyTorch, so I will try that: https://github.com/agadetsky/pytorch-pl-variance-reduction

jeremysalwen commented 3 years ago

We have an implementation of the Plackett-Luce distribution supporting batching with permutations of varying length. I'd be happy to contribute it to pytorch if there is interest.
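For reference, here is one way batching over varying-length permutations can work (my sketch, not necessarily what the linked implementation does): pad each permutation, push padded positions to -inf before the reversed logcumsumexp so they contribute nothing to the denominators, and zero out their terms in the final sum. This assumes each row is a full permutation of its own, possibly smaller, item set.

```python
import torch

def pl_log_prob_padded(logits, perms, lengths):
    """Batched Plackett-Luce log-probabilities for permutations of varying
    length; perms is padded with any valid index past each row's length."""
    n = perms.shape[-1]
    mask = torch.arange(n) < lengths.unsqueeze(-1)  # valid positions
    ordered = logits.gather(-1, perms)
    # Padded items become -inf, so exp(-inf) = 0 in the denominators.
    denoms = torch.logcumsumexp(
        ordered.masked_fill(~mask, float("-inf")).flip(-1), dim=-1).flip(-1)
    return (ordered - denoms).masked_fill(~mask, 0.0).sum(-1)
```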

https://github.com/JDBumgardner/stone_ground_hearth_battles/commit/0137be0ff53c6e092c741e9b4c06986b9f7de971