ocbe-uio / BayesMallows

R-package for Bayesian preference learning with the Mallows rank model.
https://ocbe-uio.github.io/BayesMallows/
GNU General Public License v3.0

Heat plot is wrong with thinning #381

Closed by osorensen 9 months ago

osorensen commented 9 months ago

It seems we are dividing by the nmc parameter provided in compute_options rather than by the actual number of stored iterations. Thanks to Marta for discovering this.

library(BayesMallows)
set.seed(1)
mod <- compute_mallows(
  data = setup_rank_data(potato_visual),
  compute_options = set_compute_options(nmc = 10000, burnin = 400)
)

plot(mod, parameter = "rho", items = 12)

heat_plot(mod)


mod <- compute_mallows(
  data = setup_rank_data(potato_visual),
  compute_options = set_compute_options(nmc = 10000, burnin = 400, rho_thinning = 10)
)

plot(mod, parameter = "rho", items = 12)

heat_plot(mod)

Created on 2024-02-16 with reprex v2.1.0
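
To spell out the arithmetic: with rho_thinning = 10, only about a tenth of the iterations are stored, so dividing the rank counts by nmc instead of by the number of stored post-burn-in samples shrinks every heat-plot probability by roughly a factor of ten. A minimal sketch, assuming a storage scheme where every tenth iteration after burn-in is kept (the package's actual bookkeeping may differ):

nmc <- 10000
burnin <- 400
rho_thinning <- 10

# Iterations kept after burn-in and thinning (assumed scheme, for illustration only)
kept <- seq(burnin + rho_thinning, nmc, by = rho_thinning)
n_stored <- length(kept)  # 960 under this assumption

# Suppose item 12 was ranked 3rd in 600 of the stored samples
count <- 600
count / nmc       # wrong: about 0.06, shrunk by the thinning factor
count / n_stored  # correct: 0.625, a proper posterior probability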

osorensen commented 9 months ago

Good now:

library(BayesMallows)
set.seed(1)
mod <- compute_mallows(
  data = setup_rank_data(potato_visual),
  compute_options = set_compute_options(nmc = 10000, burnin = 400)
)

plot(mod, parameter = "rho", items = 12)

heat_plot(mod)


mod <- compute_mallows(
  data = setup_rank_data(potato_visual),
  compute_options = set_compute_options(nmc = 10000, burnin = 400, rho_thinning = 10)
)

plot(mod, parameter = "rho", items = 12)

heat_plot(mod)

Created on 2024-02-16 with reprex v2.1.0
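
One way to sanity-check the fix is to look at the data behind the heat plot and verify that the probabilities sum to 1 for each item, with and without rho_thinning. This is only a sketch: whether the data are attached at the plot level and what the columns are called are assumptions that may vary across BayesMallows versions.

p <- heat_plot(mod)
# ggplot objects typically carry the plot-level data in p$data; inspect it first
str(p$data)
# If an item column and a probability column are present (names assumed here),
# the per-item sums should all be approximately 1, e.g.:
# aggregate(probability ~ item, data = p$data, FUN = sum)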

wleoncio commented 9 months ago

osorensen commented 9 months ago

Look at the probabilities legend in the lower plot.

osorensen commented 9 months ago

Before

[heat plot image before the fix]

After

[heat plot image after the fix]

wleoncio commented 9 months ago

Oh good, thanks for pointing that out. I was obsessing over the plots and totally forgot to look at the legends.