
Forecasts occasionally don't go through. #1262

Open nangchrys opened 3 days ago

nangchrys commented 3 days ago

Example question, but this can happen on any numeric question, and maybe date questions too.

Behavior: I click "Save Change", it becomes grayed out and 3 dots appear below as if something is loading, but less than a second later everything looks as if I hadn't predicted, and indeed my prediction hasn't gone through after reloading. No visible error message.

Inspector message: (screenshot attached)

Forecast that triggers it: (screenshot attached)

The error is reproducible; pressing the button again does not fix it. It can be worked around by adjusting any of the weight sliders, but it reappears at different settings. E.g. in the screenshot above, it happens if my bottom weight slider is at 100%, stops at around 98%, but happens again closer to 95%. It also happens at ~50% and many other values.

It's not that a slider needs to be close to 0; this forecast doesn't go through either: (screenshot attached)

lsabor commented 2 days ago

Connected to #1210. The first step will be better error statements. A future implementation will guarantee that slider settings can't create invalid forecasts.
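
For context, a rough, hypothetical sketch of the kind of server-side check a slider-generated CDF might be tripping. The function name `validate_cdf`, the `min_step` threshold, and the CDF representation are all assumptions for illustration, not the actual Metaculus code:

```python
# Hypothetical sketch of the kind of validation a slider-generated CDF
# could be failing. Names and thresholds are assumptions, not the actual
# Metaculus implementation.
from typing import Sequence


def validate_cdf(cdf: Sequence[float], min_step: float = 5e-05) -> list[str]:
    """Return a list of human-readable problems with a discretized CDF."""
    errors: list[str] = []
    if any(v < 0.0 or v > 1.0 for v in cdf):
        errors.append("CDF values must lie in [0, 1]")
    # A slider pushed to an extreme can make adjacent CDF points differ by
    # less than the minimum allowed step after float arithmetic.
    for i in range(1, len(cdf)):
        if cdf[i] - cdf[i - 1] < min_step:
            errors.append(
                f"step between points {i - 1} and {i} is below {min_step}"
            )
            break
    return errors


if __name__ == "__main__":
    # A nearly flat tail like this is the sort of shape an extreme slider
    # setting can produce.
    cdf = [0.0, 0.2, 0.5, 0.5 + 1e-09, 1.0]
    print(validate_cdf(cdf))  # lists the offending step instead of failing silently
```

Returning a list of problems rather than raising on the first one is what would make the promised "better error statements" possible to surface later in the UI.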

lsabor commented 2 days ago

Oh, and make those error statements actually appear in the UI upon prediction submission - not just in the debug console.

lsabor commented 1 day ago

#1210 is now in QA. You should see fewer bugs and better console log statements.

I created a separate ticket for displaying the debug statements to users in the UI: #1296

A minor change to the CDF calculation that rounds to 10 digits might have resolved the seemingly random validation errors you've documented.
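
A minimal sketch of the rounding idea, assuming the forecast CDF is handled as a list of floats; `round_cdf` and the example values are illustrative only, not the actual patch:

```python
# Minimal sketch of the rounding idea described above, assuming the CDF is
# a list of floats; an illustration, not the actual change.
def round_cdf(cdf: list[float], digits: int = 10) -> list[float]:
    """Round each CDF point so float noise below 1e-10 can't flip
    strict comparisons during validation."""
    return [round(v, digits) for v in cdf]


# Example: two values that differ only by float noise become equal after
# rounding, so a comparison like `a < b` behaves deterministically.
a, b = 0.3, 0.1 + 0.2                          # b is 0.30000000000000004
print(a < b)                                   # True, purely due to float error
print(round_cdf([a])[0] < round_cdf([b])[0])   # False after rounding
```

That would explain why the failures looked random to the reporter: whether a given slider position produced the offending float noise depended on the exact values, not on anything visible in the UI.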

I'm moving this to QA accordingly.