Closed: dgant closed this issue 4 years ago
Thanks @dgant for bringing up this issue and @chriscoxe for your fix! I think Chris's solution is as accurate as we can get, if I understand correctly what's happening in BWAPI. With this change, a module bot would have to take the same amount of time (according to getLastEventTime()) for every event on a frame to be undercounted, which is unlikely. I'll test it and then pull it soon.
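To illustrate the reasoning above, here is a minimal sketch of the accounting change as I understand it; the function names and millisecond values are hypothetical, not BWAPI's or the Tournament Module's actual code:

```cpp
#include <algorithm>
#include <vector>

// Old behavior (sketch): sum the reported time of every event on a frame.
// Client bots report the same cumulative frame time for each event, so a
// frame with N events was charged roughly N times its real cost.
int frameTimeSummedMs(const std::vector<int>& eventTimesMs) {
    int total = 0;
    for (int t : eventTimesMs) total += t;
    return total;
}

// Fixed behavior (sketch): take the maximum reported time on the frame.
// A module bot is only undercounted if every one of its events happens to
// report exactly the same time on a frame, which is unlikely.
int frameTimeMaxMs(const std::vector<int>& eventTimesMs) {
    int best = 0;
    for (int t : eventTimesMs) best = std::max(best, t);
    return best;
}
```

For example, a client bot reporting 12 ms for each of three events on a frame would have been charged 36 ms before and 12 ms after.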
Follow-up @dgant - I tested Chris's update, which is now pulled, with the 4.4.0 bots currently on the ladder and actually found no change in the number of timeouts for PurpleWave. I was surprised, so I made some different versions that output debugging info to gamestate.txt and confirmed it was working correctly.
The 4.4.0 client bots were always reporting the same event time for every event in a frame, as expected, so with the fix only the time from the first event called on a frame was being counted for those bots.
Also, when I timed the frames in the Tournament Module using the Timer class, measuring from onFrame() to onFrame(), every frame that BWAPI measured as over 55ms for PurpleWave was also measured as over 55ms by the Tournament Module. Obviously onFrame()-to-onFrame() measures a little extra, and would be wrong if the other bot goes over time, but it is pretty close to the BWAPI time for each frame.
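The onFrame()-to-onFrame() measurement described above could be sketched like this; this is a minimal illustration using std::chrono, not the Tournament Module's actual Timer class:

```cpp
#include <chrono>

// Sketch of timing successive onFrame() calls with a steady clock.
// Each call returns the milliseconds elapsed since the previous call,
// which includes everything between frames (the other bot's work,
// BWAPI overhead), so it slightly overestimates the bot's own time.
class FrameTimer {
    using Clock = std::chrono::steady_clock;
    Clock::time_point last = Clock::now();
public:
    double tickMs() {
        auto now = Clock::now();
        double ms = std::chrono::duration<double, std::milli>(now - last).count();
        last = now;
        return ms;
    }
};
```

Calling tickMs() at the top of each onFrame() would give the per-frame wall-clock time to compare against BWAPI's reported value.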
So I'm satisfied that the frames are actually going over time (now).
> which is now pulled
Quick clarification: Is it running live on the AIIDE ladder right now?
Yes it is! It has been since I made the update to the repo.
BWAPI 4.4 provides very different timing information for client and module bots. This bug causes the current 4.4 Tournament Module to miscalculate the amount of time a client bot has spent running, usually vastly overestimating it.
Empirically I've observed this happening to PurpleWave on the AIIDE ladder. In one game where it got disqualified after 7:29 game minutes and under 4 real-time minutes: