Open GoogleCodeExporter opened 8 years ago
I'm sure some of my game-rate/TAS detection code is still in there. We can use
that. (Basically, it would mark a game as a TAS when it was played below 95% of
normal rate on average.)
Original comment by Zirc...@gmail.com
on 1 Nov 2010 at 4:02
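The rate check described above could be sketched like this in Java (class and method names are my own invention, not the project's actual code):

```java
// Sketch of a rate-based TAS check: flag a replay when its average playback
// rate falls below 95% of real time. Names here are assumptions.
public class TasCheck {
    /** Ratio of expected (frame-derived) time to actual wall-clock time. */
    public static double averageRate(long frames, long realMillis) {
        double expectedMillis = frames * 1000.0 / 60.0; // game logic runs at 60 Hz
        return expectedMillis / realMillis;
    }

    public static boolean looksLikeTas(long frames, long realMillis) {
        return averageRate(frames, realMillis) < 0.95;
    }

    public static void main(String[] args) {
        System.out.println(looksLikeTas(1500, 25_000)); // false: exactly 100% rate
        System.out.println(looksLikeTas(1500, 27_000)); // true: ~92.6% rate
    }
}
```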
That also sounds good.
I'd suggest a simple integrity check on the actual real time of a game,
without getting into high-precision timers and the like. Just take a normal
timestamp at the start and end of the game (with a precision of some tens of
ms, e.g. from the OS scheduler) and check whether the actual time played
differs too much from the calculated game time (measured in frames), e.g. by
more than 100ms.
This should be enough to detect most games that would produce invalid
high scores by being played at a low frame rate.
Original comment by bob.ins...@gmail.com
on 6 Nov 2010 at 3:19
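The integrity check proposed above could look roughly like this (a minimal sketch with invented names, assuming 60 fps logic and a 100ms tolerance as suggested):

```java
// Compare wall-clock duration against the frame-derived game time and
// reject the score if they differ by more than a fixed margin.
public class TimingCheck {
    static final double FPS = 60.0;
    static final long TOLERANCE_MS = 100; // margin suggested in the comment

    public static boolean isPlausible(long startMillis, long endMillis, long framesPlayed) {
        long realMs = endMillis - startMillis;                     // OS timestamp delta
        long gameMs = Math.round(framesPlayed * 1000.0 / FPS);     // frame-based time
        return Math.abs(realMs - gameMs) <= TOLERANCE_MS;
    }

    public static void main(String[] args) {
        // 1500 frames at 60 fps should take 25,000 ms of real time.
        System.out.println(isPlausible(0, 25_000, 1500)); // true
        System.out.println(isPlausible(0, 26_140, 1500)); // false: ~1.1 s too slow
    }
}
```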
That's basically what my TAS detection does. 100ms isn't a very wide margin,
though; I've had replays save at a 98% rate (which is a much larger difference
than 100ms), so a margin that small would probably not work very well.
Original comment by Zirc...@gmail.com
on 8 Nov 2010 at 8:16
Hm, interesting to know. If you say > 100ms is not very uncommon, the current
timer is IMO not usable for timed high scores on a real leaderboard then.
I mean, if someone can easily get a time that is 0.1 sec better (and won't even
feel much/any lag, because the frame rate is still about 60Hz) just by running
the game on a slower computer, that should not be allowed, or at least not
regarded as a valid record, e.g. for 40L.
Original comment by bob.ins...@gmail.com
on 8 Nov 2010 at 9:51
This gives me an idea: record the system ticks at the start, and again at every
60th frame. During replay playback, add a second FPS counter above the usual
one that displays the FPS the replay was recorded at, by taking
60/(timestamp[t]-timestamp[t-1]) and updating it every 60th frame.
This would let you see just where the replay was slowed down, since a
constant 57 FPS is a different story than playing the first 90% at 60 FPS and
the final 10% at 30 FPS, but both would have an average rate of 95%.
Original comment by Poochy.EXE
on 9 Nov 2010 at 1:48
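The per-segment FPS calculation described above could be implemented like this (a standalone sketch; the class and array layout are assumptions):

```java
// Derive the recorded FPS of each 60-frame segment from timestamps taken
// at every 60th frame, as proposed above.
public class ReplayFpsCounter {
    /** timestampsNanos[i] is the system tick (nanoseconds) taken at frame i*60. */
    public static double[] segmentFps(long[] timestampsNanos) {
        double[] fps = new double[timestampsNanos.length - 1];
        for (int t = 1; t < timestampsNanos.length; t++) {
            double seconds = (timestampsNanos[t] - timestampsNanos[t - 1]) / 1e9;
            fps[t - 1] = 60.0 / seconds; // 60 frames elapsed between stamps
        }
        return fps;
    }

    public static void main(String[] args) {
        // 60 frames in one second -> 60 fps; 60 frames in two seconds -> 30 fps.
        long s = 1_000_000_000L;
        double[] fps = segmentFps(new long[] {0, s, 3 * s});
        System.out.printf("%.1f %.1f%n", fps[0], fps[1]); // 60.0 30.0
    }
}
```

Displaying this series during playback makes the "constant 57 FPS vs. 30 FPS at the end" distinction immediately visible.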
[deleted comment]
@5: I like that idea a lot. This would make potential new high-score replays
really easy to verify.
The problem with games that are unintentionally played at a slightly lower
frame rate still exists, though, and should be addressed. A short example with
numbers:
Let's say the new legit 40L record on the leaderboard is exactly 25.000 sec (=
1500 frames) and was played on a normal machine that can easily handle 60 fps
at all times. Now someone plays on a netbook, for example, and pretty steadily
averages the mentioned 57 fps; this means their game still runs about as smoothly
as anybody else's. A game is played on that computer using only 1490 frames,
i.e. 24.833 sec in game-engine time! At a rate of 57 fps this game actually
took 26.140 sec, though, which surely shouldn't be regarded as a new 40L record.
Original comment by bob.ins...@gmail.com
on 9 Nov 2010 at 2:30
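The arithmetic in the example above can be checked in a few lines (a standalone illustration, not NullpoMino code):

```java
// Convert a frame count to seconds at the engine's assumed rate (60 fps)
// and at the actual rendering rate (57 fps) to show the timing gap.
public class FrameTimeExample {
    static double seconds(int frames, double fps) {
        return frames / fps;
    }

    public static void main(String[] args) {
        System.out.printf("engine time: %.3f s%n", seconds(1490, 60.0)); // 24.833
        System.out.printf("real time:   %.3f s%n", seconds(1490, 57.0)); // 26.140
    }
}
```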
Also, what I forgot to mention: this is of course not only an issue for online
leaderboards and their integrity (as I said, if people want to find a way to
insert faked scores, they probably always will), but also for everybody's own
personal high scores played in offline mode.
Looking at the above example, even though I'm not a fast player at all, how can I
trust my new personal best/record if I'm not playing on a machine where I
can guarantee 60 fps 100% of the time? We're easily talking about
margins > 100ms or even whole seconds here.
Original comment by bob.ins...@gmail.com
on 9 Nov 2010 at 2:38
Just wanted to check if there are any plans to address this issue somehow in
the near future; I just discussed this with paradox again.
Detecting slowdowns/speedups with the new percentage display is fine, but it
doesn't really fix the issue that the clock is off and your own timed records
can't really be trusted.
I tried a couple of different FPS limiter settings on config page 3, but still get
103-105% etc.
Fixing this is probably not something for 7.5, but I think changing the way
timing is handled (not FPS-based) should be addressed sometime.
Original comment by bob.ins...@gmail.com
on 4 Jan 2011 at 5:12
NullpoMino should skip frames instead of slowing down; this would fix most of
the issues.
The TAS detection / game rate you're talking about is really pointless, because if
somebody wants to cheat, they will most likely slow down the real-time clock with
Cheat Engine.
Original comment by w.kowa...@gmail.com
on 4 Jan 2011 at 10:47
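The frame-skipping idea could be sketched like this (names and the catch-up cap are assumptions; this is the usual fixed-timestep pattern, not NullpoMino's actual loop): run game logic at a fixed 60 Hz and skip rendering when the machine falls behind, so the in-game clock stays correct even on slow hardware.

```java
// Decide how many fixed-rate logic updates to run before the next render,
// based on how far behind schedule we are.
public class FrameSkip {
    static final long FRAME_NANOS = 1_000_000_000L / 60;
    static final int MAX_CATCHUP = 5; // cap so a long stall doesn't fast-forward

    /** Number of logic ticks owed this iteration; render once afterwards. */
    static int ticksOwed(long nowNanos, long nextTickNanos) {
        if (nowNanos < nextTickNanos) return 0;        // ahead of schedule: just render
        long behind = nowNanos - nextTickNanos;
        return (int) Math.min(behind / FRAME_NANOS + 1, MAX_CATCHUP);
    }

    public static void main(String[] args) {
        System.out.println(ticksOwed(0, 0));                 // 1: exactly on time
        System.out.println(ticksOwed(3 * FRAME_NANOS, 0));   // 4: three frames behind
        System.out.println(ticksOwed(100 * FRAME_NANOS, 0)); // 5: capped
    }
}
```

The main loop would call `game.update()` that many times, then render once; skipped renders are invisible to the game clock.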
I made two measurements, recording the sound of 40L games, to measure the
approximate real time of these games while having a 100% stable system during
the recording (in contrast to video, mostly).
I took the length of the audio recording as a lower bound, meaning the game
must have lasted at least that long: starting from the beginning of the first
preview-piece sound (after the two beeps), and ending with the beginning of the
last line-clear "crash" sound, of course using no line clear delay.
The settings (page 3) I used were:
(I) vsync: on, sleep timing: render, dynamic adjust: off, perfect mode: on,
perfect yield: off
(II) vsync: on, sleep timing: render, dynamic adjust: on, perfect mode: on,
perfect yield: off
All on Slick. Judging from the descriptions, these options should be the two
candidates for the most accurate FPS limiting, granted a machine capable of
running the game.
Results:
(I) showed a time that was 0.10s faster (~6 frames) than the real time of the
sound recording. (The rate/% display showed 103% btw, which makes no sense in
this case, because judging by the in-game time the game didn't run faster but
apparently slower.)
(II) showed a time that was 0.08s faster (~5 frames) than the real time.
System specs:
3GHz Phenom II quad-core, 4 GB RAM, ATI 4870
We just discussed this issue on IRC a bit.
Wojtek proposed the implementation of a frame-skipping mechanism to improve the
accuracy of the timer (just saw that he posted about it himself). While I agree
that this is a good idea and probably a fix for low-end machines that can be
implemented without creating too much headache for the rest of the
code relying on the game's timing engine (such as replays, etc.), I don't
think it will address the problem that currently exists for mid- to high-end
machines, which are capable of rendering many more fps and thus would never
skip a frame in the first place.
We can see from the results above that these machines, which should be expected
to give the player accurate timings for his records because they are fully
capable of running the game as well as it can be run, still cause inaccuracies in
the range of 0.10 seconds or maybe more.
We also discussed the issue of replays a bit and the problems that exist
if they aren't recorded or played at 60fps.

Since it would not change the behaviour of how replays are played at all (just
do this the same way), I propose the addition of high-precision/resolution
timers (as they are commonly used for game development) to get improved game
times, which can be accurate to at least 1ms on a normal PC.
I've used them in C++ on both Windows and Linux before, never in Java,
but I do know they exist there (maybe System.nanoTime() is good?).
I can't vouch for their precision in the range we would need, but I think they
should be more than fine, because we are just talking about an end time shown
down to 10ms increments, which is easily achievable for any real-time
measurement, even when done on a PC (which is not designed for this task
at all, actually), as long as it does not go through the OS scheduler, of course.

I think this option is worth looking into, to provide more accurate timings
and therefore records in the long run. Regarding the issue of replays, I would
play them just as is done now, meaning if a replay is recorded at lower fps and
watched on a more powerful system, it just runs faster. If timestamps were
saved with every frame, or just key frames for that matter, it would be possible
to display a more or less correct clock on the side: not in the sense that the
clock would run right on the more powerful system watching the replay, but
actually giving the timings that occurred when the replay was recorded, which in
my opinion is what matters here. There is definitely no need to try to adjust
replay playback so that it somehow matches the original recording speed;
that is most likely way too much effort for no real gain here.

I can't evaluate how many issues this change would create for other parts of
the game that rely on the timing engine, but they could be kept minimal if it
were implemented by just getting the clock/time data from a high-resolution
timer and keeping the rest as is.
Original comment by bob.ins...@gmail.com
on 4 Jan 2011 at 11:24
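The high-resolution timer idea could be sketched like this, using `System.nanoTime()` as the comment suggests (the class and display format are assumptions, not the project's actual code):

```java
// Time the game with the monotonic high-resolution clock instead of counting
// frames, and display the result down to 10 ms as proposed above.
public class GameClock {
    private long startNanos;

    public void start() {
        startNanos = System.nanoTime();
    }

    public long elapsedMillis() {
        return (System.nanoTime() - startNanos) / 1_000_000;
    }

    /** Format milliseconds as m:ss.cc, i.e. in 10 ms increments. */
    public static String format(long ms) {
        return String.format("%d:%02d.%02d",
                ms / 60_000, (ms / 1000) % 60, (ms % 1000) / 10);
    }

    public static void main(String[] args) {
        System.out.println(format(26_140)); // 0:26.14
    }
}
```

`System.nanoTime()` is monotonic and independent of the frame loop, so the displayed end time would match real time regardless of the achieved FPS.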
The correct solution is to have the game run as fast as it can, and be
framerate-independent. That is, whether you are running at 57fps, 60fps, or even
2fps, the timer will be accurate.
Unfortunately this is a major overhaul that would break all replays, and a lot
of things rely on 60Hz (e.g. TGM modes, and all DAS settings are expressed in
"frames").
Original comment by the.o...@gmail.com
on 4 Jun 2015 at 7:41
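Framerate independence boils down to advancing game state by elapsed wall-clock time instead of assuming one update equals 1/60 s. A minimal sketch (an assumed refactor, not NullpoMino's actual code):

```java
// Accumulate real elapsed time per loop iteration, at any frame rate;
// 2 fps and 60 fps then report the same in-game clock.
public class DeltaTimer {
    private long lastNanos;
    private double gameSeconds;

    public DeltaTimer(long startNanos) {
        lastNanos = startNanos;
    }

    /** Call once per loop iteration with the current System.nanoTime(). */
    public double tick(long nowNanos) {
        gameSeconds += (nowNanos - lastNanos) / 1e9;
        lastNanos = nowNanos;
        return gameSeconds;
    }

    public static void main(String[] args) {
        DeltaTimer t = new DeltaTimer(0);
        System.out.println(t.tick(500_000_000L));   // 0.5 s elapsed
        System.out.println(t.tick(1_000_000_000L)); // 1.0 s elapsed
    }
}
```

As the comment above notes, though, every 60Hz assumption (DAS in frames, TGM mode timings, replay format) would have to be converted to time units for this to work.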
Original issue reported on code.google.com by
bob.ins...@gmail.com
on 31 Oct 2010 at 8:34