Closed — TomArrow closed this issue 1 year ago
Another thing I noticed (though I think it's already like that in the base game): the demo headers written in CL_Record_f don't seem to write serverFrameTime, but ParseGameState always tries to read it. A bit awkward; it might be worth changing, though I also understand if the goal is to stay in line with the original code.
Edit: I guess this wouldn't apply to MOHAA, but what about TA/TT?
And for MOHAA:
```c
float MSG_ReadServerFrameTime_ver_6(msg_t* msg) {
    return 1.f / atof(Info_ValueForKey(cl.gameState.stringData + cl.gameState.stringOffsets[CS_SYSTEMINFO], "sv_fps"));
}
```
It tries to read sv_fps from CS_SYSTEMINFO, but it's actually in CS_SERVERINFO. I'm not sure whether this negatively affects anything, but the serverFrameTime value does seem to be used when parsing animTime fields?
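To illustrate the lookup, here is a minimal, self-contained sketch with a simplified stand-in for Q3-style Info_ValueForKey (this is not the engine's implementation, and the info-string contents are made up): with sv_fps living in the serverinfo-style string, a lookup against a string that lacks the key comes back empty, so atof() yields 0 and the computed frame time is garbage.

```c
#include <stdlib.h>
#include <string.h>

/* Simplified stand-in for Q3's Info_ValueForKey: info strings look
 * like "\key1\value1\key2\value2". Illustrative only. */
static const char *infoValueForKey(const char *info, const char *key) {
    static char value[256];
    char token[256];
    value[0] = '\0';
    while (*info == '\\') {
        const char *sep;
        size_t len;
        info++;                             /* skip leading backslash */
        sep = strchr(info, '\\');
        if (!sep) return value;             /* malformed: key with no value */
        len = (size_t)(sep - info);
        memcpy(token, info, len); token[len] = '\0';
        info = sep + 1;
        sep = strchr(info, '\\');
        len = sep ? (size_t)(sep - info) : strlen(info);
        if (strcmp(token, key) == 0) {
            memcpy(value, info, len); value[len] = '\0';
            return value;
        }
        info += len;
    }
    return value;
}

/* Frame time derived the same way as the quoted reader. */
static float frameTimeFromInfo(const char *info) {
    return 1.f / (float)atof(infoValueForKey(info, "sv_fps"));
}
```

Looking the key up in the wrong configstring silently returns "", which is why the miscomputed frame time would only surface indirectly, e.g. in animTime parsing.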
It should be good now; I was able to record and play a demo in mohaa and mohaab. Note that only demo files that match the current protocol can be opened.
Changes look good, yeah. It did play them fine before, I think (only tested mohaa), but I imagine it would lead to small details being wrong, due to delta'ing from something different and serverFrameTime not being set correctly.
The server frame time is not used in mohaa 1.11. Since mohaas 2.0 it is used so animations can be played without having to read the full 4-byte float time. If the server didn't write a new animation time (because it hasn't changed much since the previous delta), the client computes the animation time by adding the server frame time to the previous value.
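That accumulation step could be sketched like this (the struct and function names are illustrative assumptions based on the description above, not engine code):

```c
/* Sketch: when a delta snapshot omits the full 4-byte animTime for an
 * entity, the client advances the last known value by the server frame
 * time (1 / sv_fps). Names here are made up for illustration. */
typedef struct {
    float animTime;    /* last known animation time, in seconds */
} entityAnimState_t;

static float serverFrameTimeFromFps(float sv_fps) {
    return 1.0f / sv_fps;
}

/* Called for an entity whose animTime was not transmitted in the delta. */
static void advanceAnimTime(entityAnimState_t *ent, float serverFrameTime) {
    ent->animTime += serverFrameTime;
}
```

This is why a wrong serverFrameTime (e.g. from the CS_SYSTEMINFO mixup) would show up as drifting animation timing rather than an outright parse error.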
In CL_Record_f, when writing the baselines, they are delta'd against a nullstate which is initialized as
```c
Com_Memset (&nullstate, 0, sizeof(nullstate));
```
But it should probably be
```c
MSG_GetNullEntityState( &nullstate );
```
instead, same as in SV_SendClientGameState.
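A toy illustration of why the two initializations can diverge (the field name and default value below are assumed for the sake of the example, not taken from the engine): if the null entity state carries any non-zero defaults, baselines delta'd against an all-zero struct at record time but against the real null state at playback time would decode those fields incorrectly.

```c
#include <string.h>

/* Hypothetical entity state with one field that has a non-zero default. */
typedef struct {
    int   number;
    float scale;   /* assumed non-zero default in the null state */
} demoEntityState_t;

/* Stand-in for MSG_GetNullEntityState: zero everything, then apply the
 * non-zero defaults that delta encoding is measured against. */
static void demoGetNullEntityState(demoEntityState_t *s) {
    memset(s, 0, sizeof(*s));
    s->scale = 1.0f;   /* assumed default */
}
```

Any field where the memset'd struct and the proper null state disagree becomes a silent baseline mismatch between recorder and player.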