Panakotta00 / FicsIt-Networks

Control, Monitor, Manage and Automate your Satisfactory.
https://ficsit.app/mod/FicsItNetworks
GNU General Public License v3.0

computer.time seems to be off #200

Closed: abesto closed this issue 1 year ago

abesto commented 2 years ago

https://docs.ficsit.app/ficsit-networks/latest/lua/api/Computer.html says this about computer.time:

Returns the number of game seconds passed since the save got created. A game day consists of 24 game hours, a game hour consists of 60 game minutes, a game minute consists of 60 game seconds.

https://satisfactory.fandom.com/wiki/World says:

One day on Massage-2(A-B)b lasts for 50 real-world minutes

Based on these, and stretching my remaining memories of how equations work to their limit, I'd expect one real-world second to be roughly 28.8 in-game seconds:

50 real-world minutes = 1 in-game day
50 * 60 real-world seconds = 24 * 60 * 60 in-game seconds
50 real-world seconds = 24 * 60 in-game seconds
1 real-world second = (24 * 60) / 50 in-game seconds = 28.8 in-game seconds

So running the below code:

-- Compare computer.time() (in-game seconds since the save was created)
-- against computer.millis() (real-world milliseconds since computer start).
local start_save = computer.time()
local start_cpu = computer.millis() / 1000
while true do
  event.pull(1.0)                        -- wait roughly one real-world second
  local save = computer.time()
  local cpu = computer.millis() / 1000
  local delta_save = save - start_save   -- in-game seconds elapsed
  local delta_cpu = cpu - start_cpu      -- real-world seconds elapsed
  local ratio = delta_save / delta_cpu   -- in-game seconds per real-world second
  print(delta_save, delta_cpu, ratio)
end

I'd expect the numbers in the last column to be around 28.8 +/- 0.1 or so. Instead, they're around 11 +/- 0.1:

11.0 1.009 10.901883052527
22.0 2.02 10.891089108911
32.5 3.021 10.75802714333
43.5 4.022 10.815514669319
54.5 5.023 10.850089587896
65.5 6.035 10.853355426678
76.5 7.036 10.872654917567
87.5 8.036 10.88850174216
98.5 9.038 10.898428855942
109.5 10.048 10.897691082803

Am I missing something? Am I just doing a dumb-dumb? Or is something off with the timing functions? FWIW, computer.millis() / 1000 does seem to tick over once every real-world second, so if something is wrong, it must be in computer.time.

abesto commented 2 years ago

Ooooh my god. You know how days in the game are longer than nights? The way that seems to be implemented is NOT by setting sunset to be late and sunrise to be early. It's implemented by speeding up clocks during the night. The same code, run at night, gives output like this:

91.5 1.011 90.504451038576
182.5 2.023 90.212555610479
274.5 3.047 90.088611749262
365.5 4.058 90.068999507146

I'm... honestly unsure at this point how to deal with timekeeping :D Specifically: it seems (close to) impossible to figure out how many real-world seconds passed between two in-game timestamps. That makes any timing-aware application (like "calculate storage change rate over the last 10 minutes") really hard to implement correctly across a computer (or even game) restart. I can store the timestamp on disk along with a dump of my database, but I have no way of knowing at startup how many real-world seconds ago that timestamp was. (Unless, that is, I build in further knowledge about the way time passes, but that knowledge is not documented anywhere AFAICT: the hour at game start, and the time of sunset/sunrise.)
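For reference, a minimal sketch of the "store the timestamp with the dump" workaround described above, assuming the FIN filesystem API (filesystem.open with read/write modes, file:write/read/close) behaves roughly as in the docs; the path and the serialize/deserialize helpers are made-up placeholders, and, as noted, the stored computer.time() value still can't be converted back into real-world elapsed seconds:

-- Hypothetical persistence sketch: save a database dump together with the
-- in-game timestamp it was taken at. Assumes a drive is already mounted at "/".
-- serialize() is a placeholder, not part of the FIN API.
local function saveSnapshot(db)
  local f = filesystem.open("/db.snapshot", "w")
  f:write(computer.time() .. "\n" .. serialize(db))  -- in-game seconds at save time
  f:close()
end

local function loadSnapshot()
  local f = filesystem.open("/db.snapshot", "r")
  local content = f:read(8192)  -- read up to 8 KiB; adjust for real dumps
  f:close()
  local savedAt = tonumber(content:match("^(%S+)"))
  -- computer.time() - savedAt gives *in-game* seconds since the snapshot,
  -- which (per this thread) cannot reliably be converted to real-world seconds.
  return savedAt, content
end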

RozeDoyanawa commented 2 years ago

computer.time is designed to return the in-game time. And to not break immersion, there will be no built-in IRL time. You can use millis plus an internet card to make your own implementation, though.
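For anyone landing here, a rough sketch of that suggestion, assuming the internet card's request()/await() API works as described in the FIN docs; the class name, the time endpoint (worldtimeapi.org), the return order of await(), and the response parsing are all assumptions, not verified:

-- Sketch: fetch a real-world epoch once over HTTP, then extrapolate with
-- computer.millis(). Class name, URL, and response format are assumptions.
local card = computer.getPCIDevices(findClass("FINInternetCard"))[1]

-- Assume the endpoint returns JSON containing a "unixtime" field in seconds.
local req = card:request("http://worldtimeapi.org/api/timezone/Etc/UTC", "GET", "")
local code, body = req:await()  -- assumed return order: (status, body)
local epochAtBoot = tonumber(body:match("\"unixtime\":(%d+)"))
local millisAtBoot = computer.millis()

-- Real-world seconds since the Unix epoch, valid for the lifetime of this computer.
local function realTime()
  return epochAtBoot + (computer.millis() - millisAtBoot) / 1000
end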

abesto commented 2 years ago

And to not break immersion, there will be no built-in IRL time.

I buy that. I don't need to know what time it is in the real world.

I do need an in-game clock that ticks over once every real-world second. Why? Because all in-game productivity displays use real-world seconds, not in-game seconds, and I want the displays I build to talk about productivity metrics in the same units. All "items per minute" figures use real-world minutes, and there is currently no way to get a real-world-seconds counter that is monotonic over the life of a save. You can get real-world seconds that are monotonic over the runtime of a computer (i.e. computer.millis()), but it is currently impossible to tell how many real-world seconds passed between a computer stopping and starting.
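For what it's worth, within a single computer run this is workable with computer.millis() alone: something like the sketch below (readItemCount is a hypothetical stand-in for however you read the monitored container) reports rates in real-world minutes, matching the in-game productivity displays, but the baseline is still lost whenever the computer restarts:

-- Sketch: items per real-world minute between two samples, using
-- computer.millis() as the clock. readItemCount() is a hypothetical
-- placeholder for reading the monitored container's inventory.
local lastCount = readItemCount()
local lastMillis = computer.millis()

while true do
  event.pull(60.0)  -- sample roughly once a real-world minute
  local count = readItemCount()
  local now = computer.millis()
  local elapsedMinutes = (now - lastMillis) / 60000
  local ratePerMinute = (count - lastCount) / elapsedMinutes
  print("items/min (real-world):", ratePerMinute)
  lastCount, lastMillis = count, now
end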