Sponk opened this issue 8 years ago
Hm, is system time really the best way here? Aren't floating point numbers usually 32-bit on the GPU? As time passes, the system time would lose precision, because you're probably converting from double precision to single precision. Perhaps sending the delta time would be better?
The advantage, though, is that with system time you can get the animation frame from a simple math formula, without having to store any state about which frame was the last one. But yeah, I'm not entirely sure about it.
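Roughly something like this is what I mean, just as a sketch; all names and constants here are made up for illustration:

```glsl
#version 330 core

// Sketch of the "frame from a formula" idea: the current frame is derived
// purely from the time uniform, so nothing about the previous frame has to
// be stored anywhere.
uniform uint u_timeMillis;       // milliseconds since launch, uploaded by the app (assumed)
uniform sampler2D u_spriteSheet; // animation frames laid out side by side (assumed)

in vec2 v_texCoord;
out vec4 fragColor;

const uint FRAME_DURATION_MS = 100u; // 10 animation frames per second
const uint FRAME_COUNT = 8u;

void main()
{
    // The whole "animation state" is this one expression.
    uint frame = (u_timeMillis / FRAME_DURATION_MS) % FRAME_COUNT;

    // Shift the texture coordinate into the current frame's slice of the sheet.
    vec2 uv = vec2((v_texCoord.x + float(frame)) / float(FRAME_COUNT), v_texCoord.y);
    fragColor = texture(u_spriteSheet, uv);
}
```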
The problem with just passing the delta is that the shader can't store any data, which means it is impossible to use it to progress animations. The time should also be passed in as a 32-bit integer without involving any floating point operation. The value would then be guaranteed to be (millis % 2^32), which should at least be a big enough time frame for any animation.
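On the shader side it could look roughly like this, assuming the application uploads the elapsed milliseconds with something like `glUniform1ui`; the uniform name and constants are made up for the example:

```glsl
#version 330 core

// The application uploads the elapsed milliseconds truncated to 32 bits,
// so the shader sees exactly millis % 2^32 with no floating point rounding
// on the way in.
uniform uint u_timeMillis;

out vec4 fragColor;

void main()
{
    // For a looping effect only (time mod period) matters. Doing the modulo
    // on the integer and only then converting the small remainder to float
    // keeps full precision no matter how long the application has been running.
    const uint PERIOD_MS = 1000u;
    float phase = float(u_timeMillis % PERIOD_MS) / float(PERIOD_MS);

    // One-second pulse between black and white.
    float pulse = 0.5 + 0.5 * sin(phase * 6.2831853);
    fragColor = vec4(vec3(pulse), 1.0);
}
```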
Yeah, so perhaps instead of using the actual "system time" it should be a normalized value that flows at the same rate as system time but also loops around when it gets close to overflowing.
What do you mean by "normalized"? It should get the number of milliseconds that have passed since the application was launched, in my opinion.
Yeah, it's actually no problem. I was thinking that since it's milliseconds it might eventually overflow, but I did the math and it would be able to run for a few years before that happens.
By normalized I meant something that gracefully loops around and stuff, but I guess you don't need that :new_moon_with_face:
This allows animations to happen in GLSL.
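As one more sketch of what that can look like on the vertex side, here is a minimal time-driven wave animation; all names (`u_timeMillis`, `u_mvp`, `a_position`) are assumptions for the example, not actual engine code:

```glsl
#version 330 core

// A simple wave driven entirely by the time uniform, with no per-frame
// animation state on the CPU.
uniform uint u_timeMillis;
uniform mat4 u_mvp;

layout(location = 0) in vec3 a_position;

void main()
{
    const uint PERIOD_MS = 2000u;
    float phase = float(u_timeMillis % PERIOD_MS) / float(PERIOD_MS) * 6.2831853;

    // Bob each vertex up and down, offset by its x coordinate for a wave look.
    vec3 p = a_position;
    p.y += 0.1 * sin(phase + p.x * 4.0);

    gl_Position = u_mvp * vec4(p, 1.0);
}
```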