I watched a recent YouTube video about `deltaTime`, and at around 3:50 the author talks about how it is calculated. He goes on to claim that it is NOT

> The time elapsed between the frame currently being processed and the one preceding it

but rather is

> The time elapsed between the last frame and the frame preceding it

And this confused me. The video's explanation of the reasoning didn't help either; it just raised more questions. So now I'm here. :)
Let's set up a timeline. Since I'm too lazy to draw, I'll use numbered bullet points:
1. Frame 1 starts.
2. `Update()` is called. It checks the inputs and updates the game state.
3. `Render()` is called. It renders the game state into a frame.
4. `DisplayFrame()` is called. It swaps out the previous frame for the new one. Depending on vertical sync/double-buffering/triple-buffering settings, this might also add some delay until the new frame is actually displayed. Or not.
5. Frame 2 starts.
6. `Update()` is called. It checks the inputs and updates the game state.
7. `Render()` is called. It renders the game state into a frame.
8. `DisplayFrame()` ...
9. Frame 3 starts.
10. `Update()` ...
11. `Render()` ...
12. `DisplayFrame()` ...
13. Rinse and repeat.
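The timeline above can be sketched as a loop. This is a minimal, hypothetical sketch (the names `run_frames`, `clock`, etc. are mine, not from any particular engine), showing the common convention of timestamping each frame at its start:

```python
import time

def run_frames(num_frames, update, render, display_frame, clock=time.perf_counter):
    """Minimal game loop sketch: deltaTime is the gap between successive frame starts."""
    last_frame_start = clock()
    for _ in range(num_frames):
        frame_start = clock()                        # points 1, 5, 9, ... on the timeline
        delta_time = frame_start - last_frame_start  # how long the previous frame took
        last_frame_start = frame_start
        update(delta_time)   # state is advanced by the PREVIOUS frame's duration
        render()
        display_frame()
```

Passing `clock` in as a parameter is just to make the sketch testable; a real loop would call the platform timer directly.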
Now (ignoring the very first frame ever) I'd expect that the `deltaTime` passed into `Update()` at step 10 would be the difference between timestamps taken at points 5 and 9. However, that would fit the description of "time elapsed between the frame currently being processed and the one preceding it". The "time elapsed between the last frame and the frame preceding it" would be the difference between points 1 and 5, which to me doesn't make sense.
Alternatively, we could say that the "time of the current frame" is actually when it's displayed, so right after `DisplayFrame()`. In that sense, yes, the difference between points 5 and 9 could be considered "the time between the last frame and the previous one".
However, that interpretation comes with another scary implication: it means that `Update()` should actually try to predict the future. I always thought that `Update()` tries to calculate what the game state looks like NOW. When a frame starts, `Update()` has the previous game state and knows that `deltaTime` has passed since then. So now it has to calculate how the game state has evolved, but the end result is the game state as it should look NOW, not `deltaTime` in the future.
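To make the "calculate NOW" interpretation concrete, here is a hypothetical `Update()` for a single moving object; given the previous state and the time elapsed since that state was computed, it extrapolates forward to the present:

```python
def update(state, delta_time):
    """Advance a minimal game state by delta_time seconds.

    The result is the state as it should look NOW, assuming the velocity
    held constant since the previous update.
    """
    state["x"] += state["vx"] * delta_time
    return state
```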
Calculating the future, however, means that you need to do all kinds of compensation when the frame rate varies, which gets pretty hairy fast. But the video does not speak of this, so I don't think that's what it meant either.
Of course, my interpretation (calculating the NOW, not the future) also has drawbacks - it means that what the user sees on the screen is always in the past.
So how is it really? Between what points is `deltaTime` actually calculated, and how should `Update()` interpret that information?
Note: this question isn't engine-specific, but I realize that the answers might differ in various game engines (or maybe not). If there is a difference, please specify which engine does what.
> Between what points is deltaTime being actually calculated and how should Update() interpret that information?

No matter the exact moment it's calculated, it'll always be the time between the last frame and the frame before that.

> I'd expect that the deltaTime passed into Update() at step 10 would be the difference between timestamps taken at points 5 and 9.

It might be.

> However that would fit the description of "Time elapsed between the frame currently being processed and the one preceding it".

It actually wouldn't. The frame currently being processed is complete at step 12. You're not at step 12 yet, and you don't know when step 12 will happen. You only know that the previous frame completed X milliseconds after the frame before that.
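In code, the usual pattern makes this explicit (a sketch; `FrameTimer` is a hypothetical name, not any specific engine's API): the value handed to `Update()` at the start of a frame is simply how long ago the previous frame started, i.e. the last frame's duration, reused as the best available estimate of the current one.

```python
import time

class FrameTimer:
    """Measures deltaTime at the start of each frame."""

    def __init__(self, clock=time.perf_counter):
        self._clock = clock
        self._last = clock()

    def tick(self):
        """Call once per frame, at its start; returns the previous frame's duration."""
        now = self._clock()
        dt = now - self._last  # time since the previous tick: the LAST frame's length
        self._last = now
        return dt
```

You can't know the current frame's duration until it's over, so every engine that hands a single `deltaTime` to `Update()` is handing it the past frame's length.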