I’m working on a videogame where part of the logic involves increasing a value each render frame (a render frame is equivalent to a simulation step where the result is drawn on the screen).
Frames can have varying time steps between them, commonly named “delta time”.
I am wondering if there is a general approach to determining whether the expression that updates the value is “pinned” to real time, meaning that running one frame with delta time x is equivalent to running n frames with delta times adding up to x.
I have thought about this extensively for a few days now, but I haven’t reached the clean generalization that I think should be possible.
Say every frame we update a value a with the following expression:
a = a + (b - a) * d
This is a commonly used formula for linear interpolation, where b is a constant that can be interpreted as a target value, and d is another constant between 0 and 1 that represents the fraction of the remaining distance to cover each frame.
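For concreteness, here is roughly what that per-frame update looks like in code (a minimal Python sketch; the function and variable names are mine, not from any particular engine):

def step(a, b, d):
    # Move a by a fraction d of the remaining distance toward b, once per frame.
    return a + (b - a) * d

a = step(0.0, 1.0, 0.5)   # 0.5
a = step(a,   1.0, 0.5)   # 0.75 -- two frames cover more ground than one,
                          # which is why the result depends on frame rate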
Now, as I have calculated, to do this independently of frame rate one has to replace the factor d with an expression that involves delta_time:
a = a + (b - a) * (1 - d^delta_time)
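In code, the corrected update might look like this (again a sketch with my own names; here d has been reinterpreted as the fraction of the distance that still remains after one unit of time, which is the change to the constant I describe below):

def step_dt(a, b, d, delta_time):
    # Scale the remaining distance (b - a) by d ** delta_time, so the result
    # depends only on total elapsed time, not on how it was split into frames.
    return a + (b - a) * (1.0 - d ** delta_time)

one_frame  = step_dt(0.0, 1.0, 0.5, 0.4)
two_frames = step_dt(step_dt(0.0, 1.0, 0.5, 0.25), 1.0, 0.5, 0.15)
# one_frame and two_frames both come out to ~0.2421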
I did this by expressing this process as a recursive sequence:
S(0) = a
S(n) = S(n-1) + (b - S(n-1)) * d
Then I made it non-recursive (I don’t think there’s a standard approach for this; I just expanded the expression and looked for patterns such as series with known formulas for the sum of the first n elements). For this particular update the non-recursive form works out to S(n) = b + (a - b) * (1 - d)^n.
Once I reached a non-recursive form, I replaced n with t / delta_time, as the number of frames elapsed is equivalent to the time elapsed divided by the time between frames. Then I introduced delta_time into the expression in order to cancel it out, and translated that change into a change in the d constant.
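A quick numerical check of that result (values chosen arbitrarily): splitting a fixed total time into many frames should land on the same value as a single frame covering the whole span.

def step_dt(a, b, d, delta_time):
    return a + (b - a) * (1.0 - d ** delta_time)

total_time, n_frames = 2.0, 1000
a = 0.0
for _ in range(n_frames):
    a = step_dt(a, 1.0, 0.5, total_time / n_frames)

# a is ~0.75, and so is step_dt(0.0, 1.0, 0.5, total_time)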
I am not happy with this approach, however.
I am now trying to formulate the problem in some other way, seeing if it gives me some clues as to a clean generalization.
What I’m looking for is a simple rule, a way to express that, given an expression S(delta_time) which calculates how much to increase the value by each frame, running one iteration with delta_time = x should be equivalent to running several iterations whose delta times add up to x.
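One way to state that rule as a checkable property (just a sketch; step stands in for the per-frame update, with b and d fixed as before):

def is_pinned_to_time(step, a0, dt1, dt2, eps=1e-9):
    # Two frames with deltas dt1 and dt2 must give the same value
    # as one frame with delta dt1 + dt2.
    return abs(step(step(a0, dt1), dt2) - step(a0, dt1 + dt2)) < eps

step = lambda a, dt: a + (1.0 - a) * (1.0 - 0.5 ** dt)   # b = 1, d = 0.5
is_pinned_to_time(step, 0.0, 0.25, 0.15)                  # True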
I would also be interested in a solution that provides assurance under varying time steps between frames: not just agnostic to the size of a fixed delta_time over n frames, but agnostic to delta_time changing from one frame to the next. I suspect this makes no difference mathematically, but I’m not sure.
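The varying-time-step version of the same check might look like this (again just a numerical sketch with arbitrary deltas): any sequence of frames whose delta times sum to t should land on the same value as one frame of length t.

import random

def run_frames(step, a0, deltas):
    # Apply the per-frame update once per delta, in order.
    a = a0
    for dt in deltas:
        a = step(a, dt)
    return a

step   = lambda a, dt: a + (1.0 - a) * (1.0 - 0.5 ** dt)   # b = 1, d = 0.5
deltas = [random.uniform(0.001, 0.05) for _ in range(200)]  # irregular frame times
# run_frames(step, 0.0, deltas) agrees with step(0.0, sum(deltas))
# up to floating-point error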
Could someone help enlighten me? I feel I’m at the edge of my mathematical ability.
Thanks!