I'm not really sure if this is the right place to post a question like this, although the question itself is really more of a maths thing and I couldn't find anywhere else on the web to ask, so here goes. I'm currently making my way through a book about game programming using the SDL library. It's going great, or it was until I came across this little bit of code:
The code moves g_sourceRect horizontally (g_sourceRect.x) along the sprite sheet to each individual frame of the animation. So if each sprite had a width of 90, you'd expect it to do something like this: g_sourceRect.x += 90; Surely? Nope. Doesn't work. (That wasn't my question, by the way.)
SDL_GetTicks() is a function that returns the number of milliseconds since the library was initialised. Now, after this is where I get confused. Apparently you then divide it by the amount of time you want between each frame, in milliseconds (so 100 in this example). Umm... what? How? And then the modulo operator is there to keep the result in range of the number of frames in the animation.
So basically after all that my question is... How on earth does this math work? I get what it's doing but not how.
Apparently you then divide it by the amount of time you want between each frame in milliseconds (so 100 in this example). Umm... what? How?
That's kind of how division works. You can plug in the units and use algebra to prove it:
SDL_GetTicks gives you milliseconds (m)
You want frame (f)
100 is our milliseconds per frame (m/f)
SDL_GetTicks / 100 = frame
m / (m/f) = f
m * (f/m) = f
mf/m = f
f = f
Or... you can plug some numbers in to see it visually:
SDL_GetTicks starts at 0. 0/100 is 0, so at tick=0 we will display frame 0
All the way through tick=99: this is integer division, so 99/100 truncates to 0, and at tick=99 we will still display frame 0
Once you hit tick=100... 100/100 is 1, so at tick=100 we will display frame 1