Game Timings

Hello cplusplus community,

This is my first post! Sorry if I did anything wrong!

I am making a game using WinApi and DirectX. I have made pretty decent progress and I like where the project is going; however, I have stumbled upon something. I was wondering how to accurately time the events in a game. Not the FPS, but the actual calculations and such. I wonder about this because some computers run faster than others; wouldn't the game calculate faster on a faster computer? If I wanted a universal experience, wouldn't I want all of the users' computers to do the calculations at the same speed?

I have already tried time.h, but it isn't accurate/fast enough. Any thoughts? Or am I just being a noob?

Thanks,
MaxterTheTurtle
closed account (3TXyhbRD)
Using a timer can help. What went wrong with time.h?
Try using QueryPerformanceFrequency() and QueryPerformanceCounter():

http://msdn.microsoft.com/en-us/library/windows/desktop/ms644905%28v=vs.85%29.aspx
http://msdn.microsoft.com/en-us/library/windows/desktop/ms644904%28v=vs.85%29.aspx
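
A minimal sketch of how the two calls fit together (error handling omitted, and the variable names are just placeholders):

#include <windows.h>
#include <iostream>

int main()
{
	LARGE_INTEGER frequency, start, finish;

	QueryPerformanceFrequency(&frequency); // counts per second, fixed while the system runs
	QueryPerformanceCounter(&start);       // counter value "now"

	// ... do the work you want to time ...

	QueryPerformanceCounter(&finish);

	double seconds = double(finish.QuadPart - start.QuadPart) / double(frequency.QuadPart);
	std::cout << seconds << " seconds elapsed" << std::endl;

	return 0;
}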

EDIT: You can find some DirectX timer implementations that use these functions in the code samples available on the following site (they are the code samples from Frank Luna's DirectX books):

http://www.d3dcoder.net/
For instance, I want it to operate at roughly 250 ticks a second, but the closest I can get is only 17 ticks a second, at least on my computer.

Here's my code:

#include <iostream>

#include <time.h>

int main ()
{
	int i_ticks = 0;

	clock_t
		c_secc = clock(),                          /* the clock value when the test started */
		c_secn = clock() + CLOCKS_PER_SEC,         /* the clock value one second from now */
		c_milc = clock(),                          /* the current clock value */
		c_miln = clock() + (CLOCKS_PER_SEC / 250); /* the clock value at the next tick */

	do
	{
		c_milc = clock(); /* get the current clock value */

		if (c_milc >= c_miln) /* if a 250th of a second has passed */
		{
			i_ticks++; /* increase the tick counter */

			c_miln = clock() + (CLOCKS_PER_SEC / 250); /* schedule the next tick */
		}

	} while (c_milc <= c_secn); /* break once a second has passed */

	std::cout << i_ticks << std::endl; /* output the number of ticks registered */

	std::cin.get();

	return 0;
}


This is really rough and isn't part of my project; I just needed something to test with. Also, this code assumes that CLOCKS_PER_SEC is 1000.

I know I am doing something wrong then. Anyone have any suggestions?
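
Edit: Here's a quick diagnostic sketch (not part of the project, and the names are arbitrary) to see how coarse clock() actually steps on a given machine:

#include <iostream>
#include <time.h>

int main ()
{
	/* spin until clock() changes, so we start right on a boundary */
	clock_t start = clock();
	while (clock() == start) {}

	/* now measure how large one full step of clock() is */
	clock_t step_start = clock();
	while (clock() == step_start) {}
	clock_t step_end = clock();

	std::cout << "clock() advances in steps of about "
	          << (step_end - step_start) << " clock ticks ("
	          << 1000.0 * (step_end - step_start) / CLOCKS_PER_SEC << " ms)" << std::endl;

	std::cin.get();

	return 0;
}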
@Ogoyant

Thanks for the reply! I didn't see yours before I posted mine. Those functions work beautifully!

I am wondering though, am I correct in saying that QueryPerformanceFrequency() returns the number of ticks that occur over one second, and that QueryPerformanceCounter() returns the current tick?

Also, the only member I seem to be able to pull out of LARGE_INTEGER is LowPart. Is this what I am looking for?

Edit: Or is it QuadPart?
@MaxterTheTurtle

You're welcome! I'm glad they helped. Yes, QueryPerformanceFrequency() returns the number of counts per second. It only needs to be called once, when your timer is initialized, since the value it returns cannot change while the system is running. QueryPerformanceCounter() returns the current value of the performance counter, again measured in counts.

Referring to the example from Luna's DirectX 10 book, he does:
__int64 countsPerSec;
double secondsPerCount;
QueryPerformanceFrequency((LARGE_INTEGER*)&countsPerSec);
secondsPerCount = 1.0 / (double)countsPerSec;


To get the current time in counts, once per frame, he does:
__int64 currentTime;
QueryPerformanceCounter((LARGE_INTEGER*)&currentTime);


And to get the time difference in seconds between the current time and the time recorded in the previous frame, he does:
double deltaTime;
__int64 currentTime;
__int64 prevTime;

//... (currentTime and prevTime get initialized using QueryPerformanceCounter() )

deltaTime = (currentTime - prevTime)*secondsPerCount;


Using these in a similar way should give you smooth timer behavior.
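
Putting the pieces together, here is a rough sketch (not Luna's actual timer; the accumulator approach and the 250-updates-per-second target are just taken from your earlier post for illustration):

#include <windows.h>
#include <iostream>

int main()
{
	/* called once at startup: the frequency cannot change while the system is running */
	__int64 countsPerSec;
	QueryPerformanceFrequency((LARGE_INTEGER*)&countsPerSec);
	double secondsPerCount = 1.0 / (double)countsPerSec;

	__int64 prevTime;
	QueryPerformanceCounter((LARGE_INTEGER*)&prevTime);

	const double tickInterval = 1.0 / 250.0; /* aiming for 250 game updates per second */
	double accumulator = 0.0;
	double elapsed = 0.0;
	int ticks = 0;

	/* run for one second and count how many fixed updates fire */
	while (elapsed < 1.0)
	{
		__int64 currentTime;
		QueryPerformanceCounter((LARGE_INTEGER*)&currentTime);

		double deltaTime = (currentTime - prevTime) * secondsPerCount;
		prevTime = currentTime;

		elapsed     += deltaTime;
		accumulator += deltaTime;

		/* fire one fixed-size update for every 1/250 s that has accumulated */
		while (accumulator >= tickInterval)
		{
			ticks++; /* your game calculation step would go here */
			accumulator -= tickInterval;
		}
	}

	std::cout << ticks << " ticks in one second" << std::endl;

	return 0;
}

Because the performance counter has a much finer resolution than clock(), the tick count should land right around the 250 you were aiming for.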

__int64 is a 64-bit integer: http://msdn.microsoft.com/en-us/library/29dh1w7z%28v=vs.80%29.aspx
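
As for your LowPart/QuadPart question: LARGE_INTEGER is a union, and QuadPart is the full 64-bit value, which is what you want here; LowPart only holds the lower 32 bits. So instead of the cast used above, you could also write something like:

LARGE_INTEGER counter;
QueryPerformanceCounter(&counter);
__int64 currentTime = counter.QuadPart; /* the full 64-bit count; LowPart is only the low 32 bits */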
@Ogoyant

Thanks again! You have been a godsend! The timer is very reliable, and I would never have found this without your help.