milliseconds system time

I am a bit lost as I am new to C++ and programming in general. Can someone please help me with the following code?

#include <fstream>
#include <iostream>
#include <windows.h>

using namespace std;

int main()
{
    ofstream myfile;
    SYSTEMTIME st;

    // Open the output file once, before the loop, so that the first
    // samples are not lost while the file is still closed.
    myfile.open("data.csv", ios::out | ios::app);
    if (!myfile.is_open())
    {
        cerr << "Could not open data.csv" << endl;
        return 1;
    }

    for (int i = 0; i < 500; i++)
    {
        // GetSystemTime fills st with the current system time; its
        // precision is limited by the system timer resolution.
        GetSystemTime(&st);
        cout << st.wMilliseconds << ", " << i << endl;
        myfile << st.wMilliseconds << "," << i << endl;
    }

    return 0;
}

ms  i
12  47
12  48
12  49
12  50
12  51
28  52   <-- 16 ms jump
28  53
28  54
28  55
28  56

In the CSV file, as shown above, after every 12 to 13 increments of i the millisecond value jumps by 15 to 16 ms. I do not understand why the milliseconds do not increase uniformly. Is there another way to get a uniform increment in milliseconds? Thank you.
I found the article below:

TimerResolution is an application to change the resolution of the default windows timer. The standard timer on Windows XP can vary between 10 and 25 milliseconds. Therefore if your code uses a timer or sleep value less than the timer resolution on your system you won’t be getting the results you expect.
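As a rough sketch of one way around this (assuming a Windows build; the file name data_qpc.csv is just a placeholder, not something from this thread): QueryPerformanceCounter reads a high-resolution counter that is not tied to the system timer interrupt, so timestamps derived from it increase smoothly instead of stepping every 15 to 16 ms like wMilliseconds does.

#include <fstream>
#include <iostream>
#include <windows.h>

int main()
{
    // QueryPerformanceFrequency reports the counter ticks per second;
    // it only needs to be read once.
    LARGE_INTEGER freq;
    QueryPerformanceFrequency(&freq);

    std::ofstream myfile("data_qpc.csv", std::ios::out | std::ios::app);

    LARGE_INTEGER start;
    QueryPerformanceCounter(&start);

    for (int i = 0; i < 500; i++)
    {
        LARGE_INTEGER now;
        QueryPerformanceCounter(&now);

        // Elapsed time in milliseconds since the loop started,
        // computed from the high-resolution counter.
        double ms = (now.QuadPart - start.QuadPart) * 1000.0 / freq.QuadPart;

        std::cout << ms << ", " << i << std::endl;
        myfile << ms << "," << i << std::endl;
    }

    return 0;
}

Note that this measures elapsed time since the loop started rather than the time of day, which is usually what you want when checking how evenly the samples are spaced.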
Thanks Cire.