time in ms since last update

I use this to check the time since the last update:

    // time since last update (lastUpdate was set by an earlier call to clock())
    float t = ((float) clock() - lastUpdate) / CLOCKS_PER_SEC;
    t *= 1000; // convert seconds to ms

    printf("time since last: %f\n", t);

    lastUpdate = clock();


But my values have strange jumps:

time since last: 21.740000
time since last: 0.538000
time since last: 0.268000
time since last: 0.674000
time since last: 8.456000
time since last: 15.274000
time since last: 0.584000
time since last: 0.548000
time since last: 21.036001
time since last: 0.556000
time since last: 0.354000
time since last: 0.748000
time since last: 8.172000
time since last: 15.240000
time since last: 0.550000
time since last: 0.352000

Is it my program, or is this just not a good way to measure time?
Also, now that I think about it: my program runs at 60 frames per second at most, but often less, around 50. 1000 ms / 50 = 20 ms per frame, so a value close to zero seems extremely low.
On the other hand, it only updates certain parts when there is input from the webcam, and when there is, it has to do a lot of calculation. So it could be normal, right?
It really depends on what sort of timing you're using to control your loop. I'm assuming you're doing this on a regular desktop machine running Linux/OSX/Windows. Those operating systems have to juggle a whole bunch of things, so occasionally your task will go into a queue and wait, especially when it accesses system resources (like the USB port or whatever your webcam is connected to). You could do a little better by using the right event-based software timing mechanism, but you still won't get hard real-time precision out of a desktop OS.

Further complicating this, most timers and clock measurements aren't that accurate because, again, the round trip from your code through the system calls to the hardware and back is not tightly controlled.
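If you want finer-grained, more stable numbers, one alternative worth trying (just a sketch, assuming a POSIX system where clock_gettime and CLOCK_MONOTONIC are available; on Windows you would need a different API) is to read a monotonic wall clock each frame instead of clock():

    #include <stdio.h>
    #include <time.h>

    /* Returns the current monotonic time in milliseconds. */
    static double now_ms(void)
    {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return ts.tv_sec * 1000.0 + ts.tv_nsec / 1000000.0;
    }

    int main(void)
    {
        double lastUpdate = now_ms();

        for (int frame = 0; frame < 10; ++frame) {
            /* ... per-frame work would go here ... */

            double now = now_ms();
            printf("time since last: %f\n", now - lastUpdate);
            lastUpdate = now;
        }
        return 0;
    }

CLOCK_MONOTONIC measures elapsed wall time and isn't affected by adjustments to the system clock, which makes it a reasonable choice for frame timing.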

So actually the answer is a bit of both.