timestamp vs. clock_t

Hi,

I am trying to measure the time it takes for a function to perform a certain job. I noticed that the results I get using timestamp and clock_t are different. I timed it with a stopwatch on my phone, and the result given by timestamp appears closer to the true time.

Can anyone explain why there is this difference? My code is pasted below. I don't mind small differences, but the output shows a gap of about 12 seconds.

    timestamp st = now();   // wall-clock start
    clock_t sc = clock();   // CPU-time start

// ---------------
            <do something>
// ---------------
    double total_time = now() - st;                                   // wall-clock seconds
    double total_time_clock = (double)(clock() - sc)/CLOCKS_PER_SEC;  // CPU seconds

    cout<<" Done! using timestamp: "<<total_time<<" sec., and using clock_t: "<<total_time_clock<<" sec.\n";


and the output is:
Done! using timestamp: 14.5845 sec., and using clock_t: 26.0007 sec.
Alas, clock() is not required to be very accurate.
What's a timestamp?
clock() returns the approximate processor time used, not the real clock time. Note that it counts CPU time for the whole process (all threads), so in a multi-threaded program it can even be larger than the elapsed wall-clock time — which would explain 26 CPU seconds in 14.5 real seconds.
Errr, I just noticed that the timestamp I'm using is not standard C++ functionality, but a struct provided by one of the libraries I imported. Here is its definition; it in turn uses sys/time.h.

#include <sys/time.h>
#include <unistd.h>

  typedef struct timeval timestamp;

  static inline double operator - (const timestamp &t1, const timestamp &t2)
  {
	return (double)(t1.tv_sec  - t2.tv_sec) +
	       1.0e-6f*(t1.tv_usec - t2.tv_usec);
  }

  static inline timestamp now()
  {
	timestamp t;
	gettimeofday(&t, 0);
	return t;
  }