How can I optimize my timer?

Hello everyone!
I made a simple timer, but when I run the program, it's laggy.

How can I make it run faster, and what are my mistakes?

#include <iostream>
#include <time.h>

using namespace std;

int main()
{
    int timer = 0;
    int timeStored;
    while (true)
        {
            timeStored = time(NULL);
            if ( timeStored != time(NULL))
            {
                timer++;
                cout << timer << endl;
            }
        }
    return 0;
}
get rid of the cout statement :)
#include <iostream>
#include <ctime>
#include <iomanip>
#include <thread>
#include <chrono>

int main()
{
    const std::time_t start_time = std::time(nullptr) ;   // moment the timer started
    const std::time_t interval_secs = 120 ;                // stop after two minutes

    std::time_t now ;
    std::time_t last_printed_time = start_time - 1 ;       // forces a print on the first pass

    // loop until interval_secs seconds have elapsed since start_time
    while( std::difftime( (now=std::time(nullptr)), start_time ) < interval_secs )
    {
        if( now > last_printed_time ) // a new second has begun
        {
            std::cout << std::put_time( std::localtime( std::addressof(now) ), "%c\n" ) ;
            last_printed_time = now ;
        }

        else std::this_thread::sleep_for( std::chrono::milliseconds(100) ) ; // yield instead of spinning
    }
}
Thank you all very much! :)
Perhaps a bit of simpler code.


#include <iostream>
#include <thread>
#include <chrono>

using namespace std;

int main()
{
	int timer = 0;

	while (true)
	{
		std::this_thread::sleep_for(std::chrono::seconds(1));
		std::cout << (timer += 1) << '\n';
	}
	return 0;
}


With this code, you keep track of seconds yourself.


However, your code isn't a timer at all - it's not keeping track of time.


The issue with your code is that you don't know when you're checking. By the time you assign time(NULL) to "timeStored" and then check whether time(NULL) has changed, a second hasn't passed! C++ is very fast; your program is relying on the gap between the assignment to "timeStored" and the if check being long enough that time(NULL) actually ticks over to the next second and becomes different from "timeStored".


To better explain, let's say it takes 50 milliseconds to assign time(NULL) to "timeStored". Then it takes another 50 milliseconds to evaluate the if statement. In that time, a full second hasn't passed! The only time it prints out a number is when that small fraction of a second happens to be exactly what was left of the current second before it rolled over, which is why it likely behaves differently every time you run it. (Those aren't the actual durations; they're just for the example. Your code is being processed very quickly, in fractions of a second.)

Therefore, you can see that your program wasn't keeping track of time at all. Even if you could ensure that you check the condition in the next second, the program wouldn't be accurate about when in that second it fires; you'd have a ridiculously large error margin.
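To make that concrete, here is a minimal sketch of the pattern being described (my own illustration, not code from the thread): keep the last second you observed across loop iterations instead of re-reading time(NULL) right after storing it.

#include <iostream>
#include <ctime>

int main()
{
    int timer = 0;
    std::time_t last = std::time(NULL);     // the last whole second we observed

    while (true)
    {
        std::time_t now = std::time(NULL);
        if (now != last)                    // a new second has begun
        {
            last = now;
            timer++;
            std::cout << timer << '\n';
        }
    }
}

This still busy-waits, but it does track seconds correctly because the stored value survives from one iteration to the next.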

EDIT: I hope that was understandable, let me know if you need it explained again.
get rid of the cout statement :)

On any modern CPU, cout isn't going to be destructively slow. I've had my screen print out thousands of lines before, printing several hundred a second. The screen won't visibly show several hundred separate lines being printed in that fashion, but that's just the limit of the screen's refresh rate.

Also, std::localtime has been deprecated in the language.
zapshe wrote:
std::localtime has been deprecated in the language.

I couldn’t find anything about it on the Internet. Could you provide a reference, please?
> std::localtime has been deprecated in the language.

It has never been deprecated by either C or C++.
Thank you, JLBorges. Since I consider you a very reliable reference, I won’t investigate any more :)
Good news, IMHO. With the new C++20 standard an entire new date and time library will be added
https://mariusbancila.ro/blog/2018/03/27/cpp20-calendars-and-time-zones/
so it seems there’s already more than I can chew even without having to deal with a new ‘locale’ scheme.
Yes, cout is very slow (it's terrible on Windows and varies from terrible to not great on Unix). It has zero to do with the CPU; the issue is the way the console writes text inside the OS, which was never optimized for high performance.
I work with multiple-GB files in my job. I often write little disposable parsers / fixers / etc. to repair or scan these files when something goes wrong in our systems. They often look like (read file, logic to find or modify, cout repaired data). They can take half an hour to an hour to run if you just hit go. They take about 20 seconds to run if you redirect the console output to a file.


LOOK at what you wrote. HUNDREDS of lines per SECOND. I am processing millions of lines per second when it does not print to the screen...

Now, in your example, it may not matter at all if your code is busy-waiting anyway on a *slowish* timer. That is fine, and in that case the cout is indeed harmless. And if you need the console spew, you need it. If it's just info/spew for the sake of spewing, then cut it and you will loop faster.

It's easy to test: write a program that prints just the first couple million integers to the screen in a for loop. Run it twice, once to the console and once redirected to a file (filename.exe > output.txt).
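Something along these lines would do; a minimal sketch of that test (the two-million count and the file name are just arbitrary choices):

#include <iostream>

int main()
{
    // Print the first couple million integers.
    // Time one run writing to the console and one redirected to a file,
    // e.g.  test.exe > output.txt  , and compare the wall-clock times.
    for (int i = 0; i < 2000000; ++i)
        std::cout << i << '\n';
}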
yes, cout is very slow (its terrible on windows and varies from terrible to not great on unix ). It has zero to do with the CPU. (the issue is the way the console writes text inside the OS which was never optimized for high performance).

The CPU will always affect performance, whether or not cout is optimized. And I agree cout is slow; I said it's not destructively slow. Obviously if you're working with several gigabytes and you try to cout millions of times, there's going to be an impact. I myself corrected a friend's code that tried to output numbers millions of times, and they wondered why the program was taking so long to finish. But there's no reason to think it would impact this timer code.

LOOK at what you wrote. HUNDREDS of lines per SECOND. I am processing millions of lines per second when it does not print to the screen...

No contradiction. Hundreds of lines per second, while the OP's timer only writes out one line a second. There's no real reason to remove it, especially when the timer wasn't working correctly for other reasons. Plus, if the cout statement is removed, how would they know whether the timer is working at all? Once the timer is implemented and they don't want to see every second tick away, they'll remove it themselves.

I couldn’t find anything about it on Internet. Could you provide any reference, please?
It has never been deprecated by either C or C++.

My bad - deprecated in Visual Studio by Microsoft. Other compilers will compile it fine, but Visual Studio flags it as unsafe and suggests "localtime_s" instead.
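For anyone who hits that warning, a minimal sketch of the Microsoft-specific replacement (this assumes MSVC's localtime_s, which takes the destination tm first and returns an error code; the standard C11 Annex K version has a different signature):

#include <iostream>
#include <iomanip>
#include <ctime>

int main()
{
    std::time_t now = std::time(nullptr);
    std::tm local {};                      // caller-provided buffer instead of the shared static
    if (localtime_s(&local, &now) == 0)    // MSVC-specific signature; returns 0 on success
        std::cout << std::put_time(&local, "%c") << '\n';
}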
Topic archived. No new replies allowed.