What are a couple of un-arguable examples of premature optimization and premature pessimization?

Dear ppl,
I have been seeing many guidelines to avoid premature optimization and premature pessimization, but I could not find a couple of un-arguable examples of either.
I'd really appreciate it if someone could share some here.
Thanks a lot for any help :)
> I have been seeing many guidelines
Where?
> where?
I have recently been reading "C++ Coding Standards" by Herb Sutter and Andrei Alexandrescu; these guidelines appear there as Items 8 and 9.
Here is a very clever premature optimisation to calculate the inverse square root of a number ( https://en.wikipedia.org/wiki/Fast_inverse_square_root ):

http://quick-bench.com/TbzMA1woGM_iWD6g-jDWNdpjJbI
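
For reference, here is a minimal sketch of the hack in question (following the Wikipedia article; the magic constant and the single Newton-Raphson step are the well-known Quake III variant, and std::memcpy replaces the original pointer cast to avoid undefined behaviour):

#include <cstdint>
#include <cstring>

// Classic "fast inverse square root", shown only for illustration.
// On modern hardware, 1.0f / std::sqrt(x) is typically faster and more accurate.
float fast_inv_sqrt(float x)
{
    float half = 0.5f * x;
    std::uint32_t i;
    std::memcpy(&i, &x, sizeof i);   // reinterpret the float's bits as an integer
    i = 0x5f3759df - (i >> 1);       // magic constant produces an initial guess
    std::memcpy(&x, &i, sizeof x);
    x = x * (1.5f - half * x * x);   // one Newton-Raphson step refines the guess
    return x;
}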

Many years ago this was a good idea.

Now it's not: just using the library function is faster. Processors are better than they were, and compilers are enormously smarter than they were. Someone writing this premature optimisation today would be creating a slower program with source code that is very difficult to understand.
Here is a very clever premature optimisation to calculate the inverse square root of a number.
I know the one. It's probably my favorite hack of all time. That I know of, anyway.

Christian Plesner Hansen does a good job of explaining it here:
https://web.archive.org/web/20181229222651/http://h14s.p5r.org/2012/09/0x5f3759df.html
The point about something being premature is that it was optimized before it was known it was going to be a problem. Therefore it's literally impossible to tell whether something has been optimized prematurely just by looking at the code, let alone in isolation. You need to look at two things:
* What was the code like before being optimized and what is it like now, in terms of complexity?
* How was the performance of that particular section and of the program as a whole affected?

A premature optimization is not just an optimization that's done while the performance characteristics of the program are unknown. It also has to significantly increase the complexity of the code. Premature optimization is bad not inherently, but because it's a misallocation of resources (programmer time, in this case).
Additionally, the change should have a negligible effect on the overall performance of the entire system, even if it did have significant effects when measured in isolation. For example, an optimization might rework a piece of code to run a hundred times faster by using SIMD and cache optimization, but it would be useless if the entire application is network-bound, or if that particular section represents only a tiny fraction of the total execution time.
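
To put a rough number on that last point, here is a back-of-the-envelope sketch (the 1% share and the 100x speedup are assumed figures, not measurements from any real program):

#include <iostream>

// Amdahl's-law style estimate: speeding up a fraction p of the runtime by a
// factor s gives an overall speedup of 1 / ((1 - p) + p / s).
int main()
{
    double p = 0.01;   // assumed: the optimised section is 1% of total runtime
    double s = 100.0;  // assumed: that section runs 100x faster after the rework
    double overall = 1.0 / ((1.0 - p) + p / s);
    std::cout << overall << '\n';   // ~1.01, i.e. roughly a 1% improvement overall
}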
To expand on helios's points:

The injunction against premature optimization is that it spends time and effort on an unproven need. (In other words, you might not actually need it, meaning time and effort spent on it is a waste.)

───────────────────────────────────────────────────────────────────────
When coding, you should first write code that is clear and understandable.
───────────────────────────────────────────────────────────────────────
After you have your program running, only then should you spend time observing where the program could be improved. Improvements fall in several categories:

  • speed
  • resource usage (memory, file I/O, etc)
  • size

There are many other considerations, but those are the top three (not counting UX issues). Once you have identified places needing improvement, prioritize them and spend effort on the worst offenders.
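
As a minimal sketch of that "observe first" step (a real profiler gives far better data, but even crude timing beats guessing; load_data and process_data here are hypothetical placeholders for your own code):

#include <chrono>
#include <iostream>

void load_data()    { /* ... */ }   // hypothetical suspect #1
void process_data() { /* ... */ }   // hypothetical suspect #2

int main()
{
    using clock = std::chrono::steady_clock;

    auto t0 = clock::now();
    load_data();
    auto t1 = clock::now();
    process_data();
    auto t2 = clock::now();

    std::cout << "load:    " << std::chrono::duration<double>(t1 - t0).count() << " s\n"
              << "process: " << std::chrono::duration<double>(t2 - t1).count() << " s\n";
}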


This is not to say you shouldn't spend some time finding a fairly optimal solution to your problem. But I would argue that this is just part of exploring the problem and learning how to solve it. It may not be the optimal solution, but it can certainly avoid being a bad solution if you spend a minimally sufficient amount of effort to understand it, instead of just throwing something together.

The trade-off is how much time (== money) you want to apply to a problem in isolation. Once the application is up and running, you can see which parts of it need help to improve the function of the whole.

Hope this helps.
Thank you everyone for the excellent insights.
Though I was looking for examples of premature optimization along the lines of the example of premature pessimization below:

X++;  // since writing ++X doesn't increase code complexity but is more efficient, this is an example of premature pessimization


Nonetheless, the points mentioned in this post gave me a clearer understanding of premature optimization.
Well... No.
This is why I said that you shouldn't just look at a piece of code in isolation. The snippet above is so isolated that, personally, I don't even know what that code means.
I mean, this:
//for (int i = 0; i < n; i++)
for (int i = 0; i < n; ++i)
is not the same as this:
//for (auto i = v.begin(); i != v.end(); i++)
for (auto i = v.begin(); i != v.end(); ++i)
or this:
//for (auto i = whatever.begin(); i != whatever.end(); i++)
for (auto i = whatever.begin(); i != whatever.end(); ++i) //Note: preincrement involves disk access for some reason 

But let's ignore for a moment the implementation details of preincrement and postincrement. What about this?
std::vector<std::string> v = {"a", "b"};
for (auto i = v.begin(); i != v.end(); i++)
    a_very_expensive_operation_that_takes_days_to_complete(*i);
Who cares if you preincrement or postincrement the iterator when the inside of the loop absolutely dwarfs everything else by several orders of magnitude?

So, no, I don't agree that postincrementing instead of preincrementing in a standalone statement is unarguably (i.e. beyond discussion and unconditionally) premature pessimization.
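
For what it's worth, the reason the pre-increment guidance exists at all is visible in how the two operators are conventionally written. A minimal sketch, using a hypothetical heavyweight iterator (not any real library type):

#include <cstddef>
#include <vector>

struct Iter {
    std::vector<char> buffer;   // expensive-to-copy state
    std::size_t pos = 0;

    Iter& operator++() {        // pre-increment: advance and return *this
        ++pos;
        return *this;
    }

    Iter operator++(int) {      // post-increment: copy, advance, return the copy
        Iter old = *this;       // copies the buffer just to hand back the old value
        ++pos;
        return old;
    }
};

For an int or a raw pointer the compiler simply discards the unused copy; for something like the above, whether it matters still depends on what the loop body costs, as the examples show.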
It's also an implementation detail. Some future computer could have ++x and x++ performing equally well; it may already be true with smart modern compilers (I don't know). If it wasn't timed on the target system(s) to see whether it was an improvement, it's premature.
Coincidentally, the pre- vs post-increment guidance is currently being discussed on the C++ Core Guidelines issue tracker: https://github.com/isocpp/CppCoreGuidelines/issues/1322
@helios,

Did you mean "e.end()" or "v.end()" in your example code (5 times total)?
Oops. Fixed.
If you always write pre-increment by default, I wouldn't consider it premature optimization. It's just as easy to write a pre-increment as a post-increment, so why not just write pre-increment and never have to think about it again?

On the other hand, if you have a large code base that has been written with post-increment and you go back and change everything to pre-increment just because you think it will be faster, then that would probably count as premature optimization.
Topic archived. No new replies allowed.