C++ in the real world: is it only me, or is real-world C++ different from the C++ in books and conferences?

I hesitate to respond to posts from new members here, because so many pose a question, then disappear after dropping some links in an edited post designed to spam the viewers.

Your post seems more genuine, in part because of the length and detail, and in part because of the nature of the question.

I've been at this since the early 80's. I took up C++ within a few years of its "invention", when Stroustrup was the only contributor. I remember when templates were first released.

There is a general resistance to change. C++03 dragged on for years as the "current" standard until C++11 was released. While the compiler developers attempted to comply, there were numerous buggy releases. I know a number of teams that refused to move on from Microsoft's Visual Studio 2008 (some from 2003) because what they already had worked, and the 2010 IDE was too much of a shock.

In that case, the tools and the old, stable compiler worked, and teams stuck with them. Attempts to update were met with pain and delay, often with abandonment (retaining the older compiler and tools).

That gets to be a habit.

There is a case to be made that this is no longer such a problem, but old scars still hurt. My own observation is that compilers from about 2012 forward are similar enough to be interchangeable on a number of codebases, but then again, the older the codebase the tougher it can be.

For example (going back in time): I don't recall exactly which version, but one of Microsoft's compilers didn't correctly handle the scope of variables declared in a for loop.

for( int n=0; n < 10; ++n ) {...}

In that code, 'n' isn't supposed to survive outside the scope of the braces. In Microsoft's compiler, for a certain period of time, it did. A lot of code was written which accidentally took advantage of that quirk, and updating to a newer compiler broke that code unexpectedly.
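To make it concrete, here's a minimal sketch (my own contrived example, not from any real codebase) of the kind of code that compiled under the old, non-conforming scope rules but fails on a conforming compiler:

#include <iostream>

int main() {
    for (int n = 0; n < 10; ++n) {
        if (n == 7)
            break;
    }

    // Under the old Visual C++ rules, 'n' leaked out of the loop, so this
    // line compiled and appeared to work:
    //
    //     std::cout << "stopped at " << n << '\n';  // error on a conforming compiler
    //
    // The portable fix is to declare the variable before the loop:
    int stop = 0;
    for (int n = 0; n < 10; ++n) {
        if (n == 7) {
            stop = n;
            break;
        }
    }
    std::cout << "stopped at " << stop << '\n';
}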

Of course, this wasn't portable code anyway. The era of focus on writing portable code is still not quite "here" yet, though it should be, and many do insist on portability to some degree (often just to other compilers on the same platform).

Since that wasn't always the case, and since supervisors tend to be older members, the old scars tend to drive decisions.

I would be curious to learn what versions of C++ coding style you've seen hanging on (in particular, if they've avoided moving from working C++11 toward C++17). I suspect the code you're describing is from even older standards.

We know the benefit is worth the risk, because the risk is small and the benefit is fewer bugs going forward; and the pain of upgrading often exposes bugs that had never been discovered in the first place.

That wisdom doesn't transfer to the "corporate" mindset, though. It is everywhere, too. The engineering of machines in some companies has it even worse than this.

This is also a copy-pasted reddit post.

Edit: Profanity removed.

Niccolo, if you wanted to actually reply to a decent human that will be grateful for your post, I'd suggest posting in the real thread. It's only 2 days old :P [Edit: Not that we're NOT grateful here at cplusplus.com]
Ah....duped again, I was.

I'm sorry it had to be this way. But I read what you wrote and really appreciated the insight.
Responding to the original question, which noted the juxtaposition of the standards committee trying their hardest not to make breaking changes, even though companies try their hardest not to update compilers anyway.

I think it's inevitable because of C++'s long history, and just how a business works.
The business knows that its program works and does its job using that particular compiler version. "If it ain't broke, don't fix it."
Of course, a product should have regression testing against patches/changes, but you won't necessarily have 100% test coverage, especially if your system works with other real systems that can only be simulated in testing.
C++ is used in so many different places, some quite low-level, and it sometimes just isn't worth the time, effort, and risk to guarantee that the newer compiler doesn't break existing code, even if the potential bug introduced was the programmer's fault and not C++'s fault.
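As a contrived sketch of what I mean (my own example, not from any real project): reading an uninitialized variable is undefined behavior, but it can appear to work for years with one compiler and optimization level, and then break when either changes.

#include <iostream>

// Undefined behavior that "works" by accident: if 'present' is false,
// 'value' is returned uninitialized. An old compiler at low optimization
// may happen to hand back a zero from the stack every time; a newer
// compiler, or a higher optimization level, is free to return garbage or
// transform the code in surprising ways.
int parse_flag(bool present) {
    int value;
    if (present)
        value = 1;
    return value;   // UB when 'present' is false
}

int main() {
    std::cout << parse_flag(false) << '\n';   // worked by luck, until it didn't
}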

Another example: when I updated from .NET 4.6.2 to .NET 4.7.2, something relating to ASP.NET broke; I forget where the fault lay. This happens in more than just C++, although I'd say C++ is particularly bad because it has more gaps where undefined or implementation-defined behavior can occur.

As far as how the standards committee makes changes... that I know less about. I know they do a thing where they initially have a vote on a feature, and the options are {strongly in favor, in favor, neutral, against, strongly against}. They take into consideration the opinions of the against and strongly against votes even if there are more in favor, and back-and-forth discussion follows to see if a resolution can be reached.

Here's a recent CppCon talk about the difficulties in changing the standard to improve two places that currently aren't zero-overhead in C++ (exceptions and RTTI): https://www.youtube.com/watch?v=ARYP83yNAWk
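To give a rough idea of the RTTI half (my own sketch, not taken from the talk): a dynamic_cast does a runtime walk of the type hierarchy and drags in type_info data, which is why flags like GCC/Clang's -fno-rtti exist and why some codebases replace it with a hand-rolled type tag.

// Sketch of a manual type tag as a substitute for dynamic_cast; the names
// here are invented for illustration.
struct Shape {
    enum class Kind { circle, square };
    explicit Shape(Kind k) : kind(k) {}
    virtual ~Shape() = default;
    Kind kind;
};

struct Circle : Shape {
    Circle() : Shape(Kind::circle) {}
    double radius = 1.0;
};

// With RTTI:    if (auto* c = dynamic_cast<Circle*>(shape)) { ... }
// Without RTTI: check the tag yourself, then static_cast.
double radius_or_zero(Shape* shape) {
    if (shape->kind == Shape::Kind::circle)
        return static_cast<Circle*>(shape)->radius;
    return 0.0;
}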
Ah....duped again, I was.

I'm with zapshe, your comment was very insightful and much appreciated.

when I updated from .NET 4.6.2 to .NET 4.7.2

I remember when MS updated DirectX from 7 to 8. Newer versions were supposed to run apps using an older version without problems. Yet several DX games I played at the time broke. Horribly. The updates to run DX 8 made the games buggier.