Importance of storage

I want to know whether storage is an issue with respect to the current hardware and software market. In certain situations we have to choose between programming approaches, for example between OOP and procedural programming. I know that object-oriented programs tend to be larger than procedural programs. But is storage even a serious issue when choosing the best method to solve a problem?

P.S. Forgive me if my question seems childish. I am new to this field, so I was just thinking about this issue.
It's a valid question. Someone with more experience can probably give a different perspective, but here's what I have to say:

Picking coding standards, styles, and figuring out your whole development workflow for your product is certainly important, and putting effort into planning your code structure will absolutely save you time and money.

But not because you need to worry about the size of the executable file. Most of the bloat in an executable comes from debugging symbols, which can be stripped from a release build anyway, and that's true whether the code is OOP or procedural. If your program uses resources like images or any kind of saved data, that data will likely vastly outweigh the executable itself. Many applications have gigabytes of resources while the exe file stays small; look at how large apps like Visual Studio, Photoshop, or any modern game are.

Many applications won't have a huge exe, but will ship dynamically-linked libraries (.dll files on Windows). Those can add up if you have a lot of them, but they still pale in comparison to any multimedia resources. For example, I just looked at the Debug folder for one of the large codebases of a product we have, and even in Debug mode, with tons of .exe, .pdb (debug), and .xml files, the directory is still only 100 MB, and the exe files themselves are on the order of kilobytes to a few megabytes. Static linking will increase the size of the exe, because it essentially combines the dll and the exe into one file, and that can happen whether or not you're using OOP.

I have never heard of a case where the size of the EXE was the bottleneck or the subject of a customer complaint. I imagine it would matter on some embedded systems, but I don't know much about that area.

But no, in my opinion and experience the bottleneck is not executable size. The bottleneck is maintenance. You're going to be reading code and debugging a hell of a lot more than you are writing code, and that's where you or your company has to decide what the most maintainable code structure is. This doesn't necessarily just mean choosing between OOP and procedural; nothing is ever completely one or the other, especially in a "multi-paradigm" language like C++. C die-hards will say OOP brings bloat and complexity, and that inheritance in particular can be horrible to maintain if done wrong; OOP die-hards will say C programs tend to look disgusting, be unmaintainable, and overuse pointers and whatnot. I'm not picking a side.

Wikipedia happens to have a whole page dedicated to criticism of C++, although I can't vouch for how unbiased it is. https://en.wikipedia.org/wiki/Criticism_of_C%2B%2B
* Many of these criticisms apply to C++ specifically and not to OOP itself, so don't judge OOP by that page alone. For example, a language like C# doesn't have such long compile times, partly because it compiles to intermediate code and defers native code generation to a JIT at run time, while the C++ language itself is notoriously complicated to parse.

https://stackoverflow.com/questions/5673770/does-the-size-of-the-exe-affect-the-execution-speed
Oh thanks a lot. This is very useful information for me.
Storage is a huge concern on some systems, like mobile devices or embedded computers.

Even a fairly low-end (but modern) PC has gigabytes of RAM, and if not a full TB of disk, at least half that now.

Only the smallest embedded systems require you to write code in odd styles for performance or size reasons. There was a thread on here not long ago where someone ran their Arduino out of space and could do no more, for example, and another Arduino programmer friend of mine wrote some wonky DIY code to avoid pulling in a library that was eating space he could not afford, just to get one or two simple functions out of it (string processing related).

You will know at design time whether you are targeting a wristwatch-sized machine; at that point, you may have to consider these issues.

Executable size does matter if you go nuts with it. A statically linked executable in debug mode can easily reach tens of megabytes, and the disk has to read it, memory has to hold it, the cache has to juggle it, and so on. That causes slowness. Most of this is fine in release builds on a modern computer; only an extremely large program will trigger these problems these days.

Unless you are doing something unusual (targeting tiny systems, writing critical real-time code, or working in the aerospace or medical industries), you should write clean code as described above. If you are doing the weird stuff, well, we did all that in the 80s and 90s, and how to do it is well documented and well understood. It's ugly to go back to those styles, but they still work when there is a need.

There is no catch-all. Sometimes smaller programs are slower than big ones; sometimes it's the other way around (note the voodoo of inlining, unrolled loops, etc.). Sometimes linking a library to get one routine costs you a ton (because that routine calls two dozen more), and sometimes not. Much of this is case by case, and again, it only really matters on oddball platforms.
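You can poke at the size-versus-speed tradeoff directly through compiler flags. A sketch assuming `g++`, where `-O2` favors speed (and may inline and unroll) while `-Os` favors size:

```shell
# A small loop the optimizer can chew on.
cat > loop.cpp <<'EOF'
#include <cstdio>
int main() {
    long s = 0;
    for (int i = 0; i < 1000; ++i) s += (long)i * i;
    std::printf("%ld\n", s);
}
EOF

g++ -O2 -s -o loop_fast  loop.cpp   # optimize for speed
g++ -Os -s -o loop_small loop.cpp   # optimize for size

ls -l loop_fast loop_small          # compare; results vary case by case
```

For a toy program like this the sizes may come out identical (the compiler can fold the whole loop into a constant); on real code the difference can go either way, which is exactly the "case by case" point.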

