Do you think memory conservation concerns will be a thing of the past in the future?

I am reading C++ books to improve my programming skills, and the section on pointers got me thinking. A long time ago, programmers had to be careful with memory because they only had so much of it. Nowadays memory is less of a concern because computers come with multiple gigabytes of RAM, but we still need to be careful in the sense that we still have a limited supply to work with. I was wondering: in 10 or 15 years, when we can buy multiple terabytes of RAM, will memory problems stop being a problem at all, or at least become a very small concern? What do you think?
When we have multiple terabytes of RAM, we'll also be loading terabytes of data. While using memory efficiently won't be a major concern for many applications (text editors, music players, maybe browsers depending on how the internet improves), it will never become irrelevant.

-Albatross
Also, when writing a kernel, wiki.osdev.org is very clear that you can't be willy-nilly with memory.
Imagine a server with 32 GB of memory. Now imagine it has to handle about 4000 users simultaneously. Think about how much unique data per user you can afford.
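Just to put rough numbers on that, dividing those figures out (treating 32 GB as GiB for simplicity, and ignoring what the OS takes):

#include <cstdio>

int main() {
    // Figures from above: 32 GiB of RAM, 4000 simultaneous users.
    const double total_bytes = 32.0 * 1024 * 1024 * 1024;
    const int users = 4000;
    // ~8.2 MiB per user, and that's before the OS and the server
    // software itself take their share.
    std::printf("%.1f MiB per user\n",
                total_bytes / users / (1024.0 * 1024.0));
    return 0;
}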

Controllers with only several megabytes of RAM, or embedded systems with only a tiny amount of memory, are also widespread.
I spent the last few years programming servers with anywhere between 0.5 and 1.5 TB of RAM (they can go up to 16 TB btw). Guess what, I had to be very careful with memory allocation, and I had to recover from std::bad_alloc, too. Those servers do a lot of things at the same time, and even small negligence can multiply very fast.
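For anyone who hasn't had to do it, recovering from std::bad_alloc looks roughly like this (a minimal sketch, not the actual server code):

#include <cstddef>
#include <iostream>
#include <new>
#include <vector>

int main() {
    try {
        // Deliberately ask for far more memory than most machines have.
        std::vector<char> buffer(static_cast<std::size_t>(1) << 40); // 1 TiB
        std::cout << "allocated " << buffer.size() << " bytes\n";
    } catch (const std::bad_alloc& e) {
        // The allocation failed; log it and degrade gracefully instead of
        // crashing. A real server would shed load, retry later, and so on.
        std::cerr << "allocation failed: " << e.what() << '\n';
    }
    return 0;
}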
Wow, I've barely even used a quarter of my 2 TB hard drive, how the hell do you use 1.5 TB of RAM?
Well he did say a server :P It probably was for a pretty large company that had a ton of customers and data.
chrisname: Minecraft
I can't get Minecraft to use more than 200MB. Other people swear to me it stays above 1GB for them.
idk... maybe with recent updates it's more conservative, but a while ago my friend was playing and it was eating up all of his RAM. And it was a gaming machine, too.
It depends very much on how the JVM decides to run the garbage collector. I've seen some cases where it waits until it uses up all the available memory before running it, and other cases where it runs it every 30 seconds or so.
My dad said: "When I was in school, we thought that we'd never need more than a kilobyte of ram for anything!"

Now most programs take megabytes at a time, and games take 1-2 gigabytes, or 4 or more if they're heavy. I think that as we acquire more resources, we'll find new ways to use them...
I was wondering: in 10 or 15 years, when we can buy multiple terabytes of RAM, will memory problems stop being a problem at all, or at least become a very small concern?

It depends on the nature of the software / problem you are trying to solve. In some cases it is already not a concern at all, and in some cases it will probably always be one. I mean, there are problems we would very much like to solve which have exponential space complexity. You can pretty much say that there are problems we will, in all likelihood, never be able to solve because of memory limits (as well as time).
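To make the exponential-space point concrete, here is some illustrative arithmetic (my example, not any specific problem): storing just one byte per subset of an n-element set costs 2^n bytes.

#include <cstdio>

int main() {
    // One byte per subset of an n-element set costs 2^n bytes.
    for (int n = 30; n <= 80; n += 10) {
        double bytes = 1.0;
        for (int i = 0; i < n; ++i) bytes *= 2.0;
        std::printf("n = %2d -> %.3g bytes\n", n, bytes);
    }
    // By n = 80 that is ~1.2e24 bytes, about a trillion terabytes.
    return 0;
}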

However, I think that at this point, and even more so in the future, it's the larger-scale memory concerns you need to worry about, not the small-scale ones. In general, you don't need to worry about using bit fields, or chars or shorts instead of ints, wherever you can; you just might need to worry about using this or that data structure or algorithm. Then again, it depends.
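To put a rough number on the small-scale versus large-scale point (illustrative sizes; the exact numbers vary by implementation and platform):

#include <cstdio>

int main() {
    // What a doubly-linked list node looks like under the hood:
    struct Node { Node* prev; Node* next; int value; };
    // On a typical 64-bit build: 4 bytes of payload vs. ~24 bytes per node
    // (two pointers plus padding), before any allocator overhead. So the
    // container choice dwarfs any savings from shrinking int to char.
    std::printf("payload:   %zu bytes\n", sizeof(int));  // 4
    std::printf("list node: %zu bytes\n", sizeof(Node)); // typically 24
    return 0;
}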
Usually the software that requires large amounts of memory requires it because it has to operate on a large dataset. A game, for instance, may have a single texture that requires 4 MiB, and it has to have dozens of these textures in memory at once (the ones on screen would be in video memory, but you can't load textures directly into video RAM, and you would probably want to cache them in system RAM anyway). Like htwirn said, there's not much point worrying about the size of the variables you're using in most projects, because most of your memory will be used by things you just can't make any smaller. You can't really keep compressed textures in system memory, because you'd have to decompress them before copying them to video memory.
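For what it's worth, the 4 MiB figure checks out for an uncompressed 1024x1024 RGBA texture (my example resolution, picked to match the number):

#include <cstdio>

int main() {
    // A 1024x1024 texture with 8-bit R, G, B and A channels:
    const int width = 1024, height = 1024, bytes_per_pixel = 4;
    const int total = width * height * bytes_per_pixel;   // 4,194,304 bytes
    std::printf("%d bytes = %d MiB\n", total, total / (1024 * 1024)); // 4 MiB
    return 0;
}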

It's usually best to just use whole words, because the CPU generally reads word-sized blocks and then discards what it doesn't need. I only use smaller data types and bitfields when I want the compiler to enforce range constraints (e.g. a variable that only needs to go up to 255 will be a uint8_t, and then I won't have to do my own range checks) or if a structure needs to be an exact size.
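A minimal sketch of that last use case, with a made-up packet header pinned to an exact size:

#include <cstdint>

// Hypothetical wire-format header that has to be exactly 4 bytes;
// the type and field names are invented for illustration.
struct PacketHeader {
    std::uint8_t  version; // can only ever hold 0..255
    std::uint8_t  flags;
    std::uint16_t length;
};

// The compiler guards the size for us from here on.
static_assert(sizeof(PacketHeader) == 4, "PacketHeader must be 4 bytes");

int main() { return 0; }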
There are always going to be special cases where large amounts of memory are beneficial. However, we're already at the point where you can be sloppy and inefficient with memory and run into few problems. It occasionally bites us, so perhaps the threshold is a little bit further out? I think somewhere around 16 GB is the maximum for conventional use. Everything past that is a special case.

Of course, I'm just guessing. There's not much to go on, given how zig-zag history has been on this matter.
I imagine that as we get more, we'll still find ways to somehow use it all up, and if we don't, then great: we can have all our programs load at OS start and just sit idle until we want 'em XD
Although I feel RAM concerns may become less of a problem, I also fear that people will become careless when dealing with memory, and either end up designing programs that waste memory pointlessly, or just end up making silly mistakes all the time and not cleaning up after themselves, creating big leaks that probably won't be noticed for a while, until everything crumbles at once.

Just to add my 2 cents on the Minecraft thing: it would steal 2 GB from my laptop and still crash saying it had run out of memory, because my laptop only has 3 GB to offer... I think it uses less on my PC, even though I can now turn all the settings up. I wonder if they try trading RAM for processing power on smaller machines.

I do know that the source code isn't exactly big on conserving RAM. There was one bit where the right-click event on an entity (which was polymorphic) would, for wolves, run a test to see if you had meat in your hand. So it was a special case.
I think it was 1.6 when this changed, and the source code now has a Boolean field on every item called isWolfFavoriteFood. I'm not sure if the JVM is able to pack the data together, but if it does, I still make that out to be 32 B of RAM gone, assuming 1 bit per bool packed into shared bytes. If it works kind of like C/C++, where field alignment also comes into play, I see it as one byte for the bool, stored in a 4 B chunk, times 256 for the number of items, totalling 1 KB of RAM gone.
This used to cost no RAM; it was just a special case.
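Roughly how that 1 KB estimate falls out, sketched in C++ terms (a hypothetical Item struct; Java's real object layout differs, but the alignment idea is the same):

#include <cstdio>

struct Item {
    int  id;                 // stand-in for the item's other fields
    bool isWolfFavoriteFood; // padded out to the struct's 4-byte alignment
};

int main() {
    std::printf("sizeof(Item) = %zu\n", sizeof(Item));   // typically 8
    std::printf("flag cost for 256 item types: %zu bytes\n",
                (sizeof(Item) - sizeof(int)) * 256);     // ~1 KiB
    return 0;
}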
It depends on what you're doing.

In a critical application, like a trading system supporting colocated services, you have to be extremely careful with all resources.

But then you see server software written in JavaScript using Node.js; clearly they don't care too much about system resources on a single computer, since they tend to scale horizontally and don't depend on any single node.