Memory Exhaustion?

An exercise in Stroustrup's book is to write a program involving memory allocation with an infinite loop and let it run until it crashes. He says the crash will be due to memory exhaustion. He also suggests researching the topic.

I did, and I am frightened by what I found. The vast majority of the pages I read were related to hacker attacks. Apparently, memory exhaustion/stack overflow can be used as a form of denial-of-service attack. Many of these said that Chrome is vulnerable to this attack (from 2008/2009). Only a handful dealt with non-attack memory exhaustion and discussed a workaround, such as PHP's ini_set('memory_limit', -1).

What is Stroustrup trying to get at with this exercise? What will happen if I run a program with an infinite loop? I am very reluctant to run such a program because I am fearful of doing something I can't undo. How does a programmer estimate the amount of memory that a program might need in order to avoid this situation? For example, the number of int variables times 4 bytes per int?
Have no fear! I suggest you run the program, and if something goes wrong and your PC is burnt, I'll send you mine :)
What will happen if I run a program with an infinite loop?

Most probably, your operating system will terminate the program, and the memory you've allocated will be freed. Most modern operating systems do that.

But if you compile and run your code at kernel level, it could be quite dangerous for your system.
Obviously a system can no longer function properly when there is no more available memory.
Virtually no programs handle out-of-memory events gracefully, so they'll simply crash if they need more memory and cannot get it. If there is a swap file/partition, swapping will occur before memory is actually fully exhausted, which makes the system essentially unusable in the meantime. In the case of a server, it will be unable to serve (m)any requests.
If someone finds an easy way to make a server allocate a lot of memory, that can indeed be an issue.

Not so much for a browser... it's kind of expected. An easy and somewhat popular DoS attack has been to post several huge, blank PNG images on a forum, quickly making the browser of everyone viewing the topic run out of memory.
Similarly, playing some Flash videos used to (and maybe still does) cause gradually but quickly increasing memory usage, probably because uncompressed frames aren't freed after rendering. As long as the browser reacts gracefully (which can include terminating the program in a controlled manner), this is more or less harmless. However, if a program continues after a failed allocation as if nothing happened, that can open the gates to more serious security issues.

Anyway, you can't physically damage your computer by having the memory run out. Just make sure you don't have any unsaved work open, just in case. I don't think Stroustrup's intention was to breed new security experts either with that exercise.
So I tried to exhaust the memory on my computer. I use a laptop running XP with 1 GB of RAM. I used Ctrl+Alt+Del / Performance to see how much available memory there was. Before running the program there was 467K of available memory. I eventually ran the following program.
#include "std_lib_facilities.h"  // Stroustrup's PPP header: provides cout, endl, keep_window_open()

int main()
{
	for (int i = 0; i < 150000; ++i) {
		double* pi = new double[2000];  // deliberately never deleted: leaks ~16 KB per iteration
		*pi = i * 2.5;
		cout << i << "  " << *pi << endl;
	}

	keep_window_open();
	return 0;
}

It stopped at iteration 133,033. My computer was almost non-responsive. After a few minutes, VC++ gave the following error message:
"Unhandled exception. std::bad_alloc at memory location (hex address)."
Is that what is supposed to happen? Is this what usually happens?
Thanks for your help.
It's what's supposed to happen, yes. new throws a std::bad_alloc exception when it fails.