Simple Memory Question

Hi Folks,
I was using the program below to help me understand some memory concepts. As far as I know, the array created in the function below should be destroyed when execution leaves the function. Does "destroyed" just mean that the array has gone out of scope, or does it also refer to the release of the memory used by the array? The reason I ask is that when I check the program's memory usage in Task Manager, it uses about 380 KB before entering the function (at point "cin>>a"), but after it exits the function (at point "cin>>b") Task Manager indicates it is using 1164 KB. Why is the memory not released when the array goes out of scope?
I have tried rewriting the array declaration inside the function using dynamic memory allocation with the new[] and delete[] operators, and this works perfectly, i.e. the memory usage reported by the Windows Task Manager drops back down to 380 KB at point b. Why is this not the case when the array goes out of scope with memory assigned from the stack?

#include <iostream>

void myFunction();
using namespace std;

int main() 
{
	int a; 
	cin>>a; //Using approx 380k at this point

	myFunction();

	int b; 
	cin>>b; //Using approx 1164k at this point????
}

void myFunction()
{
    double myArray[100000];
}
Memory allocation is more complex than you might think. Programs generally have more memory allocated than they're actually using. Even if memory is freed, it might not be reclaimed by the OS right away, depending on whether any other memory is still allocated in that region and <insert any number of other factors here>.

Furthermore, Task Manager is OK for getting a rough idea of how many resources your program is using, but I would not rely on it as the final word.


And lastly... your 'myArray' would not consume memory as Task Manager sees it anyway because it is being placed on the stack and not on the heap. Stack space isn't allocated as new vars are created... rather your program allocates a bunch of memory for the stack at startup, then just uses that space as needed.


Does "destroyed" just mean that the array has gone out of scope or does it also refer to the release of memory used by the array?


When programming, you should consider it to be both (out of scope, and the memory is no longer allocated), though the latter might not happen immediately.
Hi Disch,
Thanks for your reply.

Task manager seems to give a decent estimate to me:

Before program execution enters the function to declare the array, the memory used is 384kb. When it exits the function the memory used is 1164kb.

1164kb - 384kb = 780kb which is nearly equal to the memory reserved for the array [100,000 x 8 bytes for a double = 800kb].

I have left the task manager open for a few minutes when the program had exited the function at point "cin>>b" above, and the memory never drops back down to 384kb when the array goes out of scope.

If, as you say, a block of memory is reserved for the stack at startup, how can I see how much memory was reserved and how much is actually used/returned to the OS at runtime? If Task Manager is not suitable for this, is there another program that can show me the details? Like I mentioned in the original post, I can see the memory usage dropping when the array goes out of scope when I dynamically allocate the memory. How can I see the actual memory usage when I don't use dynamic memory?

The reason I believe you should use dynamic memory allocation is for cases when you do not know how much memory you intend to assign until runtime. The size of the array above is known at compile time, so I do not need to use dynamic memory allocation; however, from what I can "see", the program uses a higher average amount of memory over its lifetime when I assign memory from the stack, since it does not seem to release the memory no matter how long I wait after the array goes out of scope.
Before program execution enters the function to declare the array, the memory used is 384kb. When it exits the function the memory used is 1164kb.


Maybe I am incorrect in assuming that Task Manager does not include stack space in its memory usage statistics. It's possible it does.

It's also possible that your stack is growing dynamically because you're putting an awful lot of data on it. 800K is pretty big (not counting what's already on the stack from program initialization).

From a look at the default settings in VS2012:
Default Stack Commit Size = 4 KB
Default Stack Reserve Size = 1 MB

This suggests to me that 1 MB of address space is reserved for the stack up front, and memory is committed in 4 KB steps as the stack actually grows into that reservation.

http://msdn.microsoft.com/en-us/library/8cxs58a6.aspx

At any rate... once stack space is reserved, it's reserved. I.e., it's allocated. Stack space is not heap space, and it doesn't get allocated/released the same way heap space does.
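For reference, those reserve/commit sizes can be overridden at link time with the documented MSVC /STACK option (a sketch; values are in bytes, and main.cpp is a placeholder file name):

```shell
# Reserve 2 MB of stack, commit 8 KB up front (MSVC linker)
cl /c main.cpp
link /STACK:2097152,8192 main.obj

# Or equivalently in one step:
cl main.cpp /link /STACK:2097152,8192
```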

If task manager is not suitable for this is there another program which can show me the details?


I'm not familiar with one, as I've never had to do this kind of fine analysis.

The reason I believe you should use dynamic memory allocation is for cases when you do not know how much memory you intend to assign until runtime.


That's one reason.
Another reason is to put memory on the heap so it doesn't suck up the limited stack space you have.

however, from what I can "see", the program uses a higher average amount of memory over its lifetime when I assign memory from the stack, since it does not seem to release the memory no matter how long I wait after the array goes out of scope.


This is Task Manager deceiving you. Or rather, it's illustrating memory usage in an overly simplified format.

If your program has 1 MB stack space reserved... then you can use 5 bytes of that space or you can use the full 1 MB and your program will not have to allocate any more memory.

Stack memory doesn't get released when a variable goes out of scope; the space simply becomes available for reuse by later calls.




EDIT:

Simplified stack memory overview.

Let's say your program has a 1000 byte stack space (which is unrealistically small). So when the program starts, the OS sets aside 1000 bytes of contiguous memory for it to use as its stack.

You can think of this 1000 bytes as being 'new'd or dynamically allocated if it helps.

 
[1000 bytes] <unused>


Now let's look at a simple main:

int main()
{
    int a;
    int b;

    someOtherFunc();
}


main has 2 vars on the stack. Each 4 bytes (we'll assume 1 int = 4 bytes). So the stack space looks like this:

[4 bytes] 'a' in main
[4 bytes] 'b' in main
[992 bytes] <unused>


When someOtherFunc is called:

void someOtherFunc()
{
    int array[4];
}


So upon entry to this function... more stack space gets used:

[4 bytes] 'a' in main
[4 bytes] 'b' in main
[16 bytes] 'array' in someOtherFunc
[976 bytes] <unused>


When someOtherFunc exits... 'array' goes out of scope and dies... but the stack is still 1000 bytes:

[4 bytes] 'a' in main
[4 bytes] 'b' in main
[992 bytes] <unused>




The stack typically does not grow/shrink but rather stays the same size throughout the life of the program (special circumstances may change this, but this is basically how it works).

So program memory usage generally does not fluctuate with how much stack space is used... because the stack space is all allocated up front.
To put it simply, the stack is where local variables are placed when you do not use the new keyword. The heap is used when you do use that keyword to allocate memory.

From your program's point of view, when you use the new and delete operators, memory is allocated or deallocated. What actually happens underneath is out of your control and is handled by the OS. You don't have to worry about it, though; the OS is very good at handling everything.