I'm writing a program with Visual Studio Express 2008 using Windows Forms, so that is
C++/CLI I think - I'm not sure, and because I want to use OpenGL I am using a NativeWindow object or something like that (I'm not sure about all the terminology because it's kind of complicated for me, but I think mixing managed with unmanaged code has something to do with my problem - I hope you'll understand).
Now, I have a bit of code where I dynamically allocate memory for an array with the keyword new. Everything runs fine until I try to allocate around 512*512*373 chars, at which point it throws an exception. I know I have enough free memory. I have read about similar problems others have had, and all the solutions suggested to them were things like "the bounds are wrong" or "the step is wrong", etc. I assure you everything is correct; besides, it works with smaller numbers. I have also found something that I did not understand which has to do with the CoTaskMemAlloc() function.
Ok, as I said before, I don't need that kind of answer: "The index is wrong". If the index is wrong, explain why the allocation fails. What does the index have to do with the allocation? Besides, it works when I try to allocate less memory.
CoTaskMemAlloc returns NULL on failure. Where is the check for that?
First, while you have enough memory, what is another reason for new to throw an exception?
new throws an exception when the memory requested is unable to be allocated. The memory you are requesting must be contiguous, so if there is not enough contiguous (unfragmented) memory available to the operating system/program... you're out of luck.
@cire : Ok, now we're getting somewhere. (I updated my code above to show that I do check whether the returned value is NULL.)
I'm trying to allocate 512*512*373*3 floats. When this happens (the "not enough contiguous memory"), how can someone handle it? Is it a dead end?
I read somewhere - and this is where CoTaskMemAlloc() comes into it - that you use CoTaskMemAlloc() when a process tries to allocate memory that crosses process boundaries. I don't really understand that, but I gather that every process has a limit on how much memory it can use (if that's true, how do you increase that limit?). Note that I noticed that when I terminate the program, Visual Studio reports that two programs have terminated, "MyProgramName.exe : Managed" and "MyProgramName.exe : Native":
The program ' MyProgramName.exe: Managed' has exited with code 0 (0x0).
The program ' MyProgramName.exe: Native' has exited with code 0 (0x0).
and I'm sure this has something to do with the mixing of managed and unmanaged code.
@kbw : I was going to ignore you, but I won't because that's rude. "I don't need that kind of answer" means I have all the answers of that kind, not all the answers of every kind.
On my system that's 1173356544 bytes, which is over a gigabyte of memory.
When this happens (the "not enough contiguous memory"), how can someone handle it? Is it a dead end?
You could kill tasks to make it less likely memory is fragmented between multiple programs prior to running your program. You could try to work with less memory. You might pre-allocate the array at a global scope and avoid the heap.
If it is something to do with mixing of managed and unmanaged code, I have no insight to give.
You're probably right: 1.1 GB of contiguous memory with 1.6 GB free must be hard to find. But I've also tried it with 832*832*494 unsigned chars, which is around 350 MB. This time I had to allocate two arrays of that size; the allocation with new fails on the first array, and when I try to allocate them with CoTaskMemAlloc() it allocates the first but fails on the second. What, it can't even find 700 MB? But I guess this is it. And in my situation I really need contiguous memory: in case you're wondering about the array sizes, I'm working with volume data (those are the datasets that fail), and OpenGL only takes arrays allocated as one contiguous one-dimensional block. Anyway, if anyone has anything more to add, please go ahead.