static const unsigned int d = 5;
static const unsigned long int M = 10000;
In main, I created an array of objects of type sample.
const int N = 10;
The compilation ($ g++ main.cpp free_bndry.cpp) succeeds, but when I execute ./a.exe it gives me Segmentation fault (core dumped).
When I set const int N = 1; instead of 10, the Segmentation fault (core dumped) goes away.
My machine has 8 GB of memory, but Windows is using 70% of it.
What is the issue?
How much memory do I need to make M 1 million, and N 40?
Your bm_sample array will take almost 2 MB on the stack. The default stack size with MSVC and MinGW gcc is 1 MB: stack overflow here.
1) Increase the stack size (not recommended)
2) Create the objects on the heap, as bradw proposed (recommended)
3) Learn why the heck Windows devours 6 GB of memory on its own.
Thank you, bradw and MiiNiPaa.
The program I am writing needs much larger arrays, say 5*1,000,000*40 elements. The challenge is that I cannot delete them to free memory right away: to start, the program has to keep all the variables, but later on I can delete them little by little. Time also matters, so I cannot write them to a file and read them back in according to my memory capacity.
I should be able to reclaim the memory Windows is occupying and get about 6 GB free. Is there any chance I can use at least 4 GB of it if I follow bradw's suggestion?