Array storage?

Can anyone tell me how to store 0 to 10^50 values in a 1D array? When I go above 7*10^7, my compiler says the size of the array is too large. Any help is highly appreciated.

Regards
vinoth
Am I to understand you want to store

100,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000

integers in an array?

If so, I believe you will need all the RAM in all the PCs in the entire universe, including any parallel universes that might exist.
At least 10^20 values on a supercomputer? Is it possible?
Assuming every computer has 4GB of RAM, 1 billion 32-bit integers will consume 4GB of memory. You will need 10^20/10^9 = 10^11, or 100 billion such machines.
OK, then just tell me at least the maximum number of values that can be stored in a 1D array, assuming the computer has 4 GB of RAM?
Arrays are no good; you're going to run into stack overflows (at least that's what happened to me).

You can use pointers and memory blocks by doing this:

char * memoryBlock;
memoryBlock = new char[size];

where:

"memoryBlock" is your "array" name (not really an array, but you can treat it like one, it works exactly the same as an array)

"char" is the data type you want to use (you can substitute it with int, bool, ect)

and "size" is the size of the memory block you want (its just like declaring an array)


With 4 GB of RAM, you can theoretically hold 4 billion "char" values (range 0 - 255 unsigned) or 1 billion "int" values (range 0 to a bit over 4 billion, again unsigned; only 1 billion because each "int" is 4 bytes).

However, in practice it will be less. It depends on how much RAM is being used by your system and other programs, as well as any overhead from your own program. Also, this method of using memory blocks requires an uninterrupted chunk of memory: contiguous memory. You can look at RAM as a giant 1D array being shared among all your programs. If there is a program sitting in the middle of your RAM (the giant 1D array), then the maximum size of your memory block is effectively halved, since you only have half of the RAM that's contiguous.

If you want to store larger numbers and are only concerned about the first 16 or so digits of precision (meaning you will lose anything after the first 16 digits), then you can use the "double" data type. Its range is roughly -10^308 to 10^308.

If you want to go really, really high precision and keep every single digit yet still be able to deal with huge numbers, you will need to look into specialized large-number classes... unfortunately, I'm just a beginner myself and I don't know enough to help you any further.
Even a variable of type int cannot store such a huge number. But if you are really interested in finding the available RAM, try this. A foolish try: where the program ends in an error or crashes, note the MB value. The program may need some modifications, as I am a beginner and new to C++.

#include <iostream>
#include <iomanip>   // was <iomanip.h>, which is pre-standard
#include <cstdlib>   // for system()
using namespace std;

void create_array(unsigned int);

int main()
{
    unsigned int size;
    for (size = 10000; size > 0; size++)  // keeps growing until the stack gives out
    {
        create_array(size);               // create an array of the given size
    }
    system("PAUSE");
    return 0;
}

void create_array(unsigned int size)
{
    double ram = static_cast<double>(size * sizeof(int)) / 1000000;
    int array[size];  // variable-length array: a compiler extension, not standard C++
    array[0] = 0;     // touch the array so the compiler can't optimize it away
    cout << setprecision(3) << setw(6) << ram << '\r';  // in MB
}

Thank you faken and khan for your suggestions. Still, I am not convinced. faken, can you explain your idea with the help of a small program? Sorry to disturb you.

Regards
vinoth
Sorry, I don't understand what you want me to explain.

First of all, do you wish to store a very large number or do you want to store a lot of numbers?
I think faken explained it very well. If long int is too small for you, you should try Google and find some special classes that will allow you to do that, as faken said.
AR Khan your method will eventually cause the OS to crash the program for trying to allocate too much memory.

kmvinoth: if your machine is a 64-bit machine, that means it can address up to 16 exabytes (giga -> tera -> peta -> exa). That is roughly 2 quintillion ints, or 2 times 10^18, which is about 1/(10^32) of the amount of memory you would need to store every single integer from 0 to 10^50. Since there are no hard drives (let alone RAM) that store an XB, I would suggest that you come up with a different method of storing all that data.

What do you need all of this for?
The pointer returned by new is not a memory block. It's a pointer to an array. An array is only truly considered a memory block when pointed to by a void *.

AR Khan's "solution" doesn't make any sense to me. What is it supposed to do? Stress-test the stack?

I've read it three times and I'm still not sure what OP is asking. I think he's trying to allocate a ludicrously-sized array on the stack and the compiler is understandably complaining.
Let's go over memory allocation, shall we?

The stack
The stack is a memory region of fixed size that a program uses to store local data: the current function's locals, those of the function that called it, and so on up to the program's entry point. It should be assumed to be fairly limited, and nothing too huge should be stored on it.
Examples of data stored on the stack:
int a;
int b[10];
std::vector<int> c;
char d[]="string";
int e[ONE_MILLION]; //Likely to produce a stack overflow (see below). 

When too much data has been pushed onto the stack, usually because a poorly written recursive function ended up in infinite recursion or because a very large array was pushed, an error known as a stack overflow occurs. It's a very serious kind of memory corruption, and potentially dangerous under the wrong conditions. The best thing that can happen after a stack overflow is that the system detects the program is trying to write to memory it shouldn't be and crashes it. All modern OSs (AFAIK) do this. The worst thing that can happen is that the system allows the program to arbitrarily overwrite its own data or code, or another program's (including the OS's) data or code.

The heap
The heap is, in a few words, the rest of available memory. It's many times larger than the stack and is used for most of the allocations in a program.
Examples of heap allocations:
int *a=new int;
int *b=new int[10];
std::vector<int> *c=new std::vector<int>;
//Example d cannot be rewritten as a heap allocation in a single line.
int *e=new int[ONE_MILLION]; //Unlikely (but possible) to fail to allocate.

//All heap-allocated data has to be deallocated sooner or later:

//Notice that delete and delete[] are used in different cases. Using one when
//the other one should be used is dangerous.
delete a;
delete[] b;
delete c;
delete[] e;

The only restrictions on heap allocations are the size of the heap, and whether it's possible to allocate n adjacent bytes.
For example: suppose we have a heap of size 10 where all odd bytes are used. This leaves 5 even bytes free. In such a scenario, trying to allocate 5 bytes will fail, even though there are indeed 5 bytes free. However, allocating 1 byte five times won't fail.

One final note that is sometimes relevant: stack allocation is generally faster than heap allocation, because the latter may need to make system calls. The stack is preallocated, and all an allocation takes is moving a pointer back and forth.

Use the stack to store small things and local data. Use the heap to store big things, things of size unknown at compile time, and things that need to remain after the function returns.



Oh, an exabyte is EB or EiB, not XB.
Hi faken, I want to store a lot of numbers, say from zero to 10^50. Hope I made my question clear.
The question is clear, and the answer is clear: you can't. You need many orders of magnitude more storage than all the computers on the planet combined.

EDIT: not to mention the CPU processing time required to process each one is probably longer than the current age of the universe.
I have seen EB and XB used for exabyte, but perhaps XB is wrong.

kmvinoth, my question wasn't what are you trying, it was: what is this for?

If you are trying to store all the numbers in sequential order in an array, this is a pointless task, since to access a number you would have to use the number itself as the index.
It doesn't matter. The fact that OP wants to store this many numbers means s/he wants to perform calculations on them.

Assuming that 10^9 (1 billion) numbers can be processed per second (i.e., 1 GHz, one number per cycle), with approximately 3 * 10^7 seconds in a year, it will take roughly 10^(50-16) = 10^34 years to process all of them, give or take a couple hundred million years.

If the age of the universe is 15 billion years, 1.5*10^10, the computation will complete in only 10^24 complete lifetimes of the universe.

That is some bad ass maths lol!
Am I the only one who feels it prudent to ask what the OP needs all this memory for? Obviously his approach is wrong. Maybe if he told us what he was trying to do we could recommend better ways to approach the problem.

Considering all the complex programs out there that don't use this much RAM, there's no way the OP's problem demands it.
Topic archived. No new replies allowed.