Array memory

How much memory does an integer array of size n use? Also, what exactly is memory complexity?
An array of integers occupies the following amount of memory:

(size_of_one_int) * (number_of_ints_in_array)

For example, on a system in which an int was 8 bytes, an array of 10 int values would occupy

( 8 ) * ( 10 ) = 80

80 bytes of memory.
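If you want to verify this on your own system, sizeof reports both numbers directly. (A minimal sketch; sizeof(int) is implementation-defined, and on many common systems it is 4 rather than 8.)

#include <iostream>

int main()
{
    int values[10];   // an array of 10 ints

    // Whatever sizeof(int) is on this platform, the whole
    // array occupies sizeof(int) * 10 bytes.
    std::cout << "one int:   " << sizeof(int)    << " bytes\n";
    std::cout << "the array: " << sizeof(values) << " bytes\n";
}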

Memory complexity sounds like a term for how much memory your program uses. For example, if you wanted to give that in bytes, in the above example your memory complexity is N*8: with 10 integers you use ~80 bytes, with 10,000 you use 80,000, and so on. I don't know if this term has a formal notation like big-O; it probably does.

The above is important due to the time-space tradeoff. For example, to bucket sort 64-bit integers, you need 2^64 × sizeof(some integer type) of memory. Even if any one value could repeat only 0-255 times, so a single byte per counter would do, that is still 2^64 bytes (about 16 exbibytes). Most machines don't have that much, so the O(N) sort of 64-bit numbers is too heavy on the time-space tradeoff: you don't have the space available to buy that time, and need to use a slower, but less memory-hogging, algorithm. But if you were sorting billions of 1-byte values, the counters are only an array of 256 × 8 bytes, which will fit on any computer. So you can do that one... see how this is useful info?
Are you confusing bucket sort with counting sort?
You may have whatever number of buckets you want in bucket sort.
Maybe? Whatever this one is called (sorting lots of byte values):

unsigned int sorted[256] = {0};   // one counter per possible byte value
for (unsigned char b : data)      // for all the data
    ++sorted[b];

If that is counting sort, sure. ... I wonder what bucket sort was; I get all the names mangled ... /googling...

... looks like counting sort is bucket sort when the bucket size is 1? A specialized case?

Anyway, I think this is a form of bucket sort, where comparison is eliminated because you already know which bucket each value goes into, and each bucket is only 1 value wide.
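For reference, here is a minimal counting sort along those lines (my sketch, not code from the thread): tally first, then rebuild the sorted sequence from the tallies.

#include <cstddef>
#include <vector>

// Counting sort for byte-sized values: a bucket sort where every
// bucket is exactly one value wide, so no comparisons are needed.
std::vector<unsigned char> counting_sort(const std::vector<unsigned char>& data)
{
    std::size_t counts[256] = {0};            // one counter per possible value

    for (unsigned char b : data)              // pass 1: tally each value
        ++counts[b];

    std::vector<unsigned char> result;
    result.reserve(data.size());
    for (int v = 0; v < 256; ++v)             // pass 2: emit v, counts[v] times
        result.insert(result.end(), counts[v], static_cast<unsigned char>(v));
    return result;
}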
COMPLEXITY

An algorithm’s complexity expresses how resource use increases as N grows larger. There are two components to complexity:

  • time (runtime or performance complexity)
  • space (memory used)

Continue reading here to learn more about algorithmic complexity
http://www.cplusplus.com/forum/general/226968/#msg1034344
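As a hypothetical illustration (an example of mine, not from the linked page), the same question, "is x in the data?", can be answered cheaply in time or cheaply in space:

#include <algorithm>
#include <unordered_set>
#include <vector>

// O(1) extra space, O(N) time per query: just scan the data.
bool contains_scan(const std::vector<int>& v, int x)
{
    return std::find(v.begin(), v.end(), x) != v.end();
}

// O(N) extra space up front, O(1) average time per query:
// spend memory on a hash set to make each lookup cheap.
bool contains_hashed(const std::unordered_set<int>& s, int x)
{
    return s.count(x) != 0;
}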


BUCKET VS COUNTING SORT

A counting sort could be considered a subclass of bucket sort, but it is sufficiently distinct that we consider it separately. Both algorithms have their own FAQ page:

http://www.cplusplus.com/faq/sequences/sequencing/sort-algorithms/bucket-sort/
http://www.cplusplus.com/faq/sequences/sequencing/sort-algorithms/counting-sort/

Just watching the GIFs on those pages is a useful exercise.


Hope this helps.
> Also, what exactly is memory complexity?
For example, most card games would be O(N) for the number of cards you have.

Chess would be O(N²) for an N-sized board.

A Rubik's cube would be O(N³) for an N-sized cube.

Sometimes, it gets worse...
http://man7.org/linux/man-pages/man1/grep.1.html
Known Bugs wrote:

Large repetition counts in the {n,m} construct may cause grep to use lots of memory. In addition, certain other obscure regular expressions require exponential time and space, and may cause grep to run out of memory.

Back-references are very slow, and may require exponential time.

salem c wrote:
A Rubik's cube would be O(N³) for an N-sized cube.

If N is the side length, the surface area is 6 * N², so I believe it's O(N²) :P
Correct me if I'm wrong.
Depends on what you are counting. The stickers? The sides? There are 9 stickers per side on the standard one, × 6 sides...

or the space needed to solve it?

A counting sort could be considered a subclass of bucket sort, but it is sufficiently distinct that we consider it separately. Both algorithms have their own FAQ page:

I learned this stuff before there was a web. Back when I picked this one up, I learned it as bucket sort... I don't know if the special case had been named back then, or if the teacher knew it had a name. Anyway, the point was the space consumed vs the time taken. Honestly, we spent the most time on shell sort. Probably why it became a favorite... I loved the weird analysis of it. I remember leaving the machines on overnight to get the run time of 1000 items in a bubble sort.

@jonnin

Sounds like whoever taught you the sorts was either a little unsure of the differences himself/herself or was rushing through it. Both sorts have existed for ages, and are very, very thoroughly covered by Knuth.

Also related is the radix sort.
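A minimal LSD radix sort sketch (my illustration, not Knuth's presentation): it applies the counting idea one byte at a time, so it sorts full 32-bit values in four O(N) passes.

#include <array>
#include <cstdint>
#include <vector>

// LSD radix sort: one stable counting pass per byte, 4 passes total.
void radix_sort(std::vector<std::uint32_t>& data)
{
    std::vector<std::uint32_t> scratch(data.size());
    for (int shift = 0; shift < 32; shift += 8)
    {
        std::array<std::size_t, 257> offsets{};    // counts, then prefix sums
        for (std::uint32_t x : data)
            ++offsets[((x >> shift) & 0xFF) + 1];
        for (int i = 0; i < 256; ++i)
            offsets[i + 1] += offsets[i];          // offsets[d] = start of digit d
        for (std::uint32_t x : data)
            scratch[offsets[(x >> shift) & 0xFF]++] = x;
        data.swap(scratch);                        // stable: keeps earlier byte order
    }
}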


@salem c

I disagree about the card games thing: card game play often operates suboptimally.

For example, Go Fish: Not every card you ask for is going to be available in another player’s hand — you will have to draw from the pool, and even toward the end of the game you are unlikely to draw the desired card. I don’t think you could construct an O(n²) scenario, but I don’t think you are likely to have anywhere near an O(n) scenario either (though I suppose one could be constructed). Without actually computing it, I would guess somewhere in ω(n log n), o(n²). Yes, that’s little O and little Ω.

For a game like Solitaire (say, Free Cell), it is even worse. As you build up the stacks, you have to touch each card in a stack just to move it around, so moving stacks of growing size costs on the order of 1 + 2 + ... + n = n(n+1)/2 touches. This is definitely an Ω(n²).


Back to sorting algorithms, sorting a deck of cards is a very convenient way to think about sorting stuff. Particularly when you get into bucket sort and variations (like radix and counting). Again, you find that the algorithmic complexity is not going to be O(n) except in the single case of those sorts that recognize an already-sorted sequence after the first pass.
Both :)
It was a high-school teacher, and we rushed it pretty hard. We did a full, near-college-level data structures & algorithms class plus learning the language over one year. No OOP, though; it was in Pascal of all things... my first year of college was... dull due to this.