Why doesn't a vector's capacity shrink when it is assigned fewer elements?

Hi,

I ran the following code (main()) and found that the capacity/space reserved for a vector doesn't shrink even when it holds fewer elements than before.

[1] Is there any advantage to this kind of implementation? I understand that it reduces re-allocations, but it could also waste a lot of space.
[2] Is the decision not to shrink the capacity based on some statistical data, or does the responsibility lie with the programmer to call shrink_to_fit() if such a scenario arises?

main.cpp
#include <iostream>
#include <vector>

int main()
{
    std::vector<int> v(100000, 3);
    std::cout << "capacity before reassigning 2 elements : " << v.capacity() << std::endl;
    v = {1, 2};
    std::cout << "capacity after reassigning 2 elements : " << v.capacity() << std::endl;
    return 0;
}



capacity before reassigning 2 elements : 100000
capacity after reassigning 2 elements : 100000


Thanks for any help :)
Although vectors take care of allocation, it's the programmer's call when it comes to resizing or shrinking the vector.
You can do this with vector::resize()

edit:
Use vector::shrink_to_fit(), like Konstantin2 said. Thanks, helios.
1. Yes, it reduces re-allocations, which cost a lot of time.
2. The responsibility lies with the programmer to call shrink_to_fit().
std::vector::resize() only changes the capacity if the parameter is larger than the current capacity.
shrink_to_fit() is the only member function that can shrink the capacity of a vector then?
clear() may or may not deallocate the internal array and zero the capacity; that is up to the implementation.
You should avoid depending on this unless you are sure you need to.
In most generic uses, shrinking to fit is a double hit on efficiency: the code takes one hit to shrink and another to re-grow. Better to leave it be (there is no cost to not shrinking, and no cost to re-grow into space you already own) unless you did something rather large and need that memory released. In general, your computer usually has much more memory than you need, and wasting some is harmless. If you are consuming all your memory and thrashing virtual memory, then you must consider this kind of solution.
As I recall, vectors grow by 50% when they need space. So your vector might be using 50% more RAM than necessary.

jonnin points out the disadvantages of calling shrink_to_fit(), but I'll mention some cases where it pays off:
- when you know that you won't add more items.
- when your data structure contains lots and lots of vectors, or a few vectors that are very large.
- when memory size matters.
If memory really matters and you can't know the size ahead of time to call reserve(), then std::vector is probably not the best data structure: the peak memory needed during shrink_to_fit() is (capacity() + size()) * sizeof(T), because the old and new buffers exist at the same time while the elements are copied over.
std::deque may be a better fit in that situation.
I can see that it all depends on the requirements. If we are coding for memory-scarce environments like embedded systems, we might need shrink_to_fit() to make space available for future needs.
In other cases, it seems reasonable to take the best performance at the cost of a little wasted memory (by avoiding the re-allocations that shrink_to_fit() would cause).

Thanks to all for your inputs :)