std::bad_alloc on Windows 10 but not on Ubuntu

Machine: 32 GB RAM Intel i7-6700 @ 3.4 GHz

I am streaming a text file into a vector whose elements each hold 52 doubles and 16 ints. With the same code and text file, I get a std::bad_alloc exception on Windows 10, but on Ubuntu 18.04.1 it runs perfectly. Is there any way to fix this issue on Windows? Any help appreciated.

Text file has ~3,000,000 elements to stream into vector.
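For reference, a minimal sketch of this kind of streaming loop (the Row struct and readRows name are made up for illustration; the real code may differ). Calling reserve() up front matters here: without it, std::vector roughly doubles its capacity as it grows, and during each regrowth the old and new blocks exist simultaneously, raising the peak memory demand.

```cpp
#include <fstream>
#include <sstream>
#include <string>
#include <vector>

// Made-up layout for illustration: 52 doubles and 16 ints per element.
struct Row { double d[52]; int i[16]; };

std::vector<Row> readRows(std::istream& in, std::size_t expected) {
    std::vector<Row> rows;
    rows.reserve(expected);  // one up-front allocation instead of repeated
                             // doubling, which briefly needs old + new blocks
    Row r{};
    while (in >> r.d[0]) {                         // first double of the row
        for (int k = 1; k < 52; ++k) in >> r.d[k]; // remaining doubles
        for (int k = 0; k < 16; ++k) in >> r.i[k]; // the 16 ints
        if (in) rows.push_back(r);                 // only keep complete rows
    }
    return rows;
}
```

For a file, pass a std::ifstream; for testing, a std::istringstream works just as well.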
Windows isn’t the problem. Windows has done you a favor by pointing out that your code has a problem that needs to be fixed.

Don’t use a std::vector to store 3 million elements. (Either use a std::deque or update your allocator to use a custom memory pool that can handle the volume of data being applied to it.)
I've never heard of std::deque; I looked it up and it seems like it would do the job. May I ask what the differences between them are and why I shouldn't use std::vector?
I'll have to disagree with Duthomhas. Assuming your process won't be long-running and won't need any other memory, there's no reason why std::vector couldn't do the job here. It's only 1.34 GiB of memory.
Is it possible you compiled your program for x86 by mistake, rather than x86-64?
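To make the numbers concrete, here is a sketch of that estimate (the Element struct is hypothetical, matching the description of 52 doubles and 16 ints per element):

```cpp
#include <array>
#include <cstddef>

// Hypothetical element layout matching the description above.
struct Element {
    std::array<double, 52> d;  // 52 * 8 = 416 bytes on typical platforms
    std::array<int, 16>    i;  // 16 * 4 =  64 bytes
};

// Rough payload footprint for n elements, ignoring allocator overhead.
constexpr std::size_t totalBytes(std::size_t n) { return n * sizeof(Element); }

// ~480 bytes/element * 3,000,000 elements = ~1.44e9 bytes = ~1.34 GiB.
// A 32-bit (x86) process has only 2-4 GB of address space in total, so a
// single ~1.3 GiB contiguous allocation can easily fail there, while it is
// no problem at all for an x86-64 process.
static_assert(sizeof(void*) == 4 || sizeof(void*) == 8,
              "expected a 32- or 64-bit target");
```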

There's several differences between std::deque and std::vector, but what Duthomhas is pointing out here is that std::vector requires a section of contiguous memory in the address space to hold its entire capacity, while std::deque can work with smaller sections that store fragments of the entire sequence. After some time, your address space can become fragmented such that it might have, say, 1 GB of free space in total, but the largest contiguous free region might not be larger than 1 MB. In such cases, std::deque might be able to make use of that memory, while std::vector would not.
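As a sketch, std::deque can be a near drop-in replacement for this kind of append-only streaming, since it supports the same push_back interface (the readDoubles helper is made up for illustration):

```cpp
#include <deque>
#include <istream>
#include <sstream>

// std::deque grows in fixed-size chunks rather than one contiguous block,
// so it can keep working even when the address space is fragmented.
std::deque<double> readDoubles(std::istream& in) {
    std::deque<double> values;
    double x;
    while (in >> x) values.push_back(x);  // same interface as std::vector
    return values;
}
```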
I just tried to run the program with std::deque and it built fine but when I ran it, I got no output/results. My compiler on Ubuntu is 64-bit but the compiler on Windows is 32-bit. I'm now trying to figure out how to add the MinGW 64-bit compiler on Qt in Windows, struggling a bit but I'm sure I'll figure it out. I'll update results once I'm able to add the compiler.

Edit: I'm not the most experienced programmer as you may have noticed, hence my confusion in many cases.
After installing the 64bit compiler in Windows, I was able to run the program without any exceptions. I still used std::vector because std::deque did not work properly.
Your code might not have worked properly with std::deque if it assumes a particular memory layout. For example,
T v;
v.resize(1000000);
memset(&v[0], 0, 1000000 * sizeof(int));
This code works if T == std::vector<int>, but not if T == std::deque<int>.
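A container-agnostic alternative is to go through iterators instead of raw memory. For example (a sketch; the zeroAll helper is made up):

```cpp
#include <algorithm>
#include <cstddef>
#include <deque>
#include <vector>

// std::fill works through iterators, so unlike memset it is valid for
// both std::vector and std::deque (and any other sequence container).
template <class Container>
void zeroAll(Container& v, std::size_t n) {
    v.resize(n);
    std::fill(v.begin(), v.end(), 0);
}
```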
I didn’t think about the 32/64-bit issue. Yes, on Linux you'll get a 64-bit binary by default, but on Windows you will often get a 32-bit binary by default.

I’m still inclined to think that there is another issue in here somewhere...
Maybe, but it's able to run properly. I wonder if std::list would be a better option though? From what I read, it's much faster.
A linked list is very rarely the best solution for a given problem.
lumbeezl wrote:
From what I read, it's much faster.


There are a lot of misconceptions about the speed and efficiency of STL containers. There are several things to take into account: C++ has move semantics, and cache effects and concurrency alter things quite a lot compared to the traditional analysis of containers. I have often been quite surprised at how well std::vector performs compared to other containers.

Remember that discussions about containers are often independent of any particular language; C++ has the above features, which other languages may not have. As helios says, a linked list would rarely be the solution in any language.

As well as that, when talking about what is faster, it depends on which operation one is talking about; each container has a different big-O complexity for each operation.

Also remember that when dealing with microscopic amounts of data, like 52 elements, it probably won't make much difference anyway. Test with at least 1 million data points to get any meaningful comparison, and be careful how you do the timing as well; there are traps there too.

Good Luck !!
I like using std::vector; it seems to get the job done for what I do, but it's a vector<struct> with 52 doubles per element, and the data sets I have range between 3 and 94 million points. I just don't know if std::vector is the quickest way to analyze, sort, etc. with the data.
I just don't know if std::vector is the quickest way to analyze, sort, etc. with the data.
Yes. Any other data structure would be slower than a vector at any of those.

You should only consider a different data structure if you need to frequently add or remove elements anywhere other than the end of the vector, or if you need to use a different memory layout to resolve memory fragmentation issues. If you only need to read, sort, or edit the data in-place, a vector is the optimal structure.
You should only consider a different data structure if you need to frequently add or remove elements anywhere other than the end of the vector,

And even then you should profile the code to see which data structure is faster. With today's hardware a vector may surprise you even when adding and removing elements anywhere in the vector. Cache hits and misses can significantly skew the results. Since a std::vector is in contiguous memory cache misses can be much lower than something like a std::list which can be spread out in memory.
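A minimal sketch of such a measurement (the traversalMicros helper is made up; a real benchmark would use far more elements, repeat each run several times, and discard the first, cold-cache pass):

```cpp
#include <chrono>
#include <list>
#include <numeric>
#include <ratio>
#include <vector>

// Time the same summing traversal over any container. The vector's
// contiguous storage makes sequential reads mostly cache hits; a list's
// nodes can be scattered, so each step risks a cache miss.
template <class Container>
double traversalMicros(const Container& c, long long& sink) {
    auto t0 = std::chrono::steady_clock::now();
    sink += std::accumulate(c.begin(), c.end(), 0LL);  // keep the work alive
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::micro>(t1 - t0).count();
}
```

The `sink` parameter exists so the compiler cannot optimize the traversal away, one of the timing traps mentioned above.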


> And even then you should profile the code to see which data structure is faster. With today's hardware
> a vector may surprise you even when adding and removing elements anywhere in the vector.

Yes. Some simple benchmarks: http://www.cplusplus.com/forum/beginner/206320/#msg976016


There may be compelling reasons for using std::list<> even if its performance may not be the best:

1. Modifiers of std::list<> do not invalidate iterators, references or pointers to the other elements.

2. Elements can be transferred from one std::list<T,A> to another std::list<T,A> (same T, compatible A) without actually moving or copying the elements, and without invalidating iterators, references or pointers to the transferred elements.
http://en.cppreference.com/w/cpp/container/list/splice

3. std::list<> provides the 'strong exception guarantee' for its modifiers: if an exception is thrown during a standard-library operation on the list (say, the copy constructor of the value_type throws), the list rolls back to the state it was in before the operation started. In contrast, std::vector<> provides only the 'basic exception guarantee' for some modifiers: if an exception is thrown during these operations, the vector remains in a valid state and no resources are leaked, but its exact contents are unspecified.
http://stroustrup.com/3rd_safe.pdf
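Point 2 can be sketched like this (the spliceKeepsIterators helper is made up for illustration):

```cpp
#include <list>

// std::list::splice transfers nodes between lists in O(1) without copying
// or moving the elements, and iterators to the transferred elements stay
// valid; they simply refer into the destination list afterwards.
bool spliceKeepsIterators() {
    std::list<int> a{1, 2, 3};
    std::list<int> b{4, 5};
    auto it = b.begin();       // points at the element 4
    a.splice(a.end(), b);      // a = {1,2,3,4,5}, b = {}
    return *it == 4 && b.empty() && a.size() == 5;
}
```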
You should only consider a different data structure if you need to frequently add or remove elements anywhere other than the end of the vector


If this is the case, then a vector it is! It has always seemed to do the job, and I've been able to stream up to ~93 million elements into it and analyze them appropriately.
Topic archived. No new replies allowed.