#include <iostream>
#include <vector>

void makeArray(std::vector<int>& n_array)
{
    std::cout << "makeArray called" << std::endl;
    for (int i = 0; i < 1000001; ++i)   // fill array with data
        n_array[i] = i * 2;
}

void printArray(std::vector<int>& n_array)
{
    for (int i = 0; i < 1000001; ++i)
        std::cout << n_array[i];
    std::cout << "printArray called" << std::endl;
}

int main()
{
    std::vector<int> n_array(1000000);
    makeArray(n_array);
    //printArray(n_array);
}
If I execute the program as above, with printArray commented out, it prints "makeArray called". When I instead comment out makeArray, printArray reports that it has been called. With both uncommented, I get nothing back.
Question: printArray will not print elements at indices greater than 10000. What am I doing wrong? It's a clue to why both functions are not being executed. How do I print and manipulate data elements at indices greater than 10000?
In makeArray and in printArray you access your vector with an index out of range; that's undefined behaviour. It's like a normal array: the last valid index is size() - 1.
The STL is designed for performance, not for safety, so you need to be as careful as with normal arrays and pointers.
There are better ways to deal with vectors:
1. Use vector::at(): if you get the index wrong, you get an out_of_range exception, so you know what's wrong.
Here is another example where I can't get my million ints going. Just change the element count in std::vector<int> myarray(1000) to 10k, 100k, or 1 million. It works fine at 1000. I did some bounds checking with myarray.at(i), which came out fine, and when I tried intentionally out-of-bounds indices, myarray.at(i) threw the error it was supposed to.
Can anyone help me here? This undefined behavior is killin' me : )
#include <iostream>
#include <vector>

int main()
{
    std::vector<int> myarray(1000);

    // assign some values:
    for (int i = 0; i < myarray.size(); i++)
        myarray.at(i) = i + 1;

    // print content:
    std::cout << "myarray contains:";
    for (int i = 0; i < myarray.size(); i++)
        std::cout << ' ' << myarray.at(i);
    std::cout << '\n';

    return 0;
}
Thx
Yet, is there no fix for the code below to have it work with 1000000 ints?
#include <iostream>
#include <vector>

int main()
{
    std::vector<int> myarray(1000000);

    // assign some values:
    for (int i = 0; i < myarray.size(); i++)
        myarray.at(i) = i + 1;

    // print content:
    std::cout << "myarray contains:";
    for (int i = 0; i < myarray.size(); i++)
        std::cout << ' ' << myarray.at(i);
    std::cout << '\n';

    return 0;
}
I additionally ran the code in C++ Shell online and got the same result: after about 10,000 ints the program stops printing. I haven't gotten to templates yet.
To repeat jlb's question, when you say "undefined behaviour", what do you mean? What are you actually seeing, and what leads you to conclude that it's the result of undefined behaviour in your program?
Why is flushing the buffer needed? Why the intervention?
I don't really know, apart from the fact that the output buffer only has a certain size (I tried to find out how big it might be, but couldn't find anything concrete; it probably varies with the implementation), and I am not sure whether the implementation takes opportunities to flush the buffer automatically (such as on encountering '\n').
I guess the best thing to do is what doug4 has done: cause a flush periodically with std::endl.
One might be able to find out how big the output buffer is using these functions:
Once initialized, std::cout is tie()'d to std::cin and std::wcout is tie()'d to std::wcin, meaning that any input operation on std::cin executes std::cout.flush() (via std::basic_istream::sentry's constructor).
Why is flushing the buffer needed? Why the intervention?
It shouldn't be necessary, since the output buffer should flush when it is filled. The problem probably has to do with how your operating system implements the console terminal.
Note that your original program worked okay for me (after fixing the out-of-bounds accesses in your loops). Your later code works without modification, but you really should start using the correct type for your loop counters to avoid possible problems with numeric overflow (std::size_t is the correct type in this case).
When I originally responded, I thought that somehow you blew past the streambuf in std::cout, and you could not insert into std::cout any more. After more thought, I think it's related instead to the maximum line length of your terminal.
My knowledge of std::cout and the terminal buffer is not complete either. I'm not positive why you got the results you did, but these are my suspicions.
When std::cout's streambuf was filled, it automatically flushed its contents to the terminal screen.
Because you never added a new line, the output on the terminal screen grew longer and longer on the first line.
I suspect that the terminal has a maximum line length, and anything longer than the maximum length gets thrown in the bit bucket. Automatic line wrapping does not seem to be taking place.
If this is the case, simply adding '\n' (which does not flush the stream) instead of std::endl (which does flush the stream) at line 16 would also do the trick. The line would "wrap", and filling of the streambuf would trigger automatic flushing. It would be interesting to test.
1) That is probably also due to the way the terminal is implemented; it probably has its own limit. You could send the output to a file instead; I guess that's what would normally happen with a large amount of output.
2) Pointers are faster, but I doubt the difference is worth worrying about. The STL is supposed to be efficient, and the safety of using a container usually outweighs the dangers of something lower level like pointers. Also, in terms of speed, one would have to specify what one means: the speed of a find operation, of inserting, or of something else? Is the container sorted? Have a look at the various STL containers and their complexities for operations such as insert, sort, and find. All this comes back to whether one is creating one's own container, in which case things like pointers and other low-level techniques would be used.
There are other containers, like std::unordered_set, which is efficient: it stores its data in buckets. Roughly speaking, if there were 1 million items they might be stored in 1,000 buckets, and it's easy to determine which bucket a particular item is in, so we have divided the size of the problem by 1,000 right at the start. In reality a hashing function decides the bucket (so, despite my simplified picture, the buckets are not ordered by value), and one can optionally specify how many buckets to use.
I am using the vector container for binary searches. I get around 20 recursive passes to find a number in an ordered vector of a million elements. Writing a raw-pointer solution is easy enough, but vector did make it easier. I ran into the same buffer problem with a dynamically allocated (new/delete) array as with vector, and was starting to see limitations there that disappeared once the buffer issue was corrected. vector's at() kept out-of-bounds access in check; in my runs the raw pointer array also crashed when it went over its range, though unlike at() that isn't guaranteed, since it's undefined behaviour.