Pointers: Use for speed enhancement?

Hello. I have a giant database of numbers, about 0.5 GB. The data has the following form, with millions of lines:

119.744158 0.255842 0.500000 0.999998 0.001938 -0.000228 0 
119.459564 0.284593 0.499999 0.999999 0.000190 -0.001584 0 
118.957437 0.502127 0.500000 0.999986 -0.003014 -0.004408 0 
118.680324 0.277113 0.500000 0.999996 0.001874 -0.002178 0 
118.393582 0.286742 0.499999 1.000000 0.000234 -0.000820 0 
...


My program performs a lot of number crunching and processing on this database. It starts by reading the data into a giant vector, and it is broken down into many different functions that this vector gets passed around between. Would I see increased performance if I used pointers to pass the database to those functions? I have read many sources that say to forget about pointers in C++, but in pursuit of optimal performance with a very large database being passed around, would they not be good to use?
It depends. How are you currently passing around the vector? If it is by value, then yes, it will be an improvement. If it is by reference, then no, no improvement will be gained.
References are often preferred over pointers in C++. Use a const reference if you don't want the function to be able to change the vector.
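To make the difference concrete, here is a minimal sketch; the Record struct and function names are invented here to match the seven-column lines above:

#include <vector>

struct Record { double v[7]; };  // one parsed line of the file

// Pass by value: the whole vector (and every Record in it) is copied
// on each call -- prohibitively expensive for a 0.5 GB data set.
double firstByValue(std::vector<Record> data) { return data[0].v[0]; }

// Pass by const reference: no copy is made, and the compiler enforces
// that the function cannot modify the caller's data.
double firstByRef(const std::vector<Record>& data) { return data[0].v[0]; }

int main() {
    std::vector<Record> db(5);
    double a = firstByValue(db);  // copies all of db
    double b = firstByRef(db);    // copies nothing
    return a == b ? 0 : 1;
}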
"Would I see increased performance if I was using pointers to pass the database around to different functions?"
Unless you're passing the vector by value (which is unlikely), you're already doing that implicitly: passing by reference hands the function the address of the vector, just as a pointer would.
Either post code or run a profiler to see if the vector being copied is taking a significant amount of time.
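If you don't have a profiler set up, a rough timing with std::chrono around one of your processing calls will tell you much the same thing. This is only a sketch; process() here is a stand-in for one of your own functions:

#include <chrono>
#include <cstdio>
#include <numeric>
#include <vector>

// Stand-in for one of your processing functions.
double process(const std::vector<double>& data) {
    return std::accumulate(data.begin(), data.end(), 0.0);
}

int main() {
    std::vector<double> data(10000000, 1.0);  // dummy data set

    auto start = std::chrono::steady_clock::now();
    double result = process(data);
    auto stop = std::chrono::steady_clock::now();

    std::chrono::duration<double> elapsed = stop - start;
    std::printf("process() -> %f in %.3f s\n", result, elapsed.count());
    return 0;
}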
Use references when passing the vector around, and be sure to call vector::reserve before you start reading the data into it so you can avoid reallocations.
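Something like this, for example; the file name and the reserve count are placeholders (use a rough upper bound on your actual line count):

#include <fstream>
#include <sstream>
#include <string>
#include <vector>

struct Record { double v[7]; };  // seven columns per line, as in the sample

int main() {
    std::vector<Record> data;
    // Reserving up front avoids the repeated reallocate-and-copy cycles
    // that push_back would otherwise trigger as the vector grows.
    data.reserve(20000000);

    std::ifstream in("data.txt");
    std::string line;
    Record r;
    while (std::getline(in, line)) {
        std::istringstream ss(line);
        if (ss >> r.v[0] >> r.v[1] >> r.v[2] >> r.v[3]
               >> r.v[4] >> r.v[5] >> r.v[6])
            data.push_back(r);
    }
    return 0;
}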