### how can I avoid using map

Hi guys.

In my code, I use a map to keep track of a certain class I created:

`map<vector<short>, Foo*> Foos;`

The vector is a three-element vector that holds the coordinates of a certain Foo pointer.

I suspect that this is slowing down my code. Is there a nice trick to bypass the map? Something like

`vector<vector<vector<Foo*> > >`

But I'm not sure.

Thank you for any help.

Yotam
It is unlikely that the map is slowing you down much, unless you are making copies of it everywhere instead of using const references...
Please explain how you intend to use the container and we will make our best attempts to find a nice solution. As of now, your usage looks a little bit nuts.

If you want to lookup a Foo pointer given a 3D coordinate, or something similar, why would you want logarithmic lookup when constant lookup is available? For example, nested vectors could provide a 3D array-like container that will outperform the map approach. One downside is that a system of floating-point coordinates would make the container huge...
Don't use vector<short>, use

`struct my_vector { short x, y, z; };`

instead. It will save you a lot of memory allocations, possibly improving efficiency to a tolerable level.

Also, you could tell us what you are really trying to accomplish. Maybe the design of your program is bad.
+1 Abramus.

vector<> should be used when the extent is not known at compile time. In this case, it is always 3, so even boost::array<> is better than std::vector<>, but creating a simple struct like the one Abramus gave is better still, since it makes the code more expressive.
It seems to me that it depends on how this data is to be accessed. If you want to find a Foo* given three coordinates, then the map looks very inefficient:
```cpp
map<vector<short>, Foo*> Foos;

Foo* get_foo(int x, int y, int z)
{
    // search Foos for a match? Takes ages
}

// alternatively:
vector<vector<vector<Foo*> > > Foos;

Foo* get_foo(int x, int y, int z)
{
    return Foos[x][y][z]; // fast, efficient
}
```

You would be much better off with the vector of vector of vector. But if you simply want to record the coordinates of each Foo*, then I would be tempted to make a struct with your coords and your Foo* together and put them in a container:
```cpp
struct FooRec
{
    int x, y, z;
    Foo* foo;
};

std::list<FooRec> Foos; // record where each Foo is, no need to search based on location
```

It really depends why this data is together and how you want to access it.
The problem with the above approach is that it consumes (range_of_x) * (range_of_y) * (range_of_z) * sizeof( Foo* ) bytes
of memory just for the pointers. So if the coordinate system is bounded by (0,0,0) and (99,99,99) then you have
100*100*100 = 1,000,000 pointers. If the bounds are (0,0,0) and (999,999,999) then you're over 4GB of data.

How about a hashmap that uses std::map to resolve collisions?

```cpp
#include <iostream>
#include <map>
#include <vector>
#include <cstdlib>
#include <ctime>
using namespace std;

struct Data
{
    int x;
    int y;
    int z;

    int hash(int size)
    {
        return (((811*x + 911*y + 977*z)) % size + size) % size;
    }

    bool operator<(Data data) const
    {
        if (x < data.x) return true;
        if (x > data.x) return false;
        if (y < data.y) return true;
        if (y > data.y) return false;
        return z < data.z;
    }
};

const int HMAP_SIZE = 1024;  // example value: number of buckets
const int DATA_SIZE = 10000; // example value: number of records

typedef map<Data, int> HashMap[HMAP_SIZE]; // bucket array; each bucket is a map
typedef map<Data, int> TreeMap;

int main()
{
    srand(time(0));

    int hmap_size = sizeof(HashMap);
    int data_size = DATA_SIZE * sizeof(pair<Data, int>);
    double hmap_overhead = hmap_size / double(data_size);
    hmap_overhead *= 100;

    cout << "hashmap size=" << hmap_size << endl;
    cout << "(max) data size=" << data_size << endl;
    cout << "hashmap (min) memory overhead=";
    cout << hmap_overhead << "%" << endl;
    cout << endl;

    HashMap hmap;
    TreeMap tmap;

    vector<Data> data_vec;
    data_vec.resize(DATA_SIZE);

    for (int i = 0; i < DATA_SIZE; i++)
    {
        data_vec[i].x = rand();
        data_vec[i].y = rand();
        data_vec[i].z = rand();

        // insert into both containers for comparison
        hmap[data_vec[i].hash(HMAP_SIZE)][data_vec[i]] = i;
        tmap[data_vec[i]] = i;
    }
}
```

EDIT: I forgot my RAND_MAX is ~32K... I'll cook something up later to fix this, if I'm not bored...
EDIT2: (somewhat) better hash function.
Hi.

Thanks for your replies. I think that I'll go with the struct solution combined with the 3D vector.

My code is trying to simulate the behavior of particles with a certain potential between them.

Each particle can move around freely as long as no forces are acting on it. The forces decrease with distance (much like the gravitational force) and I have a cutoff distance. My system is divided into boxes, and I calculate the interactions between all the balls in each two neighboring boxes (as well as within a single box). Since I have roughly 50*50*50 boxes, I have to access the map 50*50*50*14 = 1,750,000 times.

A running time of about 10 cycles per second is reasonable to me (there are other parameters to consider apart from the immediate interaction).

I hope my problem is clear.

Yotam
If you only need sequential access to the data, why not use a list? You could implement a box as a linked list of balls and use the splice member function to move the balls from one box to another when necessary.