Can anybody help me out?

The solution below works fine, but when I use unordered_map in place of map it gives Time Limit Exceeded. Can anybody suggest why that happens?

https://codeforces.com/contest/1234/problem/B2

#include <bits/stdc++.h>
using namespace std;

int main()
{
    ios_base::sync_with_stdio(false);
    cin.tie(nullptr);

    int n, k;
    cin >> n >> k;

    map<int, int> m;   // 1 if an id is currently on the screen
    deque<int> q;      // ids on the screen, oldest at the front

    for (int i = 0; i < n; i++)
    {
        int num;
        cin >> num;
        if (m[num] != 1)              // not on the screen yet
        {
            if ((int)q.size() == k)   // screen full: evict the oldest id
            {
                m[q.front()] = 0;
                q.pop_front();
            }
            q.push_back(num);
            m[num] = 1;
        }
    }

    cout << q.size() << "\n";
    while (!q.empty())                // print newest first
    {
        cout << q.back() << " ";
        q.pop_back();
    }
    cout << "\n";
}
unordered_map can degrade to O(N) per operation if the hash function is not friendly to your data.
map stays O(log N) even when things go wrong.
You can get back to O(1) by hand-rolling your own hash table into a vector with no collisions, if the data is well suited to it. Some data cannot be hashed cleanly that easily.
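On Codeforces in particular, other users can "hack" a submission with inputs whose keys all collide under the library's default hash, which is the usual way unordered_map gets pushed to O(N) per lookup while map is unaffected. One commonly used mitigation (a sketch of the general technique, not something from this thread) is to plug in a randomized hash such as splitmix64, so the bucket layout cannot be predicted in advance:

#include <bits/stdc++.h>
using namespace std;

// Randomized hash: splitmix64 mixing plus a per-run salt, so an
// adversary cannot precompute a set of keys that all collide.
struct custom_hash
{
    static uint64_t splitmix64(uint64_t x)
    {
        x += 0x9e3779b97f4a7c15ULL;
        x = (x ^ (x >> 30)) * 0xbf58476d1ce4e5b9ULL;
        x = (x ^ (x >> 27)) * 0x94d049bb133111ebULL;
        return x ^ (x >> 31);
    }
    size_t operator()(uint64_t x) const
    {
        // Salt fixed at program start; it differs from run to run.
        static const uint64_t SALT =
            chrono::steady_clock::now().time_since_epoch().count();
        return splitmix64(x + SALT);
    }
};

int main()
{
    unordered_map<int, int, custom_hash> m;   // drop-in replacement
    m[42] = 1;
    cout << m[42] << "\n";
}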

Also, remember that an O(1) operation where the "1" takes 20 minutes is slower than an O(N*N) algorithm where the whole N*N pass takes 2 nanoseconds, until N becomes big enough that the N*N part outgrows the O(1) cost.
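For example (illustrative numbers only): if the O(1) operation costs 10^6 ns and the whole O(N*N) pass costs 2*N*N ns, then 2*N*N < 10^6 for every N below roughly 707, so the asymptotically "worse" algorithm wins on every input smaller than that.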
@jonnin Could you please give a small example so that I can better understand how this works?
Not easily. I would have to craft something that performs poorly with unordered_map and then show it working better with a regular map. Let me think on that one. You can also search the web for this issue -- it should say what I said, with many more words and much more depth.

I could show you how to roll your own table...
You just want an oversized array/vector and a function f such that
arrayindex = f(data);
so that vector[f(data)] is your entry, and f should be as fast as you can make it. Making f can be a major effort for weird data, or very simple. See the sketch below.
If you allow collisions in your table (and sometimes this cannot be avoided), you need to handle them. Handling collisions makes it slower; it is best if you can find a way to not collide at all.
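When the keys happen to be small nonnegative integers (not necessarily the case for this problem), f(data) can simply be the value itself, which gives a collision-free table. A minimal sketch, assuming every key lies in [0, MAXV), where MAXV is a made-up bound for illustration:

#include <bits/stdc++.h>
using namespace std;

int main()
{
    // Assumed bound on the keys; only valid if every key < MAXV.
    const int MAXV = 1000000;

    // f(data) = data, so present[f(data)] is the slot for that key.
    // Distinct keys get distinct indices, so no collisions can occur
    // and insert/lookup are each a single array access.
    vector<char> present(MAXV, 0);

    present[42] = 1;                        // insert key 42
    cout << (present[42] ? "yes" : "no")    // O(1) lookup
         << "\n";
}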