Big O of a function with a function that has not been implemented

Hi, I have a function that I need to find the Big O of, and have already written my notes down, but am not sure whether or not my answer is correct.

So what I've deduced from the code below is the following:
1. Vector creation is... constant (?)
2. The first two loops = 2N
3. Sort is said to use NlogN, so I'll use that.
4. While loop does things up to N-1 times if I recall correctly, so that = N
5. For loop has insertBefore, which could have either logN, N, or NlogN...
6. I know that the last for loop is = N, but I'm stuck with insertBefore
7. insertBefore seems to work up to N-1 in its worst case scenario, which means my guess is that it is O(N)?

Adding them all together, I get O(N^2), but I am not 100% sure about this.
Can someone assist me with this? Thank you
The code can be found below.

void Set::unite(const Set& set1, const Set& set2)
{
    vector<ItemType> v;
    for (Node* p1 = set1.m_head->m_next; p1 != set1.m_head; p1 = p1->m_next)
        v.push_back(p1->m_value);
    for (Node* p2 = set2.m_head->m_next; p2 != set2.m_head; p2 = p2->m_next)
        v.push_back(p2->m_value);

        // sorted using O(N log N)
    sort(v.begin(), v.end());
    while (m_head->m_next != m_head)
        doErase(m_head->m_next);
    for (size_t k = 0; k < v.size(); k++)
    {
        if (k == 0  ||  v[k] != v[k-1])
            insertBefore(m_head, v[k]);
    }
}
You need to provide more information about your data structure to be able to analyze the time complexity. This looks like a linked list! Sets are not often implemented in terms of linked lists.

O(1): you can assume the vector's default construction takes constant time.

O(n + m): we push_back n + m elements in total.
O(1): std::vector::push_back takes amortized constant time.

O((n + m) log (n + m)): std::sort is often implemented as a hybrid sort; the actual complexity is implementation-dependent, but you can assume it is linearithmic for large inputs.

O(k): you erase all k elements of the old list.
O(1): erasure at a known position takes constant time in a linked structure.

O(n + m): you insert up to n + m elements before the head of the list (duplicates are skipped).
O(1): insertion at a known position takes constant time in a linked structure; there is no need to walk the list given that position.

Compute the sum of products:
O(1) + (O(n+m) * O(1)) + O((n+m) * log(n+m)) + (O(k) * O(1)) + (O(n+m) * O(1)) =
O((n+m) * log(n+m)) + O(k)


Where n and m are the sizes of each argument; k is the size of *this
That makes so much sense!
I understand it now, thank you!
If I remember it right, that final answer is useful for knowing what you are looking at, but isn't the formal answer either N log(N) or N log(N) + N? (I am a little fuzzy on whether the +N is absorbed by the first, dominant term; it seems like it would be... you usually only list the highest-order term?)

The formal answer is either O(N log(N)) or O(N log(N) + N)

Since N grows more slowly than N log(N), it would get absorbed. And you're right -- under the assumption that n, m, and k all grow at the same rate, k = m = n and the final answer is O(n log(n)).

The answer with multiple variables is useful in case one of those variables grows faster than the others. For instance, if k grows quickly relative to n + m, e.g., if the old list always has (n + m)^2 elements (that is, if O(k) = O((m + n)^2)), then the whole operation actually has a complexity of
O((n+m)*log(n+m)) + O(k) = O((n+m)*log(n+m)) + O((m + n)^2) = O((m + n)^2)
which is much worse.