Determine whether a triangle can be built from a given set of edges

Jul 2, 2022 at 7:49am
Here's the task:
https://app.codility.com/programmers/lessons/6-sorting/triangle/
And here's a simple O(n log n) solution for it (what the site expects):

#include <algorithm>
#include <vector>

int solution(std::vector<int>& A) {
	if (A.size() <= 2)
		return 0;

	std::sort(A.begin(), A.end());

	// Check every three adjacent elements of the sorted array;
	// the cast to long long guards against signed int overflow in the sum.
	for (size_t i = 0; i < A.size() - 2; ++i)
		if ((long long)A[i] + A[i + 1] > A[i + 2])
			return 1;
	return 0;
}
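
For reference, a minimal driver for the function above (the sample vectors are just illustrative, and the driver assumes the solution above is in the same translation unit):

#include <iostream>
#include <vector>

int main() {
    std::vector<int> yes{10, 2, 5, 1, 8, 20}; // 5 + 8 > 10, so a triangular triple exists
    std::vector<int> no{10, 50, 5, 1};        // no triple satisfies the inequality
    std::cout << solution(yes) << '\n';       // prints 1
    std::cout << solution(no) << '\n';        // prints 0
}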


What I'm a little curious about is whether it's possible to create an O(n) solution for this task without an unreasonably bad space complexity.
Following my experience with previous tasks, I tried to find a relationship/pattern between the results of a few test cases for this task. For example, I tried (with pen and paper) to see whether the following items could guide me towards such a relationship/pattern and a better solution:
- max, min or average element (of the array)
- using bit manipulation like XOR
- number of elements of the array
- or using a side array

I dedicated about five hours to it, but so far I haven't found such a pattern or any particular observation using the items/ideas above, except for the last one, for example:
- using a std::vector<size_t> of size INT_MAX, incrementing the entry at the index equal to each positive element, and thereby getting the elements in sorted order.
- using a 2D std::vector of size N*N (N: size of the array), inserting the various sequences of elements into it, and finally searching it.

Setting aside whether the above two side-array approaches can actually lead to a better time complexity, they're obviously quite inefficient in terms of space complexity.
What do I need?

What I need is merely a hint! (not code)
I'd like to find a better solution (if one exists) myself. So, if possible, please just give me a hint or an indirect idea for that purpose. Then I'll try to write the code, after which you can post your own versions and we can compare them. That way I can properly benefit from your experience.
Thanks in advance.
Last edited on Jul 2, 2022 at 7:49am
Jul 2, 2022 at 8:14am
Low-hanging fruit: use a non-comparison sorting algorithm.

Last edited on Jul 2, 2022 at 8:14am
Jul 3, 2022 at 8:38am
I believe "using a side array" will convey generally the same concept as non-comparison sorting algorithms that I'd thought of already. But that doesn't always work for the task well. for example, for array A, A: { 2, 6, 1, 1000, 5 } we expectedly need to use a side/aux array of size max: 1000, initializing it with zeros and then add the number of each element in the index equivalent to it in that aux array. I believe we will have the worst time and space complexity for that array A. Just suppose traversing 1000 elements of array aux for a 5-element size of array A.

Probably there isn't a better solution than O(n log n) for the task. (As the task itself also expects.)
Jul 3, 2022 at 11:37am
Probably there isn't a better solution than O(n log n) for the task

I don't believe there is an O(n) solution for this with good space complexity.

You can use radix sort and other non-comparison sorting algorithms to "cheat" your way to a better time complexity. But in reality they would likely take about the same amount of time.

Something like radix sort is technically O(n) time, but is really O(k*n) time, where k is the number of digits in the largest number. Radix sort only performs better than quicksort if k < log(n).

So you'd have O(k*n) or O(n log n) time as your best implementations. Again, though, O(k*n) is still technically O(n), depending on who you ask.
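
To make the O(k*n) concrete, here is a rough LSD radix-sort sketch, one byte per pass (so k = 4 passes for 32-bit values); the function name is mine and it assumes non-negative elements:

#include <array>
#include <cstdint>
#include <vector>

// Sketch only: least-significant-digit radix sort over non-negative ints,
// one byte per pass, so the total work is O(k * n) with k = 4 passes here.
void radix_sort(std::vector<int>& A) {
    std::vector<int> buffer(A.size());
    for (int shift = 0; shift < 32; shift += 8) {
        std::array<std::size_t, 257> counts{};               // histogram for this byte
        for (int v : A)
            ++counts[((static_cast<std::uint32_t>(v) >> shift) & 0xFF) + 1];
        for (std::size_t i = 1; i < counts.size(); ++i)
            counts[i] += counts[i - 1];                       // prefix sums -> start positions
        for (int v : A)
            buffer[counts[(static_cast<std::uint32_t>(v) >> shift) & 0xFF]++] = v;
        A.swap(buffer);                                       // this byte is now in order
    }
}

After sorting this way, the same adjacent-triple scan from the O(n log n) version applies; whether it actually beats std::sort in practice is exactly the caveat above.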


Things like this should show you the limitations of looking at time complexity alone - it's only a small part of the bigger picture.
Jul 3, 2022 at 1:59pm
Careful: your solution assumes that if any valid triple exists, the sorted array contains three adjacent (contiguous) values that satisfy the condition.
Jul 3, 2022 at 2:02pm
As for radix sort, suppose we have an array of 10 numbers with max = 100.
Time complexity:
O(n log n): roughly 10 * log2(10) ≈ 33
O(n*k): with k = 3 digits, roughly 30; whereas a single counting pass over the values (the side-array idea) would be O(n + max) = 10 + 100
And this is apart from the space complexity: that counting pass needs an auxiliary array of 100 elements (radix sort itself only needs buckets per digit value).

I assume non-comparison sorting algorithms (with an acceptable space complexity) lead to a better time complexity than O(n log n) only when the elements are fairly evenly distributed in value, with no huge differences between them.
Jul 3, 2022 at 11:40pm
Non-comparison sorts are not affected by element distribution.
Jul 4, 2022 at 7:09am
Here is the solution:

#include<algorithm>
int solution(vector<int> &A) {
    // write your code in C++14 (g++ 6.2.0)
    int size = (int)A.size();
    if (size < 3) {
        return 0;
    }
    
    vector<long long> x;
    for (int i = 0; i < size; i++) {
        x.push_back(A[i]);
    }
    sort(x.begin(), x.end());
    int i = 0;
    while (i < size - 2) {
        if ( x[i] + x[i + 1] > x[i + 2]){
            return 1;
        } 
        i++;
    }
    return 0;
}
Jul 4, 2022 at 7:22am
@denver2020

Uhh..
Jul 4, 2022 at 7:49am
@Duthomhas
You mean applying a non-comparison sorting algorithm to both of the arrays below costs the same?
A: { 2, 6, 1, 1000, 5 }
B: { 3, 12, 44, 32, 28 }

@denver2020 This is the same O(n log n) time complexity solution I already talked about, but there's a subtle error on line 16 of your code.
Last edited on Jul 4, 2022 at 7:49am
Topic archived. No new replies allowed.