A 06 (10)
This is the problem I am trying to solve: http://opc.iarcs.org.in/index.php/problems/BOOKLIST
This was my code:
My code works, but because of vector::erase it exceeds the time limit. Is there another algorithm to solve this problem? Thank you.
brandonator (110)
Try adding cin.get() at the end; the program will stop until the user presses Enter, so you can read what's left on the screen.
jumper007 (302)
@brandonator That doesn't even make sense. Why would you add cin.get()?

In <algorithm> there is a remove_if function which squeezes all values not removed to the front, maintaining the order. This works if you can get the elements by value, and not by index.

If I am right, vector::erase performs at most O(N log N) operations while remove_if performs at most O(N) operations.
ne555 (4041)
vector::erase performs at most O(N) operations (the worst case is deleting the front element). remove_if() traverses the entire container, so it is also O(N).

@OP: your algorithm is O(M*N), and that's too much. There is an easy O(N^2) approach where you consider only the taken books (not sure if it's fast enough): suppose that you want to borrow the K-th book. If there is a J, with J <= K, that was already borrowed, then you need to take the (K+1)-th book instead.
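A minimal sketch of that O(N^2) idea (the function and variable names are illustrative, not from the thread; positions are assumed 1-based): keep only the original positions of the books already borrowed, sorted, and shift each request past them.

```cpp
#include <cassert>
#include <set>
#include <vector>

// Sketch of the O(N^2) approach: never touch the shelf of M books.
// Keep the ORIGINAL positions of books already borrowed, sorted ascending.
// To serve a request for the k-th book on the CURRENT shelf, walk the
// borrowed positions in increasing order; every borrowed position <= k
// shifts the answer one slot to the right.
std::vector<long long> serveRequests(const std::vector<long long>& requests)
{
    std::set<long long> borrowed;        // original positions already taken
    std::vector<long long> answers;
    for (long long k : requests) {
        for (long long p : borrowed) {   // at most N entries, ascending
            if (p <= k) ++k;             // a taken book before us shifts k
            else break;                  // later entries are even bigger
        }
        borrowed.insert(k);
        answers.push_back(k);            // original 1-based position
    }
    return answers;
}
```

For example, with requests {2, 2, 1, 1} on any shelf of at least four books, the original positions handed out are 2, 3, 1, 4. Since N <= 4000, the N^2 inner loop stays around 16 million steps, independent of M.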
andywestken (1950)
Any idea what the input and output were when you exceeded the time limit?
A 06 (10)
@brandonator There is no need for cin.get() because the program is compiled and run on an online judge.

@andywestken They must be near the limits of M and N: "You may assume that 1 ≤ M ≤ 1000000 and 1 ≤ N ≤ 4000."

@ne555 I think O(N^2) will be okay, because N ≤ 4000.

@jumper007 I will surely try it, thank you.
Duoas (6734)
Keep in mind that remove_if() doesn't actually modify the size of your container. Also, vector::erase is linear, i.e. O(n).
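A small sketch of that first point (names and the predicate are illustrative): remove_if only compacts the kept elements to the front and returns the new logical end, so size() is unchanged until you erase the tail yourself.

```cpp
#include <algorithm>
#include <cassert>
#include <utility>
#include <vector>

// Shows that remove_if() does not shrink the container: it returns the
// new logical end, and size() stays the same until erase() trims the tail.
// Returns {size() after remove_if, logical size from the returned iterator}.
std::pair<std::size_t, std::size_t> removeIfSizes(std::vector<int> v)
{
    auto newEnd = std::remove_if(v.begin(), v.end(),
                                 [](int x) { return x > 2; });
    return { v.size(),
             static_cast<std::size_t>(newEnd - v.begin()) };
}
```

Calling it on {1, 2, 3, 4} yields {4, 2}: four elements still stored, but only two survive the predicate.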
A 06 (10)
I tried using both @jumper007's and @ne555's methods but they didn't work. Isn't there a particular algorithm for solving problems like this?
A 06 (10)
@Duoas vector::erase is linear, but I call it N times and N <= 4000, so it becomes very slow.
Duoas (6734)
All I was saying is that remove_if() isn't going to perform any better than vector::erase(). I don't think you are going to find any magic STL answer to this one. You may have to reconsider how you do it.

You might get a little better performance out of a std::deque for this. (I don't have time to give it any more thought than that tonight, sorry.)
rollie (304)
Okay, so your problem is the complexity of the erases. Here's an idea: store your data in a vector, but don't erase anything. However, also store a set<int> of every checkout processed. Then, when you read a new checkout, you increase its value by 1 for every checkout in that set<int> that is less than or equal to the checkout you just read.

Not 100% sure this is right, but I think something along these lines will work.
Gulshan Singh (46)
You should check out the erase-remove idiom for faster erasing: http://en.wikipedia.org/wiki/Erase-remove_idiom
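A minimal sketch of the idiom from the linked article (the predicate and names are illustrative): one O(N) remove_if pass compacts the survivors, then a single erase() call trims the unwanted tail.

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// The erase-remove idiom: remove_if shifts the kept elements to the front
// in one O(N) pass and returns the new logical end; one erase() call then
// destroys the leftover tail in a single operation.
std::vector<int> dropEvens(std::vector<int> v)
{
    v.erase(std::remove_if(v.begin(), v.end(),
                           [](int x) { return x % 2 == 0; }),
            v.end());
    return v;
}
```

So dropEvens({1, 2, 3, 4, 5, 6}) gives {1, 3, 5}. Note this relies on a fixed predicate known up front, which is exactly the sticking point raised later in the thread for the BOOKLIST problem.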
rollie (304)
@Gulshan The OP is basically doing that already; the issue is that the erase is expensive for a vector.
Gulshan Singh (46)
I thought the idiom moved the elements you don't want to the end and then erased them, which would be linear time for the search and then constant time for the removal?
rollie (304)
It's not constant, it's linear, because it has to move all the elements after the erased one: http://www.cplusplus.com/reference/stl/vector/erase/

And you can't use remove/remove_if in this case, because there is no simple predicate you can provide to correctly identify the elements to erase: future removals are affected by previous ones.
Gulshan Singh (46)
You don't have to shift elements if you've already moved them to the end. That's the key to the idiom.

And I actually didn't read the question; I just saw an interesting discussion on remove/erase and wanted to jump in.
rollie (304)
For a vector, inserting at the end is not necessarily a free operation (and it isn't performed by remove or remove_if), as the vector may have to be expanded.

Also, say you erase aVector[5] in a 10-element vector: how would a user iterating over the elements know that the element at aVector[5] is now 'erased'? In order to erase an element from contiguous memory, all elements after it must be moved as well, making it O(n).
cire (1851)
One could store a little more information in the vector, such as whether or not the book has been loaned out, and then just skip those entries when looking for the n-th book. Something like:

No erasing required.
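cire's actual snippet is not preserved in this copy of the thread; the following is a plausible reconstruction of the "mark it loaned, never erase" idea, with all names my own. Note that each lookup still scans up to M entries, which is the cost ne555 objects to below.

```cpp
#include <cassert>
#include <vector>

// Reconstruction (not cire's original code) of the no-erase approach:
// keep one "loaned" flag per book and skip flagged slots when counting.
struct Shelf {
    std::vector<bool> loaned;                 // one flag per book, all false

    explicit Shelf(std::size_t m) : loaned(m, false) {}

    // Borrow the n-th (1-based, n >= 1) still-available book; return its
    // original 1-based position, or 0 if fewer than n books remain.
    std::size_t borrowNth(std::size_t n) {
        for (std::size_t i = 0; i < loaned.size(); ++i) {
            if (!loaned[i] && --n == 0) {     // count only available books
                loaned[i] = true;             // mark instead of erasing
                return i + 1;
            }
        }
        return 0;                             // not enough books left
    }
};
```

On a five-book shelf, borrowing the 2nd book twice in a row hands out original positions 2 and then 3, matching what repeated vector::erase would produce, without moving any elements.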
ne555 (4041)
The problem is not the erasing. The problem is having to traverse the big array (size M) for every element. That makes a total time of O(N*M).

@rollie: that's kind of my suggestion. However, your algorithm has a mistake: `newCheckoutIndex' should not exist.
JLBorges (1336)
> @ne555
> There is an easy O(N^2) approach where you consider only the taken books (not sure if it's fast enough):
> suppose that you want to borrow the K-th book.
> If there is a J, with J <= K, that was borrowed, then you need to take the (K+1)-th book.

+1

Haven't thought about it deeply enough, but just a hunch: wouldn't something like a suffix tree or suffix array reduce it to O(N log N)?