Editorial for Baltic OI '07 P6 - Sequence
Submitting an official solution before solving the problem yourself is a bannable offence.
Simple Observations
The definition of the reduce operation from the problem statement can be rephrased in the following way. We have points on a line (corresponding to the elements of the sequence), and line segments connecting pairs of adjacent points. In the beginning, all line segments are erased, so each point forms a singleton set. A single reduction is equivalent to drawing back one of these line segments, which results in joining the two sets that contain the endpoints of this segment; the cost of this operation equals the maximal element of the resulting set. This interpretation of the reduce operation will simplify the descriptions of some of the solutions below (in this version the sequence never becomes shorter).
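For a small example (my own, not from the statement): for the sequence $(3, 1, 2)$, drawing the segment between the last two points first joins $\{1\}$ and $\{2\}$ at cost $2$, and drawing the remaining segment then joins $\{3\}$ and $\{1, 2\}$ at cost $3$, for a total of $5$; performing the two joins in the opposite order costs $3 + 3 = 6$, so the order of reductions matters.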
Dynamic Programming Solution
A segment of a sequence is its contiguous subsequence, i.e. $a_i, a_{i+1}, \ldots, a_j$ for some $1 \le i \le j \le n$. In the simplest solution, for each segment of the sequence we compute the minimal cost of reducing it to a single element (let us denote this value by $c(i, j)$ for the segment $a_i, \ldots, a_j$). Obviously, for every $i$ we have $c(i, i) = 0$. If $i < j$, then the last operation in the process of reducing this segment to a single element always costs $\max(a_i, \ldots, a_j)$; in our solution we can only choose the position of this reduction, which is, in other words, the position of the line segment drawn during this reduction. That is why for $i < j$ the formula $c(i, j) = \min_{i \le k < j} \bigl( c(i, k) + c(k+1, j) \bigr) + \max(a_i, \ldots, a_j)$ holds. Using this formula we can compute all values of the array $c$ from the shortest to the longest segments. The total time complexity of this solution is $O(n^3)$, because there are $O(n^2)$ fields of the array $c$ and computing each of them requires $O(n)$ time. This solution scores only a fraction of the available points.
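The following C++ sketch (my own; the names `dp_cost`, `mx` and `c` are illustrative) implements this recurrence on a 0-indexed array, filling the table from shorter to longer segments:

```cpp
// Sketch of the O(n^3) dynamic programming solution described above.
#include <bits/stdc++.h>
using namespace std;

long long dp_cost(const vector<long long>& a) {
    int n = a.size();
    // mx[i][j] = max(a_i, ..., a_j); c[i][j] = minimal cost of reducing a_i, ..., a_j
    vector<vector<long long>> mx(n, vector<long long>(n, 0)), c(n, vector<long long>(n, 0));
    for (int i = 0; i < n; ++i) {
        mx[i][i] = a[i];
        for (int j = i + 1; j < n; ++j) mx[i][j] = max(mx[i][j - 1], a[j]);
    }
    for (int len = 2; len <= n; ++len)
        for (int i = 0; i + len - 1 < n; ++i) {
            int j = i + len - 1;
            long long best = LLONG_MAX;
            for (int k = i; k < j; ++k)          // k = position of the last reduction
                best = min(best, c[i][k] + c[k + 1][j]);
            c[i][j] = best + mx[i][j];
        }
    return c[0][n - 1];
}

int main() {
    cout << dp_cost({3, 1, 2}) << '\n';          // prints 5
}
```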
A Greedy Observation
To achieve a better time complexity we need to introduce some greediness. Let us assume for simplicity that $a_0 = a_{n+1} = \infty$. We claim that performing $n - 1$ repetitions of the following step:
find $i$ such that $a_{i-1} \ge a_i \le a_{i+1}$
if ($a_{i-1} \le a_{i+1}$) then reduce($i - 1$) else reduce($i$)
gives us an optimal reduction scheme for our sequence.
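As a worked example (my own), take the sequence $(5, 2, 4, 3)$. The first local minimum is $a_2 = 2$; its smaller neighbour is $a_3 = 4$, so we perform reduce($2$) at cost $4$, obtaining $(5, 4, 3)$. Next, $a_3 = 3$ is a local minimum with smaller neighbour $4$, so reduce($2$) costs $4$ and gives $(5, 4)$. The final reduction costs $5$, so the total cost is $4 + 4 + 5 = 13$, which matches the value computed by the dynamic programming formula.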
Proof
Firstly, we need to prove that in each of the steps finding a required value of $i$ is possible. In each step, let us start with $i = 1$ and, while the condition $a_i > a_{i+1}$ holds, increase the value of $i$ by one. This process will certainly terminate, because $a_n \le a_{n+1}$ (since $a_{n+1} = \infty$). The value of $i$ at the moment of termination is an example of an index that we were looking for: $a_i \le a_{i+1}$ holds because the loop stopped, and $a_{i-1} \ge a_i$ holds because either $i = 1$ (and $a_0 = \infty$) or the loop moved past $i - 1$.
Now let us find out why this reduction scheme is optimal. We will prove this by induction on the value of $n$. If $n = 2$, then this solution is obviously optimal, with cost $\max(a_1, a_2)$. If $n > 2$, then we need to prove that there exists an optimal reduction scheme of our sequence in which the reduction determined by our greedy step is performed first. Let us assume that $a_{i-1} \le a_{i+1}$ (the other case is analogous). Let $R$ be any optimal reduction scheme of our sequence in which the reduction between $a_{i-1}$ and $a_i$ is not the first reduction (if no such scheme exists, then we are done). Let us consider two reductions: the one defined by the line segment between $a_{i-1}$ and $a_i$ (we will call it $r_1$) and the one between $a_i$ and $a_{i+1}$ (called $r_2$). If neither $r_1$ nor $r_2$ is first in $R$, then we can move the first of them that appears in $R$ (let $r$ be this reduction) to the beginning of $R$. After this operation, the cost of $R$ does not increase. Why? Firstly, the cost of $r$ can only decrease or stay the same (the sooner a reduction is performed, the lower its cost). Secondly, performing this reduction earlier will not change the cost of any reduction performed in $R$ before the second of the reductions $r_1$ and $r_2$, because inserting $a_i$ into a set containing either $a_{i-1}$ or $a_{i+1}$ does not change the maximal element of this set (recall that $a_{i-1} \ge a_i \le a_{i+1}$), and only reductions involving this set could change their costs. On the other hand, after both $r_1$ and $r_2$ have been performed, the costs of all following reductions will not change (all three elements $a_{i-1}$, $a_i$ and $a_{i+1}$ are in a single set from then on).
If $r = r_1$, then we are done now. In the opposite case, we will prove that swapping $r_2$ (which is now at the beginning of $R$) and $r_1$ does not increase the cost of $R$. Let us notice that after this swap the cost of all reductions in $R$ apart from $r_1$ and $r_2$ is exactly the same as before (the argument is similar to the one from the previous fragment of the proof). The second of those two reductions costs exactly the same in both cases (before and after the swap), as its resulting set is the same. So the change of the cost of $R$ is $a_{i-1} - a_{i+1} \le 0$: if this change is negative, we get a contradiction with the optimality of $R$, and if it is zero, the swapped scheme is also optimal and begins with the reduction chosen by the greedy step, which finishes the proof.
Implementations of the Greedy Method
A straightforward implementation of the greedy approach (simply performing one step after another on a sequence represented as an array of integers) leads to an $O(n^2)$ solution, since each of the $n - 1$ steps requires a linear scan to find the index $i$. This solution still does not score the maximum number of points.
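A sketch of this straightforward implementation is given below (the name `greedy_naive` is mine); it literally repeats the greedy step on a shrinking array, treating the missing neighbours at both ends as $+\infty$:

```cpp
// Sketch of the straightforward O(n^2) greedy: repeatedly find a local
// minimum and reduce it with the smaller of its two neighbours.
#include <bits/stdc++.h>
using namespace std;

long long greedy_naive(vector<long long> a) {
    long long cost = 0;
    while (a.size() > 1) {
        size_t i = 0;
        // advance while a[i] > a[i+1]; this stops at a local minimum
        while (i + 1 < a.size() && a[i] > a[i + 1]) ++i;
        // choose the smaller neighbour (missing neighbours count as +infinity)
        bool mergeLeft = i > 0 && (i + 1 == a.size() || a[i - 1] <= a[i + 1]);
        size_t j = mergeLeft ? i - 1 : i;        // reduce(j) merges a[j] and a[j+1]
        cost += max(a[j], a[j + 1]);
        a[j] = max(a[j], a[j + 1]);
        a.erase(a.begin() + j + 1);
    }
    return cost;
}

int main() {
    cout << greedy_naive({5, 2, 4, 3}) << '\n';  // prints 13
}
```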
An implementation of this algorithm with complexity $O(n)$ can be obtained. In this solution, the sequence is analyzed from left to right, from $a_2$ to $a_{n+1}$ (recall that $a_{n+1} = \infty$). Throughout the algorithm, a stack containing a decreasing subsequence of $a_0, a_1, \ldots, a_{i-1}$ is maintained. In the beginning, the stack contains only the values $a_0 = \infty$ and $a_1$. In each of the following steps (for $i = 2, 3, \ldots, n+1$), if $a_i$ is smaller than the element on the top of the stack, then $a_i$ is simply pushed onto the stack; in the opposite case, all possible reductions are made. In other words, as long as the element on the top of the stack is not greater than $a_i$, this element is reduced with the smaller of the two elements $a_i$ and the element second from the top of the stack (so the cost of this reduction is the smaller of those two values). This operation pops the element from the top of the stack and the process continues. The loop stops at one of the following conditions:
if $i = n + 1$ and the stack contains only two elements, one of which must be the sentinel $a_0 = \infty$ — this terminates the algorithm, the sequence has been reduced to a single element,
if $a_i$ is smaller than the element on the top of the stack (in this case $a_i$ is pushed onto the stack) — this terminates the loop in the case $i \le n$; there will surely be such a moment in the loop for every $i \le n$, because $a_i < a_0 = \infty$ and the sentinel never leaves the bottom of the stack.
Even though a single step of the algorithm (for a given $i$) may take $O(n)$ operations, the whole algorithm has complexity $O(n)$. This is true because the number of iterations of the inner loop is bounded by the total number of elements that are pushed onto the stack, and the latter is $O(n)$, because in every step only one element is pushed onto the stack.
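A minimal C++ sketch of this stack-based implementation follows (the name `reduce_cost` and the use of a `vector` as the stack are my own choices). On the sequence $(5, 2, 4, 3)$ from the earlier example it pops $2$ against the neighbours $4$ and $5$ (cost $4$), and while processing the sentinel $a_{n+1} = \infty$ it pops $3$ (cost $4$) and then $4$ (cost $5$), giving the total $13$ as before:

```cpp
// Sketch of the O(n) stack-based greedy: the stack holds a strictly
// decreasing subsequence, with the sentinel a_0 = +infinity at the bottom.
#include <bits/stdc++.h>
using namespace std;

long long reduce_cost(const vector<long long>& a) {
    const long long INF = numeric_limits<long long>::max();
    int n = a.size();
    vector<long long> st = {INF, a[0]};   // a_0 = +infinity, then a_1
    long long cost = 0;
    for (int i = 1; i <= n; ++i) {        // processes a_2, ..., a_n and finally a_{n+1} = +infinity
        long long x = (i < n) ? a[i] : INF;
        // reduce the top with the smaller of x and the element below it;
        // stop when only the sentinel and one more element remain at the very end
        while (st.back() <= x && !(x == INF && st.size() == 2)) {
            st.pop_back();
            cost += min(x, st.back());
        }
        if (i < n) st.push_back(x);       // x is now smaller than the new top
    }
    return cost;
}

int main() {
    cout << reduce_cost({5, 2, 4, 3}) << '\n';   // prints 13
}
```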
Comments