We start by discussing what was left from last week - the 10th Open Cup round (problems, results, top 5 on the left). The announcement of its results was delayed because of issues in the following problem (problem B in the statements): you are given n integer lengths, and need to construct any convex polygon with the given side lengths. This is a nice, if standard, problem. The most common approach is to build a polygon that is inscribed in a circle, and find the radius of that circle using binary search (if all sides cover more than 2π, increase the radius, otherwise decrease it; handle the case where one side covers more than π carefully). The devil is in the details: as there were up to 10^5 sides of length up to 10^5, the radius of the circle had to be more than 10^9, which led to awful precision issues given that a precision of 10^-5 was required in the output, and the jury's checker program was not free of such issues either. I think this is a common mistake that contest authors make: they set the problem's limits so that their solution barely runs in time/space/precision. Instead, one should set the limits taking into account both the correct solutions and possible wrong solutions. In this problem, I don't see any wrong solution of polynomial complexity, so having just 100 sides of length up to 100 would make the problem much better: the precision issues would go away, and one would still need to come up with the binary search and properly handle the corner cases.
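
In case it helps to make the binary search concrete, here is a minimal sketch (the function names are mine, and it only handles the case where the circle's center lies inside the polygon; the case where one side covers more than π needs a second, similar search):

```python
import math

def circum_radius(sides, iters=100):
    # Binary search the circle radius: as the radius grows, the total
    # central angle covered by the sides shrinks, so we look for the
    # radius at which it equals exactly 2*pi.
    lo, hi = max(sides) / 2.0, float(sum(sides))  # radius is at least half the longest side

    def total_angle(r):
        # A chord of length s on a circle of radius r subtends 2*asin(s / (2r)).
        return sum(2 * math.asin(min(1.0, s / (2 * r))) for s in sides)

    for _ in range(iters):
        mid = (lo + hi) / 2
        if total_angle(mid) > 2 * math.pi:
            lo = mid   # sides wrap around more than the full circle: enlarge it
        else:
            hi = mid
    return (lo + hi) / 2

def build_polygon(sides):
    # Place the vertices on the circle one by one, advancing by each
    # side's central angle.
    r = circum_radius(sides)
    pts, ang = [], 0.0
    for s in sides:
        pts.append((r * math.cos(ang), r * math.sin(ang)))
        ang += 2 * math.asin(min(1.0, s / (2 * r)))
    return pts
```

For a 3-4-5 triangle, for example, this converges to the circumradius 2.5, and the resulting vertices are the corners of that triangle.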

This week had just one contest: TopCoder SRM 611 (problems, results, top 5 on the left). I've skipped it, so I won't go into detail about its problems.

Instead, let's talk about last week's medium problem that I mentioned in my previous post: you are given several tasks, the i-th task consuming a_i energy when started and returning b_i energy when completed (but energy is not allowed to become negative while performing a task), with b_i < a_i. What is the maximum number of tasks you can perform given your starting energy?

Such problems are often solved by considering the following natural question: consider the final order in which we are solving the tasks. In order for the solution to be optimal, swapping two adjacent tasks in that order should not make it better (for some meaning of "better"). So let's see what happens when we swap two adjacent tasks in that order.

Let's say the first task consumes a_1 energy and returns b_1 energy, and the second task consumes a_2 energy and returns b_2 energy. For the first task to be doable, we need at least a_1 energy, and our energy decreases by a_1 - b_1 overall. For the second task to be doable after the first, we need at least a_2 + a_1 - b_1 energy in the beginning (since a_1 - b_1 was already spent on the first task). If we do them in the reverse order, the same argument shows we need at least a_1 + a_2 - b_2 energy in the beginning. The a_1 + a_2 part is the same in both constraints, so the only difference is -b_1 vs -b_2: if b_1 >= b_2, then -b_1 <= -b_2, and thus a_2 + a_1 - b_1 <= a_1 + a_2 - b_2, so doing the second task after the first has a lower or equal requirement on the starting energy than the reverse order, and it never makes sense to do them in the reverse order. That means we can sort the tasks in non-increasing order of b_i, breaking ties arbitrarily, and then only consider solving them in the sorted order.
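
As a quick sanity check of this exchange argument, with made-up numbers:

```python
# Two tasks: (a1, b1) = (5, 3) and (a2, b2) = (6, 1), so b1 >= b2.
# Doing task 1 first needs max(a1, a1 - b1 + a2) = max(5, 8) = 8 starting energy;
# the reverse order needs max(a2, a2 - b2 + a1) = max(6, 10) = 10.
def starting_energy_needed(order):
    need, spent = 0, 0
    for a, b in order:
        need = max(need, spent + a)   # must still have at least `a` when starting this task
        spent += a - b                # net energy lost by this task
    return need

assert starting_energy_needed([(5, 3), (6, 1)]) == 8
assert starting_energy_needed([(6, 1), (5, 3)]) == 10
```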

Having made this observation, the rest is relatively easy: we can do Dynamic Programming using (number of tasks solved, current amount of energy) as the state.
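
One convenient way to implement this DP (a sketch under my own naming, assuming integer inputs) is to note that for a fixed number of tasks solved, only the maximum remaining energy matters, so it suffices to keep that maximum per count:

```python
def max_tasks(tasks, start_energy):
    # tasks: list of (a, b) pairs with b < a.
    # dp[k] = maximum energy we can have left after completing k tasks,
    # considering the tasks in the sorted order.
    tasks = sorted(tasks, key=lambda ab: ab[1], reverse=True)  # non-increasing b
    NEG = float('-inf')
    dp = [NEG] * (len(tasks) + 1)
    dp[0] = start_energy
    for a, b in tasks:
        for k in range(len(tasks) - 1, -1, -1):  # downward, so each task is used once
            if dp[k] >= a:                        # enough energy to start the task
                dp[k + 1] = max(dp[k + 1], dp[k] - a + b)
    return max(k for k in range(len(tasks) + 1) if dp[k] > NEG)
```

With the two tasks from the example above and 8 starting energy both tasks fit, while with 7 only one does, matching the exchange-argument calculation.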

This simple and somewhat obvious idea - to consider what happens when we swap two tasks - is often crucial to solving this kind of problem. Sometimes it leads to a solution outright, and sometimes it can give a clue to what the next step is.

Thanks for reading, and see you next week! Also, here's a sunny Moscow landscape I see - spring is here!

