Match Editorial
SRM 433
Wednesday, January 21st, 2009

Match summary

SRM 433 had an unbelievable challenge phase - pmnox outchallenged Petr and won the match by a mere 5 points. TopCoder titans ACRush and misof were a hundred points behind. In Division 2, fatrat1695 dominated the field winning by more than 100 points, with veteran Uranium-235 and NewForce coming second and third, respectively. The Division 2 - 1000 problem appeared to be quite hard, with only 12 correct submissions.

The Problems

RoyalTreasurer
Used as: Division Two - Level One:
Value 250
Submission Rate 768 / 824 (93.20%)
Success Rate 728 / 768 (94.79%)
High Score devil911 for 249.98 points (0 mins 13 secs)
Average Score 220.79 (for 728 correct submissions)

This was a pretty classical problem. If you want to minimize the sum of Ai * Bi, you should multiply the smallest Ai by the largest Bi, the second-smallest Ai by the second-largest Bi, and so on. In other words, if Bi > Bj, then Ai must not be greater than Aj (you can prove this by contradiction). The Java code follows:

	public int minimalArrangement(int[] A, int[] B) {
		// Sort both arrays so the smallest A[i] is paired with the largest B[i].
		Arrays.sort(A);
		Arrays.sort(B);
		int res = 0;
		for (int i = 0; i < A.length; i++)
			res += A[i] * B[B.length - i - 1];
		return res;
	}

MagicWords
Used as: Division Two - Level Two:
Value 500
Submission Rate 276 / 824 (33.50%)
Success Rate 71 / 276 (25.72%)
High Score shyym for 441.43 points (10 mins 38 secs)
Average Score 251.21 (for 71 correct submissions)
Used as: Division One - Level One:
Value 250
Submission Rate 542 / 637 (85.09%)
Success Rate 328 / 542 (60.52%)
High Score jzd for 243.28 points (4 mins 44 secs)
Average Score 161.26 (for 328 correct submissions)

This problem was harder than usual for this slot. Nevertheless, there are at least 3 solutions which are guaranteed to pass any test under the given constraints.

  1. Let us call a string S which has exactly K shifts equal to itself a K-magic string. You may notice that S being K-magic is equivalent to "S can be divided into K equal substrings and cannot be divided into X > K equal substrings". For example, the string S = "ABCABCABCABC" is 4-magic because it consists of 4 "ABC"-s and cannot be divided into a larger number of equal substrings.

    Therefore, to check whether a string S of length L is K-magic, first check whether S is a concatenation of K equal substrings. If it is not, then S is definitely not K-magic. If it is, S may still be divisible into X > K equal substrings. To ensure S is K-magic, S must not be a concatenation of i equal substrings (each of length L / i) for any i between (K + 1) and L. Each such check can be done in O(L) if i divides L, and in constant time if not, which gives a total estimate of O(L * sqrt(L)). Multiplying this by the number of permutations of the input strings, we get the final time estimate of O(N! * L * sqrt(L)). nika's solution is an example of this approach.

  2. Another approach is to use KMP or some other string-searching algorithm. After choosing a permutation from the N! possibilities and obtaining the concatenation S (of length L) of the permuted strings, take the string S2 = S + S. The number of indices i (0 <= i < L) such that S = S2.substr(i, L) is exactly the number of shifts of S equal to the original. Using the Knuth-Morris-Pratt algorithm, you can find this number in O(L), so the overall complexity is O(N! * L).
  3. You may also write an inefficient algorithm and optimize it using the following trick. Let us permute the input strings in the order {2,3,0,1} and concatenate them to receive a string S. If S is K-magic, then the concatenations for the permutations {3,0,1,2}, {0,1,2,3} and {1,2,3,0} will be K-magic as well (all these concatenations are cyclic shifts of each other). Therefore, your solution may check only the permutations of the input which leave the first element of the input in its place, and then multiply the result by the total number of elements in the input.
Besides these approaches, many coders had solutions using hashing, memoization or other coding optimizations.
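The shift-counting step of the second approach can be sketched as follows (a naive scan stands in for KMP for brevity, so this sketch is O(L^2) per string; the class and method names are hypothetical):

```java
public class MagicCheck {
    // Count how many cyclic shifts of s are equal to s itself (the "K" in K-magic).
    // A shift by i equals s exactly when (s + s).substring(i, i + L) equals s,
    // so we scan s + s at offsets 0..L-1. KMP would make this O(L) overall;
    // the naive scan below is O(L^2) but enough to show the idea.
    static int countEqualShifts(String s) {
        int L = s.length();
        String doubled = s + s;
        int count = 0;
        for (int i = 0; i < L; i++)
            if (doubled.substring(i, i + L).equals(s)) count++;
        return count;
    }

    public static void main(String[] args) {
        System.out.println(countEqualShifts("ABCABCABCABC")); // 4-magic
        System.out.println(countEqualShifts("ABAB"));         // 2-magic
        System.out.println(countEqualShifts("ABCD"));         // 1-magic
    }
}
```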

MakingPotions
Used as: Division Two - Level Three:
Value 1000
Submission Rate 63 / 824 (7.65%)
Success Rate 12 / 63 (19.05%)
High Score osan for 691.84 points (21 mins 2 secs)
Average Score 476.86 (for 12 correct submissions)

This problem splits into two independent parts: parsing the input into a suitable data structure (which should be easy for most coders attempting problems at this level) and computing the price of LOVE. We'll concentrate on the latter.

Computing the price of any single potion is simple - either the potion can be bought on the market, or there is a recipe which produces it. Unfortunately, you need to compute the prices of recipes in a specific order - for example, if you need WATER to get COFFEE, then you cannot compute the price of COFFEE until you know the price of WATER. To make things worse, the dependencies may be cyclic, so you cannot topologically sort the potions to get an order of computation.

Fortunately, the constraints on this problem are quite low, so we can try using the following algorithm:

  1. Set the price of each potion to the market price (or to infinity, if the potion is unavailable on the market).
  2. For each recipe, compute the price of the resulting potion using the current prices of all ingredients. Update the current price of the resulting potion if needed.
  3. Return to point 2 if needed.

In general, the algorithm is not that hard. The only subtlety is the "if needed" part. How do we know whether we've reached the optimal price? The answer is very simple: if (in step 2) we tried all recipes and did not update the price of any resulting potion, then we've reached the optimum. See the fastest solution for an example implementation (note that osan didn't bother checking for an update on each loop).
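The relaxation loop above can be sketched as follows. The Recipe representation and all names here are hypothetical (the actual problem requires parsing the input into such a structure first):

```java
import java.util.*;

public class PotionPrices {
    // A recipe: the resulting potion is brewed from the listed ingredients.
    // (Hypothetical representation; the real problem needs input parsing.)
    static class Recipe {
        int result;          // index of the resulting potion
        int[] ingredients;   // indices of the required potions
        Recipe(int r, int... ing) { result = r; ingredients = ing; }
    }

    static final long INF = Long.MAX_VALUE / 4;  // "unavailable on the market"

    // Relax recipe prices until no potion's price improves (a fixed point).
    static long[] cheapestPrices(long[] marketPrice, List<Recipe> recipes) {
        long[] price = marketPrice.clone();      // step 1: start from market prices
        boolean updated = true;
        while (updated) {                        // step 3: repeat while anything changed
            updated = false;
            for (Recipe r : recipes) {           // step 2: try every recipe
                long cost = 0;
                for (int ing : r.ingredients)
                    cost = Math.min(INF, cost + price[ing]);
                if (cost < price[r.result]) {    // found a cheaper way to brew it
                    price[r.result] = cost;
                    updated = true;
                }
            }
        }
        return price;
    }

    public static void main(String[] args) {
        // Potion 0: WATER (market price 2); potion 1: COFFEE (not on the market);
        // one recipe: COFFEE = WATER + WATER.
        long[] market = {2, INF};
        long[] p = cheapestPrices(market, Arrays.asList(new Recipe(1, 0, 0)));
        System.out.println(p[1]); // 4
    }
}
```

Each full pass either improves at least one price or terminates the loop, so the process always reaches the optimum under the problem's small constraints.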

SettingTents
Used as: Division One - Level Two:
Value 500
Submission Rate 121 / 637 (19.00%)
Success Rate 91 / 121 (75.21%)
High Score Petr for 451.72 points (9 mins 29 secs)
Average Score 281.67 (for 91 correct submissions)

This problem is quite similar to IsoscelesTriangle, and so are the approaches to solving it.

The solution splits into two parts: a) checking all possible shapes of rhombs and b) computing the number of ways such a rhomb can be placed inside the designated area (if your implementation is not careful enough, there is also a part c): making sure you don't count any rhomb more than once). Subproblem b) is quite simple - compute the dimensions Bx*By of the bounding box of the rhomb; the number of ways the rhomb can be placed inside the N*M area is either (N - Bx + 1)*(M - By + 1), or zero (if Bx > N or By > M).
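Subproblem b) amounts to a one-liner; a minimal sketch (class and method names are hypothetical):

```java
public class TentPlacements {
    // Number of translations that fit a Bx-by-By bounding box inside an
    // N-by-M area: the box can be shifted independently along each axis,
    // or not at all if it does not fit.
    static long placements(int bx, int by, int n, int m) {
        if (bx > n || by > m) return 0;
        return (long) (n - bx + 1) * (m - by + 1);
    }

    public static void main(String[] args) {
        System.out.println(placements(2, 3, 5, 5)); // (5-2+1)*(5-3+1) = 12
        System.out.println(placements(6, 1, 5, 5)); // does not fit: 0
    }
}
```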

Let's find a way to check all possible rhomb shapes. To do that, let's start from the bottom vertex A of the rhomb (if a rhomb has 2 bottom vertices at the same level, choose the leftmost of them). Now we need to check all possible directions and lengths of the 2 edges coming out of vertex A. Checking all of them by brute force would be too slow, since there are 100*100 = 10000 possibilities for each edge. To optimize the search, we use the following trick: check all possibilities for the left of the two edges (10000 possibilities), and then 'rotate' it clockwise to check all possibilities for the second edge (the two edges must be of the same length).

'Rotation' can be implemented in the following way. Let the first edge be (x, y). We start incrementing x, checking the length of the new edge and adjusting y when needed (increasing y if the edge is too short and decreasing it if the edge is too long). Whenever the length of the second edge equals the length of the first, the two edges form a rhomb. We count the number of ways this rhomb can be placed inside the area (see subproblem b) above), and sum the totals over all possible directions of the first edge. The time complexity of this solution is O(N^3): checking all possible directions of the first edge takes O(N^2), and checking all rotations of one edge can be done in linear time.
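The rotation idea can be isolated into a small routine: as x grows, the y that keeps the length fixed can only shrink, so a single two-pointer sweep finds all lattice vectors of a given squared length in linear time. This sketch only demonstrates that sweep in the first quadrant (names are hypothetical; a full solution would still run it once per first edge, with the appropriate directions):

```java
import java.util.*;

public class EdgeRotation {
    // All lattice vectors (x, y) with x, y >= 0 and x*x + y*y == r2.
    // As x increases, the matching y only decreases, so one clockwise
    // sweep visits every candidate exactly once - linear time overall.
    static List<int[]> vectorsOfSquaredLength(long r2) {
        List<int[]> out = new ArrayList<>();
        int y = (int) Math.sqrt((double) r2) + 1;
        for (int x = 0; (long) x * x <= r2; x++) {
            while ((long) x * x + (long) y * y > r2) y--;  // rotate clockwise
            if ((long) x * x + (long) y * y == r2) out.add(new int[]{x, y});
        }
        return out;
    }

    public static void main(String[] args) {
        // Squared length 25 has four such vectors: (0,5), (3,4), (4,3), (5,0).
        for (int[] v : vectorsOfSquaredLength(25))
            System.out.println(v[0] + " " + v[1]);
    }
}
```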

BarbarianInvasion
Used as: Division One - Level Three:
Value 1000
Submission Rate 24 / 637 (3.77%)
Success Rate 11 / 24 (45.83%)
High Score pmnox for 809.92 points (14 mins 29 secs)
Average Score 619.75 (for 11 correct submissions)

This problem is solved with a maximum flow algorithm, so inexperienced coders may want to read the tutorial before going further.

Let's start by solving easier versions of the problem. For example, what if all detachments were of the same size? This is a classical minimal cut problem. Indeed - think of all barbarian lands outside the country as a single source, let the capital be the sink, and let the cells become the intermediate vertices of the graph. Two vertices in that graph are connected only if the enemy can move directly between the corresponding cells on the map. Since the flow can be limited only in the cells (the barbarians can be stopped in the cells, not between them), we need to eliminate vertex capacities as shown in the tutorial (see Figure 7). Let's call the edges added in this step (edges between vertices of the same color on Figure 7 in the tutorial) auxiliary edges. The capacities should be set to 1 for all auxiliary edges, and to infinity (in practice, a very large integer) for all other edges. Now the answer to this simplified problem is the size of the minimal cut in the graph.
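The vertex-splitting construction can be sketched as follows. The edge-list representation and all names here are hypothetical; a real solution would feed these edges into any max-flow/min-cut implementation:

```java
import java.util.*;

public class GraphBuild {
    static final int INF = Integer.MAX_VALUE / 2;  // "uncuttable" capacity
    // Each directed edge is {from, to, capacity}.
    final List<int[]> edges = new ArrayList<>();

    // Cell v is split into v_in (node 2v) and v_out (node 2v + 1).
    static int in(int cell)  { return 2 * cell; }
    static int out(int cell) { return 2 * cell + 1; }

    // Auxiliary edge: cutting it means stationing a detachment in this cell.
    // Capacity 1 in the "all detachments equal" version; later, the size of
    // the detachment needed to block the cell.
    void addCell(int cell, int capacity) {
        edges.add(new int[]{in(cell), out(cell), capacity});
    }

    // Movement between adjacent cells cannot be blocked: infinite capacity
    // in both directions.
    void addMove(int cellA, int cellB) {
        edges.add(new int[]{out(cellA), in(cellB), INF});
        edges.add(new int[]{out(cellB), in(cellA), INF});
    }
}
```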

Let's now try another simplification. Forget about the "minimal number of detachments" requirement, and find the minimal total size of all detachments needed to protect the capital (now detachments may be of different sizes). This version differs only slightly from the previous one - the capacity of each auxiliary edge is changed to the size of the detachment needed to block the corresponding cell, while the capacities of all non-auxiliary edges remain infinite.

And the last step - we modify the problem solved in the previous paragraph so that the total number of detachments is minimized first, and the total size of all detachments second. So, using X detachments of total size S1 should be more expensive than using (X - 1) detachments of total size S2, even if S2 > S1. In other words, using an extra detachment should be very expensive, and we can simulate this by adding a very large number T (larger than any possible total size) to the sizes of ALL detachments. If the mincut in the corresponding graph equals M, this means we need [M / T] detachments of total size (M % T).
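A quick sanity check of the arithmetic, with an assumed T = 1000 that exceeds any possible total size: two detachments of sizes 3 and 7 contribute 2 * 1000 + (3 + 7) to the cut, and integer division and remainder recover both answers.

```java
public class DecodeCut {
    public static void main(String[] args) {
        long T = 1000;                  // must exceed any possible total size
        long minCut = 2 * T + 10;      // two detachments of sizes 3 and 7
        long detachments = minCut / T; // integer division recovers the count
        long totalSize   = minCut % T; // remainder recovers the total size
        System.out.println(detachments + " " + totalSize); // 2 10
    }
}
```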

See pmnox's solution for an example of this approach.

By Olexiy
TopCoder Member