### Introduction

The general four-step process for dynamic programming (see Cormen et al., *Introduction to Algorithms*):

• Characterize the structure of an optimal solution.
  • What are the subproblems? If you were able to solve the subproblems, how would you use them to solve the given problem?
  • What would be the smallest or easiest subproblem to solve? That is, what is the base case?
• Recursively define the value of an optimal solution.
• Compute the value of an optimal solution in a bottom-up fashion.
• If needed, construct an optimal solution from the computed information.
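As an illustration of the four steps, here is a bottom-up sketch for the classic rod-cutting problem (a hypothetical example, not taken from the text above): given a rod of length n and a price for each piece length, cut the rod to maximize revenue.

```python
def cut_rod(prices, n):
    """prices[i] is the price of a piece of length i + 1; returns the
    maximum revenue for a rod of length n, plus the list of cuts."""
    # Base case: a rod of length 0 earns nothing.
    best = [0] * (n + 1)
    first_cut = [0] * (n + 1)          # remembers each choice, for step 4
    # Recurrence (steps 1-2): best[j] = max over the first piece i of
    #   prices[i - 1] + best[j - i]
    for j in range(1, n + 1):          # step 3: compute bottom-up
        for i in range(1, j + 1):
            if prices[i - 1] + best[j - i] > best[j]:
                best[j] = prices[i - 1] + best[j - i]
                first_cut[j] = i
    cuts = []                          # step 4: reconstruct the solution
    while n > 0:
        cuts.append(first_cut[n])
        n -= first_cut[n]
    return best[-1], cuts
```

With the prices `[1, 5, 8, 9]` (made up for the sketch), a rod of length 4 is best cut into two pieces of length 2.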

### Applications of Dynamic Programming

Dynamic programming works very well for many examples that show up frequently in ACM Programming Contests:
• Knapsack problems - You have a bag with maximum weight capacity Wmax and a collection of items, each with a value v and weight w. Which items should you choose so as to maximize the total value while keeping the total weight at most Wmax? (Compare dynamic programming to other methods: greedy algorithms can lead to suboptimal solutions, and exhaustive search may take too long, exceeding the runtime limit.) Other knapsack-like problems include:
  • Making change: giving correct change using the minimum number of coins
  • Stacking boxes: building a structure of a given height using the minimum number of boxes
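A minimal 0/1 knapsack sketch (the item list is hypothetical; `items` holds (value, weight) pairs and the table is compressed to one row indexed by remaining capacity):

```python
def knapsack(items, w_max):
    """items is a list of (value, weight) pairs; returns the best total
    value achievable with total weight at most w_max."""
    best = [0] * (w_max + 1)    # best[w] = best value within weight budget w
    for value, weight in items:
        # iterate capacities downward so each item is used at most once
        for w in range(w_max, weight - 1, -1):
            best[w] = max(best[w], best[w - weight] + value)
    return best[w_max]
```

Iterating capacities upward instead would allow unlimited copies of each item, which is the unbounded-knapsack variant used for coin change.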
• Various problems involving sequences:
  • Longest Common Subsequence (LCS) - finding the longest subsequence (not necessarily contiguous) common to two given sequences.
  • Longest Increasing Subsequence (LIS) - finding a subsequence (not necessarily contiguous) of maximum length whose values are strictly increasing.
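A sketch of the standard LCS length recurrence (the input strings below are hypothetical; the same table also supports reconstructing an actual subsequence by walking backwards from the corner):

```python
def lcs_length(a, b):
    """Length of the longest common (not necessarily contiguous)
    subsequence of sequences a and b."""
    # table[i][j] = LCS length of the prefixes a[:i] and b[:j]
    table = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            if x == y:
                table[i][j] = table[i - 1][j - 1] + 1
            else:
                table[i][j] = max(table[i - 1][j], table[i][j - 1])
    return table[len(a)][len(b)]
```

The LIS admits a similar O(n²) recurrence (best length of an increasing subsequence ending at each index), or an O(n log n) solution using binary search over tails.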
• Partitioning problems
  • Balanced integer partitioning - Given a set of integers, partition them into two sets that are as balanced as possible; that is, |S1 - S2| is minimized, where S1 and S2 are the sums of the integers in each set, respectively. An example: given N files with sizes measured in bytes, fit them as evenly as possible across two disks, or determine whether it is even possible to fit them on two disks of given sizes.
  • Fair division of labor - Similar to the balanced partitioning problem: divide a set of cities between two individuals so that the total distance traveled by one is as close as possible to the distance traveled by the other.
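A subset-sum sketch for balanced partitioning (input values are hypothetical; the set of reachable subset sums plays the role of the DP table):

```python
def min_partition_diff(nums):
    """Minimum possible |S1 - S2| over all two-set partitions of nums."""
    total = sum(nums)
    reachable = {0}                    # subset sums achievable so far
    for x in nums:
        reachable |= {s + x for s in reachable}
    # the best S1 is the reachable subset sum closest to total / 2
    return min(abs(total - 2 * s) for s in reachable)
```

For the two-disk feasibility question, the same `reachable` set answers whether any subset sum fits one disk while the remainder fits the other.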
• All-Pairs Shortest Paths (Floyd-Warshall algorithm) - Given a set of cities and the distances between them, find the shortest path between every pair of nodes.
  • Useful modifications of the Floyd-Warshall algorithm include transitive closure, minimax path, maximin path, and safest path (based on survival probabilities from node to node).
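A sketch of Floyd-Warshall on an adjacency matrix (the matrix values in the usage below are hypothetical); the minimax, maximin, and safest-path variants change only the inner update, e.g. combining `max(dist[i][k], dist[k][j])` with a `min` comparison for minimax paths:

```python
def floyd_warshall(dist):
    """dist is an n x n matrix of direct distances, with float('inf')
    where there is no edge and 0 on the diagonal. Mutates dist into the
    all-pairs shortest distances and returns it."""
    n = len(dist)
    for k in range(n):                 # allow k as an intermediate node
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist
```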
• Edit distance - Given two strings A and B, compute the minimum cost to transform A into B using simple operations like insert, delete, and replace.
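A sketch of the edit-distance recurrence, assuming unit costs for all three operations (weighted costs drop into the same three terms):

```python
def edit_distance(a, b):
    """Minimum number of insert/delete/replace operations turning a into b."""
    prev = list(range(len(b) + 1))     # cost of building b[:j] from ""
    for i, x in enumerate(a, 1):
        cur = [i]                      # cost of erasing a[:i] down to ""
        for j, y in enumerate(b, 1):
            cur.append(min(prev[j] + 1,               # delete x
                           cur[j - 1] + 1,            # insert y
                           prev[j - 1] + (x != y)))   # replace, or free match
        prev = cur
    return prev[-1]
```

Keeping only two rows reduces memory from O(|A|·|B|) to O(|B|), which matters when only the distance (not the operation sequence) is needed.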
• Optimal binary search trees - Given a collection of keys and the probability of occurrence of each key, build a search tree that minimizes the expected cost of finding a node (frequently accessed keys sit near the root).
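A sketch of the optimal BST recurrence over access frequencies (the frequencies below are hypothetical; dummy keys for failed searches, as in the full CLRS formulation, are omitted for brevity):

```python
def optimal_bst_cost(freq):
    """freq[i] is the access frequency of the i-th key, keys in sorted
    order (freq must be non-empty). Returns the minimum total weighted
    search cost, where a key at depth d contributes freq * (d + 1)."""
    n = len(freq)
    pre = [0]                           # prefix sums: sum(freq[i..j]) in O(1)
    for f in freq:
        pre.append(pre[-1] + f)
    cost = [[0] * n for _ in range(n)]  # cost[i][j]: best BST over keys i..j
    for length in range(1, n + 1):
        for i in range(n - length + 1):
            j = i + length - 1
            best = min((cost[i][r - 1] if r > i else 0) +
                       (cost[r + 1][j] if r < j else 0)
                       for r in range(i, j + 1))      # try each root r
            # every key in i..j drops one level below the chosen root
            cost[i][j] = best + (pre[j + 1] - pre[i])
    return cost[0][n - 1]
```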
• Matrix chain multiplication - Given a chain of matrices to multiply, decide the order (parenthesization) of the multiplications that minimizes the total number of scalar multiplications.
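A sketch of the matrix-chain recurrence (the dimensions in the usage below are hypothetical; matrix i has dimensions `dims[i] x dims[i + 1]`):

```python
def matrix_chain_cost(dims):
    """Minimum number of scalar multiplications needed to compute the
    product of matrices 0..n-1, where matrix i is dims[i] x dims[i+1]."""
    n = len(dims) - 1                   # number of matrices in the chain
    cost = [[0] * n for _ in range(n)]  # cost[i][j]: best for product i..j
    for length in range(2, n + 1):
        for i in range(n - length + 1):
            j = i + length - 1
            # try every split point k: (i..k) times (k+1..j)
            cost[i][j] = min(
                cost[i][k] + cost[k + 1][j] + dims[i] * dims[k + 1] * dims[j + 1]
                for k in range(i, j))
    return cost[0][n - 1]
```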
• Binomial coefficients (combinations) - Solved using dynamic programming via Pascal's triangle. Problems involving many binomial-coefficient computations may benefit from computing Pascal's triangle once and keeping it in a cache.
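One way to build such a cache (the row limit is arbitrary; each entry is the sum of the two entries above it):

```python
def pascal(n_max):
    """Rows 0..n_max of Pascal's triangle; rows[n][k] == C(n, k)."""
    rows = [[1]]
    for n in range(1, n_max + 1):
        prev = rows[-1]
        rows.append([1] + [prev[k - 1] + prev[k] for k in range(1, n)] + [1])
    return rows
```

This avoids both factorial overflow in fixed-width integer languages and repeated recomputation across queries.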
• Catalan Numbers can be found iteratively (see below for an example). Many problems in combinatorics reduce to finding Catalan numbers.
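One iterative sketch uses the convolution recurrence C(0) = 1, C(n) = sum over i of C(i) * C(n - 1 - i), which counts, e.g., binary trees by splitting around the root:

```python
def catalan(n_max):
    """Catalan numbers C(0)..C(n_max) via C(n) = sum C(i) * C(n-1-i)."""
    cat = [1]                          # C(0) = 1
    for n in range(1, n_max + 1):
        cat.append(sum(cat[i] * cat[n - 1 - i] for i in range(n)))
    return cat
```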