# Dynamic Programming - Tackling Complex Problems

Some optimization problems are practical to solve when the problem is small but become extremely time consuming as the problem grows larger.

Dynamic Programming, when applicable, gets around this issue with the following approach.

• The overall problem is divided into subproblems which, being smaller, can be solved much faster.
• The solutions to the smallest subproblems are combined to form solutions to larger subproblems.
• The solutions to the larger subproblems are combined to solve even larger subproblems.
• This process of solution and combination is repeated until we have a solution to the original problem.
• As solutions to subproblems are computed, they are saved (cached); if the same subproblem needs to be solved again later, the answer is simply retrieved.

We will look at three problems of increasing complexity that illustrate the technique. The procedure is often subtle and varies somewhat from problem to problem, but once you grasp the ideas, Dynamic Programming is not hard to use.

## The Fibonacci sequence

The first and simplest program generates the Fibonacci sequence. This is a series of integers where each one is the sum of the previous two. It is found in nature and can even form spirals. It can be expressed with a very simple recursive program.

``````
def fibonacci (n) :
    if n < 2 : return n
    else     : return fibonacci(n-1) + fibonacci(n-2)
``````

The program is very short but also very inefficient. Still, it illustrates how saving answers to subproblems can speed things up when those answers are needed again (and again). As it stands, calculating fibonacci(30) will also calculate fibonacci(29) twice, and lower numbers over and over. But if we stash results, this all changes. Here we'll use a dictionary to accomplish that.

``````
known = {}

def fibonacci (n) :
    ans = known.get(n, None)
    if ans is not None : return ans
    if n < 2 : return n
    else :
        ans = fibonacci(n-2) + fibonacci(n-1)
        known[n] = ans
        return ans
``````
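Python's standard library offers the same caching idea ready-made: functools.lru_cache is, in effect, the known dictionary managed for us. This is a variation for comparison, not the version used in fibonacci.py:

```python
from functools import lru_cache

@lru_cache(maxsize=None)      # cache every result, like the known dict
def fibonacci(n):
    if n < 2:
        return n
    return fibonacci(n - 2) + fibonacci(n - 1)

print(fibonacci(30))   # 832040
```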

The difference is huge. Here is a table with both methods side by side.

``````Fibonacci_1                      Fibonacci_2
1:     1                         1:     1
2:     1                         2:     1
3:     2                         3:     2
4:     3                         4:     3
5:     5                         5:     5
6:     8                         6:     8
7:    13                         7:    13
8:    21                         8:    21
9:    34                         9:    34
10:   55                         10:   55
...                              ...
27: 196418                       27: 196418
28: 317811                       28: 317811
29: 514229                       29: 514229
30: 832040                       30: 832040
calls: 7049122 - 1355.503ms      calls: 88 - 0.087 ms``````

Your CPU time may vary, but the speedup should remain dramatic. The first version called itself over seven million times; the second version, just 88.

Here is the link to fibonacci.py

And here is an interesting page on math-is-fun about the Fibonacci sequence.

## Trail Maintenance

Our next problem is more challenging: finding an optimal contiguous sequence within a list of numbers.

Imagine you are contracting to maintain one section of a hiking trail. You are paid a fixed amount for each kilometer of your section. Some parts of the trail are flat and profitable; others climb hills or pass through woods, and sometimes your costs will exceed the payment. Here is a table showing the profitability of each of the 15 kilometers of the trail. If positive, you make money. If zero, you break even. If less than zero, you lose money. But since you must choose a single contiguous section, you may need to accept some losses if you can make them up later.

``````
profit for each kilometer
-3    0   -2    0    7    1   10    2  -12    0   -9    4   -4   10    2``````

Our program finds the start and end kilometers with the highest profit. Rather than work from the beginning (left to right), we'll start at the right and work to the left. This way we solve small problems whose solutions get combined and used to solve bigger ones.

The profit list is used to generate a promise list. Each value in the promise list means the following:

This is the best profit to be found for a sequence starting here.

Notice that there is no mention of where the sequence should end.

Here is the generated promise list.

``````
promise for each kilometer
15   18   18   20   20   13   12    2   -9    3    3   12    8   12    2
``````

If you study the profits and promises list together, you’ll begin to see patterns.

If we start at the last kilometer of the trail, both profit and promise are 2 (for that one kilometer). Moving one step to the left gives us a promise of 12 (2+10). Taking the last three brings it down to 8 (12-4). And so on. Here is the function makePromises.

``````
profits = [-3, 0, -2, 0, 7, 1, 10, 2, -12, 0, -9, 4, -4, 10, 2]
nsections = len(profits)

def makePromises(trail) :
    promises = [0]*nsections      # create array of same length
    last = len(trail)-1           # last section
    promises[last] = trail[last]  # ending value
    for i in range(last-1, -1, -1) :
        promises[i] = max(trail[i], trail[i]+promises[i+1])
    return promises
``````

The for loop marches from the end of the profits back to the beginning. The max call may take a bit of study. If promises[i+1] is negative, we start a new sequence here, cutting our losses. But if it is positive, we keep building on it.

Once we have the promises, all that is needed is to find the entry with the highest promise. That's our start point. Then we proceed to the right, adding up profits until we reach the promise value. That's our end point. We're done.

``````
def findSeq(promises) :
    best = max(promises)
    for start in range(nsections) :
        if promises[start] == best : break

    end = start; accum = profits[start]
    while accum < best :
        end += 1
        accum += profits[end]
    return start, end, best
``````
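Putting the two functions together gives the answer for the sample trail. This driver is a usage sketch; the exact driver in trail.py may differ:

```python
profits = [-3, 0, -2, 0, 7, 1, 10, 2, -12, 0, -9, 4, -4, 10, 2]
nsections = len(profits)

def makePromises(trail):
    promises = [0] * nsections
    last = len(trail) - 1
    promises[last] = trail[last]
    for i in range(last - 1, -1, -1):
        promises[i] = max(trail[i], trail[i] + promises[i + 1])
    return promises

def findSeq(promises):
    best = max(promises)
    for start in range(nsections):
        if promises[start] == best:
            break
    end = start; accum = profits[start]
    while accum < best:
        end += 1
        accum += profits[end]
    return start, end, best

start, end, best = findSeq(makePromises(profits))
print(start, end, best)   # 3 7 20 : kilometers 4 through 8, profit 20
```

The best section runs from the 0 at index 3 through the 2 at index 7, accepting the early break-even kilometer to capture the 7, 1, 10, 2 that follow.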

Here is the link to trail.py.

## The Minimum Cost Path Problem

In the diagram below we have a tree with a start node at the top, followed by 6 rows of nodes. The last row contains the leaf nodes. Every node has a visitation cost associated with it. The problem is to find the minimum-cost path from the start node to any one of the leaf nodes. As you travel down, you may choose to branch to the left or the right.

``````
01
07  05
08  03  05
04  05  07  04
07  06  02  03  07
02  09  08  02  02  01
06  02  06  02  01  07  09
``````

Since you make exactly six binary choices, you have 2⁶ or 64 unique paths.

One perfectly reasonable approach to finding the least cost path might be to simply generate the 64 numbers (from binary 000000 to 111111), and then traverse each corresponding path where a 0 means a left branch and a 1 a right branch. During each traversal we can total the cost of that path.

At the end you will have 64 paths and their corresponding costs. Simply select the path with the minimum cost.
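This brute-force scheme can be sketched in a few lines. The tree is the one from the diagram above, each of the 64 numbers is read bit by bit, 0 for a left branch and 1 for a right branch (a sketch, not part of minpath.py):

```python
tree = [[1],
        [7, 5],
        [8, 3, 5],
        [4, 5, 7, 4],
        [7, 6, 2, 3, 7],
        [2, 9, 8, 2, 2, 1],
        [6, 2, 6, 2, 1, 7, 9]]

def pathCost(tree, bits):
    """Total cost of one root-to-leaf path encoded in the bits of an int."""
    col, total = 0, tree[0][0]
    for row in range(1, len(tree)):
        col += (bits >> (row - 1)) & 1   # 0 = go left, 1 = go right
        total += tree[row][col]
    return total

costs = [pathCost(tree, bits) for bits in range(2 ** 6)]
print(min(costs))   # 19
```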

But with 14 rows instead of 7, there are 8192 paths. With 100 rows, a whopping 633825300114114700748351602688 (2⁹⁹) paths. That could take a while.

Clearly, this approach is reasonable only when dealing with trees with just a few rows. For anything more, something else is needed.

## Dynamic Solution for Minimum Path

In the trail problem we used lists for profits and promises. Here we need an extra dimension for a tree structure. That can be implemented with a list of lists.

Like the trail problem we will work from the end back to the beginning and turn the cost-tree into a promise-tree. The value of each node in the promise tree will be the lowest total cost to the bottom from that node.

Let’s start with a cost-tree with just two rows. You might notice that it’s the lower left corner of the tree above.

``````
02
06  02``````

There are just 2 paths, with a total cost of either (2+6) or (2+2). Clearly we want the latter, since we want the lowest cost.

Now we can make a promise tree that looks like this.

``````
promise tree

04          <-- lowest cost to the bottom
06  02
``````

where the bottom row doesn't change but the start node's cost has been replaced with its promise cost.

With 3 rows we first compute all promises left to right in the 2nd row, and then the promise for the start node from its cost plus the smaller of the values (04 11) in the second row.

``````
Cost tree         Promise tree
07                  11       (7+4 =11)
02  09              04  11     (2+2=4  and 9+2=11)
06  02  06          06  02  06
``````

The bottom row of the promise-tree duplicates the bottom row of the cost-tree. All other rows are calculated from the bottom up. Here is a set of trees, starting with the cost-tree and then, row by row, changing the tree into a promise-tree.

``````         7th row promised             6th row promised
01                           01
07  05                       07  05
08  03  05                   08  03  05
04  05  07  04               04  05  07  04
07  06  02  03  07           07  06  02  03  07
02  09  08  02  02  01       04  11  10  03  03  08  <--
06  02  06  02  01  07  09   06  02  06  02  01  07  09

5th to 7th                   4th to 7th
01                           01
07  05                       07  05
08  03  05                   08  03  05
04  05  07  04               15  10  12  10  <--
11  16  05  06  10    <--    11  16  05  06  10
04  11  10  03  03  08       04  11  10  03  03  08
06  02  06  02  01  07  09   06  02  06  02  01  07  09

3rd to 7th                   2nd to 7th (1st too)
01                           19
07  05                       20  18  <--
18  13  15 <--               18  13  15
15  10  12  10               15  10  12  10
11  16  05  06  10           11  16  05  06  10
04  11  10  03  03  08       04  11  10  03  03  08
06  02  06  02  01  07  09   06  02  06  02  01  07  09
``````

## Program minpath.py

The trees, both cost and promise, are created as nested lists. Lists are handy since they are mutable, so we can convert costs to promises on the fly in the same tree. This is what the tree above looks like in Python. On the left is a pretty-print format. The right is more useful for reading the code.

``````
tree1 = [                            tree1 = [
[             1             ],     [1],
[           7 , 5           ],     [7 , 5],
[         8 , 3 , 5         ],     [8 , 3 , 5],
[       4 , 5 , 7 , 4       ],     [4 , 5 , 7 , 4],
[     7 , 6 , 2 , 3 , 7     ],     [7 , 6 , 2 , 3 , 7],
[   2 , 9 , 8 , 2 , 2 , 1   ],     [2 , 9 , 8 , 2 , 2 , 1],
[ 6 , 2 , 6 , 2 , 1 , 7 , 9 ]]     [6 , 2 , 6 , 2 , 1 , 7 , 9 ]]
``````

At line 29, the for loop works rows from the 2nd-to-bottom up to the top (row 0 in Python). The inner for loop at line 30 works across the columns. The lesser of the left and right promise values is added to a node's cost value to set the promise value (line 34).

``````26 def minPath(tree) :
27     printTree(tree)
28     nrows = len(tree)
29     for row in range(nrows-2, -1, -1) :
30         for col in range(0, len(tree[row])) :
31             left = tree[row+1][col]
32             right= tree[row+1][col+1]
33             least = min(left,right)
34             tree[row][col] += least
35         printTree(tree)
36     print("Minimum path %s" % tree[0][0])``````

The function printTree generates the graphs of cost-trees, promise-trees, or those partially promised.

``````38 def printTree(tree) :
39     nrows = len(tree)
40     indent = nrows
41     for row in tree :
42         prow = [" %02d " % val for val in row]
43         line = "".join(prow)
44         print("%s %s" % ("  "*indent, line))
45         indent -= 1
46     print("")
``````

When you run minpath.py you can watch the solution come together from the bottom up, one row at a time.

``````$ python minpath.py
01
07  05
08  03  05
04  05  07  04
07  06  02  03  07
02  09  08  02  02  01
06  02  06  02  01  07  09

01
07  05
08  03  05
04  05  07  04
07  06  02  03  07
04  11  10  03  03  08    <-- this row updated first
06  02  06  02  01  07  09

six more until

19
20  18
18  13  15
15  10  12  10
11  16  05  06  10
04  11  10  03  03  08
06  02  06  02  01  07  09

Minimum path 19``````

Finally, just knowing the promised cost at a particular node doesn't directly tell us what path to take. But recovering the path is simple: from any node, choose the lesser promised cost in the row below and proceed in that direction.
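That rule can be coded directly. Here is a sketch (not part of minpath.py) that builds the promise tree and then walks down it, collecting the column chosen in each row:

```python
tree = [[1],
        [7, 5],
        [8, 3, 5],
        [4, 5, 7, 4],
        [7, 6, 2, 3, 7],
        [2, 9, 8, 2, 2, 1],
        [6, 2, 6, 2, 1, 7, 9]]

# Turn a copy of the cost tree into a promise tree, bottom row upward.
promise = [row[:] for row in tree]
for row in range(len(promise) - 2, -1, -1):
    for col in range(len(promise[row])):
        promise[row][col] += min(promise[row + 1][col],
                                 promise[row + 1][col + 1])

# Walk down, always moving toward the lesser promise in the row below.
col, path = 0, [0]
for row in range(1, len(promise)):
    if promise[row][col + 1] < promise[row][col]:
        col += 1
    path.append(col)

print(promise[0][0])   # 19
print(path)            # [0, 1, 1, 1, 2, 3, 4] -- column visited in each row
```

Summing the original costs along those columns (1+5+3+5+2+2+1) confirms the total of 19.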

## Comparison of Trail and Minimum Path

The Minimum path problem is almost a two-dimensional version of the trail problem above. There we saw that the number of possible sections is on the order of O(n²), where n is the length of the trail; without dynamic programming, doubling the length quadruples the work. In the Minimum path problem, adding just one more row to the tree doubles the number of paths. Without dynamic programming the work would be O(2ⁿ).
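The O(n²) figure for the trail comes from counting contiguous sections: a trail of n kilometers has n(n+1)/2 of them, every (start, end) pair with start ≤ end. A quick check (not from trail.py):

```python
def nsections(n):
    # number of contiguous sections of an n-kilometer trail
    return n * (n + 1) // 2

print(nsections(15))   # 120 sections for the 15 km trail
print(nsections(30))   # 465 -- roughly four times as many
```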

You can download the zip file for the project here.

If you have comments or suggestions, you can email me at mail me.

Copyright © 2014-2021 Chris Meyers and Fred Obermann

* * *