Dynamic Programming Lecture Slides


[Alignment figure: ocurrance vs. occurrence, 5 mismatches, 1 gap]
Key concept used by Search Engines to correct spelling errors. You get a response:
[Alignment figure: ocurrance vs. occurrence]
$$
OPT(i, w) =
\begin{cases}
0 & \text{if } i = 0 \\
OPT(i-1, w) & \text{if } w_i > w \\
\max\{\, OPT(i-1, w),\ v_i + OPT(i-1, w - w_i) \,\} & \text{otherwise}
\end{cases}
$$


8
Knapsack Problem: Bottom-Up
Knapsack. Fill up an n-by-W array.
Input: n, W, w1, …, wN, v1, …, vN

for w = 0 to W
   M[0, w] = 0

for i = 1 to n
   for w = 1 to W
      if (wi > w)
         M[i, w] = M[i-1, w]
      else
         M[i, w] = max { M[i-1, w], vi + M[i-1, w-wi] }

return M[n, W]
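The pseudocode above translates almost line for line into a runnable sketch; the Python below is my own illustration (function and variable names assumed), not code from the slides:

```python
def knapsack_bottom_up(values, weights, W):
    """Fill the (n+1)-by-(W+1) array M from the pseudocode above.

    values[i-1] and weights[i-1] correspond to item i (items are
    1-indexed on the slides); W is the knapsack capacity.
    """
    n = len(values)
    # M[i][w] = max value achievable using items 1..i with weight limit w
    M = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        vi, wi = values[i - 1], weights[i - 1]
        for w in range(1, W + 1):
            if wi > w:
                M[i][w] = M[i - 1][w]                  # item i does not fit
            else:
                M[i][w] = max(M[i - 1][w],             # skip item i
                              vi + M[i - 1][w - wi])   # take item i
    return M[n][W]

# Example instance from these slides: OPT = { 3, 4 } with value 40.
print(knapsack_bottom_up([1, 6, 18, 22, 28], [1, 2, 5, 6, 7], 11))  # 40
```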
Edit Distance
Applications.

Basis for Unix diff: determine how similar two files are.
Plagiarism detection: “document similarity” is a good indicator of whether a document was plagiarized.
Speech recognition: the sound pattern corresponding to a spoken word is matched against the sound patterns corresponding to various standard spoken words in a database.
Computational biology.
[Alignment figure: ocurrance vs. occurrence, 1 mismatch, 1 gap]
[Alignment figure: ocurrance vs. occurrence, 0 mismatches, 3 gaps]
13
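The slides show example alignments here but not (in this excerpt) the recurrence behind them. As a hedged sketch, the minimum penalty over all alignments can be computed with the standard edit-distance dynamic program; the function name and the unit mismatch/gap costs below are my own assumptions:

```python
def alignment_cost(a, b, gap=1, mismatch=1):
    """Minimum total penalty to align strings a and b, where each gap
    costs `gap` and each mismatched pair costs `mismatch`."""
    m, n = len(a), len(b)
    # D[i][j] = cost of aligning a[:i] with b[:j]
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        D[i][0] = i * gap                 # a[:i] aligned entirely against gaps
    for j in range(n + 1):
        D[0][j] = j * gap
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            pair = 0 if a[i - 1] == b[j - 1] else mismatch
            D[i][j] = min(D[i - 1][j - 1] + pair,   # match / mismatch
                          D[i - 1][j] + gap,        # gap in b
                          D[i][j - 1] + gap)        # gap in a
    return D[m][n]

print(alignment_cost("ocurrance", "occurrence"))  # 2
```

With unit costs, the best alignment of ocurrance and occurrence has total penalty 2, which matches the "1 mismatch, 1 gap" alignment shown above.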
String Similarity
How similar are two strings? ocurrance vs. occurrence
9
Knapsack Algorithm
W = 11

                      w = 0   1   2   3   4   5   6   7   8   9  10  11
  φ                       0   0   0   0   0   0   0   0   0   0   0   0
  { 1 }                   0   1   1   1   1   1   1   1   1   1   1   1
  { 1, 2 }                0   1   6   7   7   7   7   7   7   7   7   7
  { 1, 2, 3 }             0   1   6   7   7  18  19  24  25  25  25  25
  { 1, 2, 3, 4 }          0   1   6   7   7  18  22  24  28  29  29  40
  { 1, 2, 3, 4, 5 }       0   1   6   7   7  18  22  28  29  34  35  40

(The array has n+1 rows, one per prefix of items, and W+1 columns, one per weight limit w = 0 … 11.)

Item   Value   Weight
  1        1        1
  2        6        2
  3       18        5
  4       22        6
  5       28        7

OPT: { 4, 3 }, value = 22 + 18 = 40
10
Knapsack Problem: Running Time
Running time. O(nW). How much storage is needed? The table we used above has size O(nW). For example, if W = 1,000,000 and n = 100, we need a total of 400 Mbytes (10^6 × 100 × 4 bytes). This is too high.
The space requirement can be reduced: once row j has been computed, we don’t need rows j – 1 and below.
Space reduces to 2W entries (two rows), about 8 Mbytes in the above case.
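A minimal sketch of that space reduction (my own naming, not code from the slides); only the previous row and the row currently being filled are kept in memory:

```python
def knapsack_two_rows(values, weights, W):
    """Same recurrence as before, but only two rows of the table are kept:
    the row for items 1..i-1 and the row being filled for items 1..i."""
    prev = [0] * (W + 1)            # row i-1
    for vi, wi in zip(values, weights):
        curr = [0] * (W + 1)        # row i
        for w in range(1, W + 1):
            curr[w] = prev[w] if wi > w else max(prev[w], vi + prev[w - wi])
        prev = curr                 # row i-1 is no longer needed
    return prev[W]

print(knapsack_two_rows([1, 6, 18, 22, 28], [1, 2, 5, 6, 7], 11))  # 40
```

A single row updated with w running from W down to 1 works as well and needs only W + 1 entries.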

Case 1: OPT does not select item i.
  OPT selects best of { 1, 2, …, i-1 } using weight limit w.
Case 2: OPT selects item i.
  New weight limit = w – wi.
  OPT selects best of { 1, 2, …, i-1 } using this new weight limit.
11
6.6 Sequence Alignment
String Similarity
How similar are two strings? ocurrance vs. occurrence

[Alignment figure: ocurrance vs. occurrence, 5 mismatches, 1 gap]
6
Dynamic Programming: False Start
Def. OPT(i) = max profit subset of items 1, …, i.

Case 1: OPT does not select item i.
  OPT selects best of { 1, 2, …, i-1 }.
[Alignment figure: ocurrance vs. occurrence, 1 mismatch, 1 gap]
“Did you mean xxxxx ?”
[Alignment figure: ocurrance vs. occurrence, 0 mismatches, 3 gaps]
14
Google search for “allgorythm”
When a word is not in the dictionary, the search engine searches through all the words in its database, finds the one that is closest to “allgorythm”, and asks whether that is what you meant.
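As a toy illustration of that lookup (not how a real search engine implements it; Python's difflib uses a ratio-based similarity rather than edit distance, and the word list here is made up):

```python
import difflib

dictionary = ["algorithm", "logarithm", "allegory", "rhythm"]  # toy word list
query = "allgorythm"

# Return the dictionary word most similar to the query, if any is close enough.
suggestions = difflib.get_close_matches(query, dictionary, n=1, cutoff=0.6)
if suggestions:
    print(f'Did you mean "{suggestions[0]}"?')   # Did you mean "algorithm"?
```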
2
Dynamic Programming History
Bellman. Pioneered the systematic study of dynamic programming in the 1950s.
Etymology.
  Dynamic programming = planning over time.
  Secretary of Defense was hostile to mathematical research.
  Bellman sought an impressive name to avoid confrontation.
    "it's impossible to use dynamic in a pejorative sense"
    "something not even a Congressman could object to"
“we now turn to a more powerful and subtle design technique, dynamic programming. It'll be simpler to say precisely what characterizes dynamic programming after we have seen it in action, but the basic idea is drawn from the intuition behind divide-and-conquer and is essentially the opposite of the greedy strategy: one implicitly explores the space of all possible solutions, by carefully decomposing things into a series of subproblems, and then building up correct solutions to larger and larger subproblems. In a way, we can thus view dynamic programming as operating dangerously close to the edge of brute-force search.”

Reference: Bellman, R. E. Eye of the Hurricane, An Autobiography.
3
Dynamic Programming Applications
Areas

Bioinformatics.
Control theory.
Information theory.
Operations research (e.g. knapsack problem).
Computer science: theory, graphics, AI, systems, ….
Chapter 6 Dynamic Programming
Algorithmic Paradigms
Greed. Build up a solution incrementally, myopically optimizing some local criterion.

Divide-and-conquer. Break up a problem into two sub-problems, solve each sub-problem independently, and combine solutions to the sub-problems to form a solution to the original problem.

Dynamic programming. Break up a problem into a series of overlapping sub-problems, and build up solutions to larger and larger sub-problems.
Some famous dynamic programming algorithms

Viterbi’s algorithm for hidden Markov models.
Unix diff for comparing two files.
Smith-Waterman for sequence alignment.
Bellman-Ford for shortest-path routing in networks.
Cocke-Kasami-Younger for parsing context-free grammars.
Conclusion. Need more sub-problems!
7
Dynamic Programming: Adding a New Variable
Def. OPT(i, w) = max profit subset of items 1, …, i with weight limit w.

Case 2: OPT selects item i.
  Accepting item i does not immediately imply that we will have to reject other items.
  Without knowing what other items were selected before i, we don't even know if we have enough room for i.
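This case analysis also translates directly into a top-down recursion with memoization; the sketch below is my own (assumed names), equivalent to the OPT(i, w) recurrence rather than code taken from the slides:

```python
from functools import lru_cache

def knapsack_top_down(values, weights, W):
    """OPT(i, w) computed recursively with memoization."""
    @lru_cache(maxsize=None)
    def opt(i, w):
        if i == 0:
            return 0
        if weights[i - 1] > w:                     # item i cannot fit
            return opt(i - 1, w)
        return max(opt(i - 1, w),                  # case 1: skip item i
                   values[i - 1] + opt(i - 1, w - weights[i - 1]))  # case 2: take it
    return opt(len(values), W)

print(knapsack_top_down([1, 6, 18, 22, 28], [1, 2, 5, 6, 7], 11))  # 40
```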


16
Plagiarism checker – a simple test
Here is a paragraph from Chapter 6 of the text (the current chapter, on Dynamic Programming), with some small changes made to it:

Ex: { 3, 4 } has value 40.
W = 11
Item   Value   Weight
  1        1        1
  2        6        2
  3       18        5
  4       22        6
  5       28        7
Greedy: repeatedly add the item with maximum ratio vi / wi. Ex: { 5, 2, 1 } achieves only value = 35, so greedy is not optimal.
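A quick check of that claim on the instance above (the item data is from the table; the code is my own sketch):

```python
# Items from the slides: (value, weight) pairs, capacity W = 11.
items = [(1, 1), (6, 2), (18, 5), (22, 6), (28, 7)]
W = 11

# Greedy: take items in decreasing value/weight ratio while they still fit.
remaining, greedy_value, chosen = W, 0, []
for idx, (v, w) in sorted(enumerate(items, 1),
                          key=lambda t: t[1][0] / t[1][1], reverse=True):
    if w <= remaining:
        remaining -= w
        greedy_value += v
        chosen.append(idx)

print(chosen, greedy_value)   # [5, 2, 1] 35, versus the optimum {3, 4} with value 40
```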
4
6.4 Knapsack Problem
Knapsack Problem
Knapsack problem. Given n objects and a "knapsack." Item i weighs wi > 0 kilograms and has value vi > 0. Knapsack has capacity of W kilograms. Goal: fill knapsack so as to maximize total value.