Algorithm Analysis (Big O)

Complexity
In examining algorithm efficiency we must understand the idea of complexity:
▪ Space complexity
▪ Time complexity
F.C.  n+1
      n^2+n
      n^2
      n^2
      ____
      3n^2+2n+1
clearing coefficients: n^2+n
picking the most significant term: n^2
Big O = O(n^2)
What is Big O
Big O is the rate at which algorithm performance degrades as a function of the amount of data it is asked to handle.
f(n) = O(g(n)) iff there exist positive constants c and n0 such that:
    f(n) <= c g(n) for all n >= n0
Thus, g(n) is an upper bound on f(n).
Note: f(n) = O(g(n)) is NOT the same as O(g(n)) = f(n).
Inst #  Code                         F.C.
1       for (int i=0; i<n; i++)      n+1
2       { cout << i;                 n
3         p = p + i;                 n
        }                            ____
                                     3n+1
totaling the counts produces the F.C. (frequency count)
Order of magnitude
In the previous example:
▪ best case = average case = worst case
▪ the example is based on a fixed iteration count n
To convert the F.C. to an order of magnitude:
▪ discard constant terms
▪ disregard coefficients
▪ pick the most significant term
Worst-case path through the algorithm -> the order of magnitude will be the Big O (i.e. O(n))
Space Complexity
When memory was expensive we focused on making programs as space efficient as possible, and we developed schemes to make memory appear larger than it really was (virtual memory and memory paging schemes).
By itself, the frequency count is relatively meaningless; the order of magnitude gives an estimate of performance vs. the amount of data.
Note: the '=' in f(n) = O(g(n)) is not the usual mathematical operator "=" (it is not reflexive).
Big-O Notation
Comparing Algorithms and ADT Data Structures
Algorithm Efficiency
Another example
Inst #  Code                              F.C.
1       for (int i=0; i<n; i++)           n+1
2         for (int j=0; j<n; j++)         n(n+1)
3         { cout << i;                    n*n
4           p = p + i;                    n*n
          }                               ____
                                          3n^2+2n+1
discarding constant terms produces: 3n^2+2n
Complexity is an intrinsic property of the algorithm:
▪ independent of the particular machine or code
▪ based on the number of instructions executed
▪ for some algorithms, data-dependent
▪ meaningful for "large" problem sizes
Space complexity is still important in the field of embedded computing (hand-held, computer-based equipment like cell phones, palm devices, etc.)
Time Complexity
Algorithm Efficiency: a measure of the amount of resources consumed in solving a problem of size n
▪ time
▪ space
Benchmarking: implement the algorithm,
▪ run it with some specific input and measure the time taken
▪ better for comparing the performance of processors than for comparing the performance of algorithms
Computing x^n for n >= 0
iterative definition:
▪ x * x * x * ... * x (n times)
recursive definition:
▪ x^0 = 1
▪ x^n = x * x^(n-1) (for n > 0)
For example:
▪ O(n) -> performance degrades at a linear rate
▪ O(n^2) -> quadratic degradation
Common growth rates
Big Oh - Formal Definition
Definition of "big oh":
f(n) = O(g(n)) iff there exist positive constants c and n0 such that f(n) <= c g(n) for all n >= n0
e.g., for each instruction, predict how many times it will be encountered as the code runs:
Inst #  Code                         F.C.
1       for (int i=0; i<n; i++)      n+1
Recursive Power function
double RecPow (double X, int N) {
    if (N == 0)
        return 1;
    else
        return X * RecPow(X, N - 1);
}
Base case F.C.: 1 + 1 = 2 (the comparison and the return)
Recursive case F.C.: 1 + (1 + T(n-1)) = 2 + T(n-1)
Big Oh (asymptotic analysis)
▪ associates n, the problem size,
▪ with t, the processing time required to solve the problem
Worst case
▪ executing the algorithm produces path lengths that are always a maximum
Worst case analysis
Of the three cases, the only useful one (from the standpoint of program design) is the worst case. It helps answer questions such as:
▪ Is the algorithm "fast enough" for my needs?
▪ How much longer will the algorithm take if I increase the amount of data it must process?
▪ Given a set of algorithms that accomplish the same thing, which is the right one to choose?
big Oh
measures an algorithm's growth rate
▪ how fast does the time required for an algorithm to execute increase as the size of the problem increases?
Worst case helps answer the software lifecycle question of:
▪ If it's good enough today, will it be good enough tomorrow?
Frequency Count
examine a piece of code and predict the number of instructions to be executed
another recursive definition:
▪ x^0 = 1
▪ x^n = (x^(n/2))^2 (for n > 0 and n even)
▪ x^n = x * (x^(n/2))^2 (for n > 0 and n odd)
Iterative Power function
double IterPow (double X, int N) {
    double Result = 1;
    while (N > 0) {
        Result *= X;
        N--;
    }
    return Result;
}
Cases to examine
Best case
▪ if the algorithm is executed, the fewest number of instructions are executed
Average case
▪ executing the algorithm produces path lengths that will on average be the same
Recursive case F.C. per call: 1 + (1 + T(n-1)) = 2 + T(n-1)
Number of times the base case is executed: 1
Number of times the recursive case is executed: n
The algorithm's computing time (t) as a function of n is: 2n + 2
O[2n + 2] is n
Total instruction count:
    double Result = 1;    1
    while (N > 0) {       n+1
        Result *= X;      n    <- critical region
        N--;              n
    }
    return Result;        1
                          ____
                          3n+3
The algorithm's computing time (t) as a function of n is: 3n + 3
t is on the order of f(n) - O[f(n)]
O[3n + 3] is n
Benchmarking is better for measuring and comparing the performance of processors than for measuring and comparing the performance of algorithms.
Big Oh (asymptotic analysis) provides a formula that associates n, the problem size, with t, the processing time required to solve the problem.