This is the first lecture on Analysis of Algorithms: what algorithms are, what primitive operations are, and how to count them.
An algorithm is a step-by-step procedure for solving a problem in a finite amount of time.
Most algorithms transform input objects into output objects. The running time of an algorithm typically grows with the input size. Average-case time is often difficult to determine, so we focus on the worst-case running time:
- Easier to analyze
- Crucial to applications such as games, finance, and robotics

[Figure: running time vs. input size, showing best-case, average-case, and worst-case curves]
Control flow:
- if … then … [else …]
- while … do …
- repeat … until …
- for … do …
- Indentation replaces braces

Method declaration:
    Algorithm method(arg [, arg …])
        Input …
        Output …

Method call:
    var.method(arg [, arg …])

Return value:
    return expression

Expressions:
- ← assignment (like = in Java)
- = equality testing (like == in Java)
- n^2: superscripts and other mathematical formatting allowed
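For instance, a fragment written in these conventions maps directly onto Java (a small sketch; the summing example itself is ours, chosen only to exercise the notation):

    // Pseudocode:
    //   Algorithm sum(A, n)
    //     Input  array A of n integers
    //     Output the sum of A's elements
    //     s ← 0
    //     for i ← 0 to n − 1 do
    //       s ← s + A[i]
    //     return s
    public class PseudocodeToJava {
        public static int sum(int[] A, int n) {
            int s = 0;                          // s ← 0
            for (int i = 0; i <= n - 1; i++) {  // for i ← 0 to n − 1 do
                s = s + A[i];                   // s ← s + A[i]
            }
            return s;                           // return s
        }
    }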
Memory cells are numbered and accessing any cell in memory takes unit time.
By inspecting the pseudocode, we can determine the maximum number of primitive operations executed by an algorithm, as a function of the input size:

    Algorithm arrayMax(A, n)                 # operations
        currentMax ← A[0]                    2
        for i ← 1 to n − 1 do                1 + n
            if A[i] > currentMax then        2(n − 1)
                currentMax ← A[i]            2(n − 1)
            { increment counter i }          2(n − 1)
        return currentMax                    1
                                     Total   7n − 2
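As a concrete illustration, here is a direct Java translation of the pseudocode (a minimal sketch; the class name and the demo in main are ours, not part of the lecture):

    // Direct Java translation of the arrayMax pseudocode above.
    public class ArrayMaxDemo {
        // Returns the maximum of the first n elements of A (requires n >= 1).
        public static int arrayMax(int[] A, int n) {
            int currentMax = A[0];
            for (int i = 1; i <= n - 1; i++) {
                if (A[i] > currentMax) {
                    currentMax = A[i];
                }
            }
            return currentMax;
        }

        public static void main(String[] args) {
            int[] data = {3, 7, 2, 9, 4};
            System.out.println(arrayMax(data, data.length)); // prints 9
        }
    }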
Algorithm arrayMax executes 7n − 2 primitive operations in the worst case. Define:
    a = time taken by the fastest primitive operation
    b = time taken by the slowest primitive operation
Let T(n) be the worst-case running time of arrayMax. Then

    a(7n − 2) ≤ T(n) ≤ b(7n − 2)

Hence, the running time T(n) is bounded by two linear functions.
Growth rates of functions:
- Linear: n
- Quadratic: n^2
- Cubic: n^3
The growth rate is not affected by constant factors or lower-order terms. For example:
- 10^2 n + 10^5 is a linear function
- 10^5 n^2 + 10^8 n is a quadratic function
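A quick numeric check makes this concrete: dividing each function by its dominant term shows the ratio settling toward the leading constant factor as n grows (a small sketch; the sample values of n are arbitrary):

    // Shows that 10^2 n + 10^5 behaves linearly and 10^5 n^2 + 10^8 n behaves
    // quadratically: f(n) divided by the dominant term approaches the leading
    // constant factor as n grows.
    public class GrowthRateDemo {
        public static void main(String[] args) {
            for (long n = 10; n <= 10_000_000L; n *= 10) {
                double f = 1e2 * n + 1e5;                  // linear function
                double g = 1e5 * (double) n * n + 1e8 * n; // quadratic function
                System.out.printf("n=%-10d f(n)/n=%.2f   g(n)/n^2=%.2f%n",
                        n, f / n, g / ((double) n * n));
            }
        }
    }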
Example: the function n^2 is not O(n):

    n^2 ≤ cn  implies  n ≤ c

The above inequality cannot be satisfied, since c must be a constant while n grows without bound.
Example: 7n − 2 is O(n). We need to find two positive constants c and n0 such that 7n − 2 ≤ c·n for all n ≥ n0; this is true for c = 7 and n0 = 1.

Example: 3n^3 + 20n^2 + 5 is O(n^3). We need to find two positive constants c and n0 such that 3n^3 + 20n^2 + 5 ≤ c·n^3 for all n ≥ n0; this is true for c = 4 and n0 = 21.
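These witness constants can be spot-checked numerically (a throwaway sketch; the sampling bound of 1,000 is arbitrary):

    // Spot-checks the big-Oh witnesses above over a range of n.
    public class BigOhWitnessCheck {
        public static void main(String[] args) {
            boolean ok = true;
            // 7n - 2 <= 7n for all n >= 1  (c = 7, n0 = 1)
            for (long n = 1; n <= 1000; n++) {
                if (7 * n - 2 > 7 * n) ok = false;
            }
            // 3n^3 + 20n^2 + 5 <= 4n^3 for all n >= 21  (c = 4, n0 = 21)
            for (long n = 21; n <= 1000; n++) {
                if (3 * n * n * n + 20 * n * n + 5 > 4 * n * n * n) ok = false;
            }
            System.out.println(ok ? "All sampled inequalities hold."
                                  : "A sampled inequality failed.");
        }
    }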
Use the smallest possible class of functions: say “2n is O(n)” instead of “2n is O(n^2)”.
Use the simplest expression of the class: say “3n + 5 is O(n)” instead of “3n + 5 is O(3n)”.
The asymptotic analysis of an algorithm determines its running time in big-Oh notation as n grows toward infinity. To perform the asymptotic analysis:
- We find the worst-case number of primitive operations executed as a function of the input size
- We express this function with big-Oh notation

Example:
- We determined that algorithm arrayMax executes at most 7n − 2 primitive operations
- We say that algorithm arrayMax “runs in O(n) time”

Since constant factors and lower-order terms are eventually dropped anyhow, we can disregard them when counting primitive operations.
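To connect the operation count to the O(n) claim, here is an instrumented sketch that tallies primitive operations at run time, using the same accounting as the table above (the charging scheme mirrors the slides' conventions and is our reconstruction, not a formal definition):

    // Counts primitive operations while running arrayMax, using the same
    // accounting as the counting table. On a strictly increasing array (the
    // worst case, where every if-test succeeds) the tally is exactly 7n - 2.
    public class OperationCounter {
        static long ops;

        static int arrayMax(int[] A, int n) {
            ops = 0;
            int currentMax = A[0];   ops += 2;  // index A[0], assign
            int i = 1;               ops += 1;  // initialize counter i
            while (true) {
                ops += 1;                       // compare i with n - 1
                if (i > n - 1) break;
                ops += 2;                       // index A[i], compare
                if (A[i] > currentMax) {
                    currentMax = A[i];
                    ops += 2;                   // index A[i], assign
                }
                i = i + 1;           ops += 2;  // add 1, assign to i
            }
            ops += 1;                           // return statement
            return currentMax;
        }

        public static void main(String[] args) {
            int n = 1000;
            int[] A = new int[n];
            for (int j = 0; j < n; j++) A[j] = j;  // increasing: worst case
            arrayMax(A, n);
            System.out.println(ops + " vs 7n - 2 = " + (7L * n - 2));  // 6998 vs 6998
        }
    }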