Asymptotic Analysis - Advanced Programming - Lecture Slides

These are the lecture slides for Advanced Programming. Key points: Asymptotic Analysis, Ignoring Constants, Big-Oh Notation, Major Notations, Asymptotic Upper Bound, Big-Omega, Asymptotic Lower Bound, Big-Theta, Asymptotic Tight Bound, Constant Multiple.

Analysis of Algorithms

Big-Oh

Asymptotic Analysis

  • Ignoring constants in T(n)
  • Analyzing T(n) as n "gets large"

Example: T(n) = 13n^3 + 42n^2 + 2n log n + 4n

  • As n grows larger, n^3 is MUCH larger than n^2, n log n, and n, so it dominates T(n)
  • The running time grows "roughly on the order of n^3"
  • Notationally, T(n) = O(n^3)
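
As a quick numerical illustration of this dominance (a sketch added here, not part of the original slides; the log is assumed to be base 2), the following Python snippet evaluates the ratio T(n)/n^3 for growing n. The ratio settles toward the leading coefficient 13, showing that the n^3 term takes over.

    import math

    def T(n):
        # Example running time from the slide: T(n) = 13n^3 + 42n^2 + 2n log n + 4n
        return 13 * n**3 + 42 * n**2 + 2 * n * math.log2(n) + 4 * n

    for n in (10, 100, 1_000, 10_000):
        print(f"n = {n:>6}: T(n)/n^3 = {T(n) / n**3:.3f}")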

The big-oh (O) Notation

  • The O symbol was introduced in 1927 to indicate relative growth of two functions based on asymptotic behavior of the functions
  • Now used to classify functions and families of functions

Big-Oh Defined

  • T(n) = O(f(n)) if there are constants c and n0 such that T(n) < c*f(n) when n >= n0

(Figure: the graph of c*f(n) lies above the graph of T(n) for all n >= n0)

  • c*f(n) is an upper bound for T(n)
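
As a small sketch of how this definition can be checked numerically for the earlier example (the constants c = 60 and n0 = 1 are hypothetical witnesses chosen here, and the log is again assumed base 2):

    import math

    def T(n):
        return 13 * n**3 + 42 * n**2 + 2 * n * math.log2(n) + 4 * n

    def f(n):
        return n**3

    c, n0 = 60, 1   # hypothetical witnesses for T(n) = O(f(n)); any valid pair suffices
    assert all(T(n) < c * f(n) for n in range(n0, 10_000)), "chosen c, n0 do not witness the bound"
    print(f"T(n) < {c} * f(n) for every sampled n >= {n0}")

Of course, a finite check over a sample range is only an illustration; the bound itself has to be argued for all n >= n0.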

Big-Oh

  • Describes an upper bound for the running time of an algorithm

Upper bounds for Insertion Sort running times:

  • worst case: O(n^2), T(n) = c1*n^2 + c2*n + c3
  • best case: O(n), T(n) = c1*n + c2
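
To make the two bounds concrete, here is a short Python sketch (an addition, not from the slides) of insertion sort instrumented with a comparison counter; reversed input triggers the roughly n^2/2 worst-case comparisons, while already-sorted input needs only about n.

    def insertion_sort(a):
        """Sort the list a in place and return the number of element comparisons made."""
        comparisons = 0
        for i in range(1, len(a)):
            key = a[i]
            j = i - 1
            while j >= 0:
                comparisons += 1          # compare key against a[j]
                if a[j] <= key:
                    break
                a[j + 1] = a[j]           # shift the larger element one slot right
                j -= 1
            a[j + 1] = key
        return comparisons

    for n in (100, 200, 400):
        worst = insertion_sort(list(range(n, 0, -1)))  # reversed input: ~n^2/2 comparisons
        best = insertion_sort(list(range(n)))          # sorted input: n - 1 comparisons
        print(f"n = {n}: worst = {worst}, best = {best}")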

Time Complexity

Big-Oh Properties

  • Fastest growing function dominates a sum
    • O(f(n) + g(n)) is O(max{f(n), g(n)})
  • Product of upper bounds is an upper bound for the product
    • If f is O(g) and h is O(r), then fh is O(gr)
  • The relation "f is O(g)" is transitive
    • If f is O(g) and g is O(h), then f is O(h)
  • Hierarchy of functions
    • O(1), O(log n), O(n^(1/2)), O(n log n), O(n^2), O(2^n), O(n!)
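
The hierarchy in the last bullet can be made tangible with a quick table; the snippet below (my own sketch, assuming base-2 logs) evaluates each function at a few values of n so the ordering is visible. The ordering is asymptotic, so at very small n adjacent entries can swap places.

    import math

    # Functions from the hierarchy, listed in claimed order of growth
    hierarchy = [
        ("1",       lambda n: 1),
        ("log n",   lambda n: math.log2(n)),
        ("n^(1/2)", lambda n: math.sqrt(n)),
        ("n log n", lambda n: n * math.log2(n)),
        ("n^2",     lambda n: n**2),
        ("2^n",     lambda n: 2**n),
        ("n!",      lambda n: math.factorial(n)),
    ]

    for n in (20, 30, 40):
        row = "  ".join(f"{name}={fn(n):.3g}" for name, fn in hierarchy)
        print(f"n = {n:>2}: {row}")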

Some Big-Oh’s are not reasonable

  • Polynomial Time algorithms
    • An algorithm is said to be polynomial if it is O(n^c), c > 1
    • Polynomial algorithms are said to be reasonable
      • They solve problems in reasonable times!
      • Coefficients, constants, and low-order terms are ignored, e.g. if f(n) = 2n^2 then f(n) = O(n^2)
  • Exponential Time algorithms
    • An algorithm is said to be exponential if it is O(r^n), r > 1
    • Exponential algorithms are said to be unreasonable
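
One way to see why exponential algorithms are considered unreasonable (a sketch of my own, assuming a purely hypothetical budget of 10^9 basic steps) is to compute the largest input size each class can handle within that budget:

    import math

    BUDGET = 10**9   # hypothetical number of basic steps we can afford

    # Largest n with n^2 <= BUDGET versus largest n with 2^n <= BUDGET
    n_quadratic = math.isqrt(BUDGET)
    n_exponential = int(math.log2(BUDGET))

    print(f"O(n^2) algorithm: largest n within budget ~ {n_quadratic}")    # about 31,622
    print(f"O(2^n) algorithm: largest n within budget ~ {n_exponential}")  # about 29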

Classifying Algorithms based on Big-Oh

  • A function f(n) is said to be of at most logarithmic growth if f(n) = O(log n)
  • A function f(n) is said to be of at most quadratic growth if f(n) = O(n^2)
  • A function f(n) is said to be of at most polynomial growth if f(n) = O(n^k), for some natural number k > 1
  • A function f(n) is said to be of at most exponential growth if there is a constant c such that f(n) = O(c^n), and c > 1
  • A function f(n) is said to be of at most factorial growth if f(n) = O(n!)
  • A function f(n) is said to have constant running time if the size of the input n has no effect on the running time of the algorithm (e.g., assignment of a value to a variable). The equation for this algorithm is f(n) = c
  • Other logarithmic classifications:
    • f(n) = O(n log n)
    • f(n) = O(log log n)

Rules for Calculating Big-Oh

  • Base of logs ignored
    • log_a n = O(log_b n)
  • Power inside logs ignored
    • log(n^2) = O(log n)
  • Bases and powers in exponents are not ignored
    • 3^n is not O(2^n)
    • a^(n^2) is not O(a^n)
  • If T(x) is a polynomial of degree n, then T(x) = O(x^n)
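
The first two rules follow from elementary logarithm identities; a short derivation (added here for clarity, written in LaTeX):

    % Base of logs ignored: change of base contributes only a constant factor
    \log_a n = \frac{\log_b n}{\log_b a}
             = \underbrace{\frac{1}{\log_b a}}_{\text{constant}} \cdot \log_b n
             = O(\log_b n)

    % Power inside logs ignored: the exponent comes out as a constant factor
    \log(n^2) = 2\log n = O(\log n)

    % Bases in exponents are NOT ignored: the ratio grows without bound
    \frac{3^n}{2^n} = \left(\tfrac{3}{2}\right)^n \to \infty,
    \quad\text{so } 3^n \neq O(2^n)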

Big-Oh Examples (cont.)

  1. Suppose a program P is O(n^3) and a program Q is O(3^n), and that currently both can solve problems of size 50 in 1 hour. If the programs are run on another system that executes exactly 729 times as fast as the original system, what size problems will they be able to solve?

Big-Oh Examples (cont)

For the O(n^3) program P:
  n^3 = 50^3 * 729
  n = 50 * 729^(1/3) = 50 * 9 = 450

For the O(3^n) program Q:
  3^n = 3^50 * 729
  n = log_3(729 * 3^50) = log_3(729) + log_3(3^50) = 6 + 50 = 56

  • Improvement: the problem size increased by a factor of 9 (from 50 to 450) for the n^3 algorithm, but only by a slight amount (+6, from 50 to 56) for the exponential algorithm.
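
A quick numerical check of these two answers (my own sketch; it simply re-evaluates the algebra above):

    import math

    old_n, speedup = 50, 729

    # Cubic program P: n^3 = speedup * old_n^3  =>  n = old_n * speedup^(1/3)
    n_cubic = old_n * round(speedup ** (1 / 3))          # 729^(1/3) = 9, so n = 450

    # Exponential program Q: 3^n = speedup * 3^old_n  =>  n = old_n + log_3(speedup)
    n_exponential = old_n + round(math.log(speedup, 3))  # log_3(729) = 6, so n = 56

    print(f"O(n^3) program: new problem size = {n_cubic}")
    print(f"O(3^n) program: new problem size = {n_exponential}")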
