
CS420 lecture two Fundamentals



  1. CS420 lecture two Fundamentals wim bohm, cs, CSU

  2. Asymptotics • Asymptotics show the relative growth of functions by comparing them to other functions. There are different notations:
f(x) ~ g(x): f(x)/g(x) → 1 as x → ∞
f(x) = o(g(x)): f(x)/g(x) → 0 as x → ∞
where f and g are positive functions

  3. Big O, big omega, big theta • f(x) = O(g(x)) iff there are positive constants c and n0 such that f(x) ≤ c·g(x) for all x > n0 • f(x) = Ω(g(x)) iff there are positive constants c and n0 such that f(x) ≥ c·g(x) for all x > n0 • f(x) = Θ(g(x)) iff f(x) = O(g(x)) and f(x) = Ω(g(x))
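A big-O claim like those above can be sanity-checked numerically (not proved) by exhibiting the witnesses c and n0. A minimal sketch; the functions f and g and the witnesses c = 4, n0 = 5 are illustrative choices, not from the slides:

```python
# Claim: f(n) = 3n^2 + 5n is O(n^2).
# Witnesses: c = 4 and n0 = 5, since 3n^2 + 5n <= 4n^2 exactly when n >= 5.
def f(n):
    return 3 * n * n + 5 * n

def g(n):
    return n * n

c, n0 = 4, 5
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))
```

Note that the witnesses are not unique: any larger c (or n0) also works, which is why O-notation ignores constant factors.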

  4. Big O etc. • Big O is used in upper bounds, i.e. the worst- or average-case complexity of an algorithm. Big theta is a tight bound, i.e. stronger than just an upper bound. • Big omega is used in lower bounds, i.e. the complexity of a problem. (So we were sloppy in the last lecture.)

  5. Closed / open problems • A closed problem has a lower bound Ω(f(x)) and an algorithm with upper bound O(f(x)) • e.g. searching, sorting • what about matrix multiply? • An open problem has lower bound < upper bound

  6. lower bounds • An easy lower bound on a problem is the size of the output it needs to produce, or the number of inputs it has to access • Generate all permutations of size n, lower bound? • Towers of Hanoi, lower bound? • Sum n input integers, lower bound? but... sum integers 1 to n?

  7. growth rates • f(n) = O(1) constant Scalar operations (+,-,*,/) when input size is not measured in #bits Straight line code of simple assignments (x = simple expression) and conditionals with simple sub-expressions Function calls, discuss

  8. f(n) = log(n) • definition: b^x = a ⇔ x = log_b a, e.g. 2^3 = 8, log_2 8 = 3 • log(x·y) = log x + log y because b^x · b^y = b^(x+y) • log(x/y) = log x – log y • log x^a = a log x • log x is a 1-to-1 monotonically growing function: log x = log y ⇔ x = y
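The log rules above can be checked numerically; a small sketch (the sample values x, y and base b are arbitrary illustrative choices):

```python
import math

# Check the log rules from the slide on a few sample values, base b = 10.
x, y, b = 8.0, 32.0, 10.0
assert math.isclose(math.log(x * y, b), math.log(x, b) + math.log(y, b))   # log(xy) = log x + log y
assert math.isclose(math.log(x / y, b), math.log(x, b) - math.log(y, b))   # log(x/y) = log x - log y
assert math.isclose(math.log(x ** 3, b), 3 * math.log(x, b))               # log x^a = a log x
assert math.isclose(math.log(8, 2), 3)                                     # 2^3 = 8  =>  log_2 8 = 3
```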

  9. more log stuff • log_a x = log_b x / log_b a, because x = a^(log_a x), so taking log_b of both sides: log_b x = log_a x · log_b a

  10. and more log stuff

  11. log n and algorithms • In algorithm analysis we often use log n when we should use floor(log(n)). Is that OK? • When in each step of an algorithm we halve the size of the problem (or divide it by k) then it takes log n steps to get to the base case • Notice that log_b1 n = O(log_b2 n) for any b1 and b2, so the base does not matter in O analysis Does that work for exponents too? Is 2^n = O(3^n)? Is 3^n = O(2^n)?

  12. log n and algorithms • In algorithm analysis we often use log n when we should use floor(log(n)). That's OK: floor(log(n)) = O(log(n)) • When in each step of an algorithm we halve the size of the problem (or divide it by k) then it takes log n steps to get to the base case • Notice that log_b1 n = O(log_b2 n) for any b1 and b2, so the base does not matter in O analysis Does that work for exponents too? Is 2^n = O(3^n)? Is 3^n = O(2^n)?
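The halving claim above is easy to check by counting: repeatedly halving n (integer division) reaches 1 after exactly floor(log2 n) steps. A small sketch; halving_steps is an illustrative helper, not from the slides:

```python
import math

def halving_steps(n):
    """Count how many times n can be halved (n //= 2) before reaching 1."""
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

# Agrees with floor(log2 n) for all small n; exact log2 n when n is a power of two.
for n in range(1, 1000):
    assert halving_steps(n) == math.floor(math.log2(n))
```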

  13. log n and algorithms Algorithms with O(log n) complexity are often of a simple divide and conquer variety (base, solve-direct, and split are O(1)):
DivCo(P) {
    if base(P) solve-direct(P)
    else split(P) → P1 ... Pn
         DivCo(Pi)
}
e.g. Binary Search

  14. log n and algorithms Algorithms with O(log n) complexity are often of a simple divide and conquer variety (base, solve-direct, and split are O(1)):
DivCo(P) {
    if base(P) solve-direct(P)
    else split(P) → P1 ... Pn
         DivCo(Pi)
}
Solve f(n) = f(n/2) + 1, f(1) = 1 by repeated substitution
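Worked out, the repeated substitution asked for above goes as follows (assuming n is a power of two, so the divisions are exact):

```latex
\begin{aligned}
f(n) &= f(n/2) + 1 = f(n/4) + 2 = \cdots = f(n/2^k) + k \\
     &= f(1) + \log_2 n = \log_2 n + 1 = O(\log n)
     \qquad \text{taking } k = \log_2 n .
\end{aligned}
```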

  15. General Divide and Conquer
DivCo(P) {
    if base(P) solve-direct(P)
    else split(P) → P1 ... Pn
         combine(DivCo(P1) ... DivCo(Pn))
}
Depending on the costs of base, solve-direct, split and combine we get different complexities (later).

  16. f(n) = O(n) Linear complexity • e.g. Linear Search in an unsorted list • also: Polynomial evaluation A(x) = a_n x^n + a_(n-1) x^(n-1) + ... + a_1 x + a_0 (a_n != 0) Evaluate A(x_0) • How not to do it: a_n * exp(x,n) + a_(n-1) * exp(x,n-1) + ... + a_1 * x + a_0 why not?

  17. How to do it: Horner's rule
y = a[n]
for (i = n-1; i >= 0; i--)
    y = y*x + a[i]
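The three-line loop above can be run directly; a minimal sketch, storing the coefficients in a list a with a[i] the coefficient of x^i:

```python
def horner(a, x):
    """Evaluate a[n]*x^n + ... + a[1]*x + a[0] using n multiplications (Horner's rule)."""
    y = a[-1]                          # y = a[n]
    for coeff in reversed(a[:-1]):     # i = n-1 down to 0
        y = y * x + coeff              # y = y*x + a[i]
    return y

# A(x) = 2x^2 + 3x + 5 at x = 2:  2*4 + 3*2 + 5 = 19
assert horner([5, 3, 2], 2) == 19
```

One multiplication and one addition per coefficient, versus the exp-based version on the previous slide, which spends up to n multiplications on each term.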

  18. Horner complexity • Lower bound: Ω(n) because we need to access each a[i] at least once • Upper bound: O(n) • Closed problem • But what if A(x) = x^n?

  19. A(x) = x^n • Recurrence: x^(2n) = x^n · x^n, x^(2n+1) = x · x^(2n)
y = 1;
while (n != 0) {
    if (odd(n)) y = y*x;
    x = x*x;
    n = n/2;
}
Complexity?
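The loop above (square-and-multiply) runs directly; a small sketch. Each iteration halves n, so the loop body executes about log2 n times:

```python
def fast_pow(x, n):
    """Compute x**n in O(log n) multiplications via the recurrence
    x^(2n) = x^n * x^n and x^(2n+1) = x * x^(2n)."""
    y = 1
    while n != 0:
        if n % 2 == 1:   # odd(n): fold the stray factor of x into y
            y = y * x
        x = x * x        # square
        n = n // 2       # halve the exponent
    return y

assert fast_pow(3, 13) == 3 ** 13
```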

  20. O(n log(n)) • Often resulting from divide and conquer algorithms where split & combine are O(n) and we divide in nearly equal halves.
mergesort(A) {
    if size(A) <= 1 return A
    else return merge(mergesort(lefthalf(A)),
                      mergesort(righthalf(A)))
}
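The mergesort pseudocode above can be sketched as runnable code (merge is the standard two-pointer merge of sorted lists, O(n) per call):

```python
def mergesort(a):
    """O(n log n): split in halves, sort each recursively, merge in O(n)."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    return merge(mergesort(a[:mid]), mergesort(a[mid:]))

def merge(left, right):
    """Merge two sorted lists in O(len(left) + len(right))."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]   # at most one of these tails is non-empty

# The example from the next slides:
assert mergesort([7, 3, 2, 9, 1, 6, 4, 5]) == [1, 2, 3, 4, 5, 6, 7, 9]
```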

  21. Merge Sort - Split
{7,3,2,9,1,6,4,5}
{7,3,2,9} {1,6,4,5}
{7,3} {2,9} {1,6} {4,5}
{7} {3} {2} {9} {1} {6} {4} {5}

  22. Merge Sort - Merge
{7} {3} {2} {9} {1} {6} {4} {5}
{3,7} {2,9} {1,6} {4,5}
{2,3,7,9} {1,4,5,6}
{1,2,3,4,5,6,7,9}

  23. Merge sort complexity • Time • Total cost of all splits? • Cost of each merge level in the tree? • How many merge levels in the tree? • Space • Is extra space needed? (outside the array A) • If so, how much?

  24. Series

  25. Geometric Series
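The formulas on this slide did not survive the transcript; the standard geometric-series identities presumably shown are:

```latex
\sum_{i=0}^{n} x^i = \frac{x^{n+1} - 1}{x - 1} \quad (x \neq 1),
\qquad
\sum_{i=0}^{\infty} x^i = \frac{1}{1 - x} \quad (|x| < 1).
```

For x > 1 the finite sum is Θ(x^n): the last term dominates, which is why geometric cost sequences in algorithm analysis are charged to their largest level.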

  26. Harmonic series why?
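This slide's formulas were also lost in the transcript; the standard harmonic-series bound, presumably the content here, is:

```latex
H_n = \sum_{i=1}^{n} \frac{1}{i},
\qquad
\ln(n+1) \;\le\; H_n \;\le\; 1 + \ln n,
\qquad\text{so } H_n = \Theta(\log n).
```

The "why?" is answered by the integral-bound technique of the upcoming slides: compare the sum of 1/i with the integral of 1/x.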

  27. Products why?
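The formulas here were lost as well; presumably the slide shows that logarithms turn products into sums, with n! as the key example:

```latex
\log \prod_{i=1}^{n} f(i) = \sum_{i=1}^{n} \log f(i),
\qquad
\log n! = \sum_{i=1}^{n} \log i = \Theta(n \log n).
```

The "why?" again comes from bounding the sum of log i by integrals of log x, as the following slides do.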

  28. Using integrals to bound sums If f is a monotonically growing function, then:
∫[m-1, n] f(x) dx ≤ Σ_{i=m..n} f(i) ≤ ∫[m, n+1] f(x) dx

  29. f(x1)+f(x2)+f(x3)+f(x4) <= f(x2)+f(x3)+f(x4)+f(x5)

  30. some formula manipulation....

  31. Example of use

  32. using the integral bounds

  33. concluding...

  34. Sorting lower bound • We will use this bound on log(n!) in proving a lower bound on sorting
