Evaluating Code Efficiency: Algorithm Analysis • What is the task to be accomplished? • Calculating the average grade for a given student • Finding the nth element in a sequence • What are the time / space requirements?
Algorithm Analysis • Algorithm: a finite set of instructions that, if followed, accomplishes a particular task. Algorithm analysis covers: • Space complexity: how much memory is required • Time complexity: how much time does it take to run the algorithm • We worry mostly about time complexity! • We deal with estimates, which makes life a lot easier
Time Complexity • Often more important than space complexity • More and more space is available, but time is still a problem • Researchers estimate that computing various transformations of a single DNA chain for one single protein on a 1 THz computer would take about a year to run to completion • An algorithm's running time is therefore an important issue
Running time • We can evaluate the worst case, average case, or best case. • Suppose the program includes an if statement that may or may not execute: the running time varies. • We assume the worst case will run! • Typically algorithms are measured by their worst-case running time (big-Oh)
Big-Oh Running Time • Evaluates an algorithm independently of the hardware and software • Describes the running time of an algorithm in general terms
Algorithm Analysis • Analyze in terms of primitive operations, e.g.: • An addition = 1 operation • An assignment = 1 operation • Calling a function or returning from a function = 1 operation • Indexing into an array = 1 operation • A comparison = 1 operation • Analysis: count the number of primitive operations executed by the algorithm • Remember, we count the worst-case scenario!
What does this function do?

def ourFunc(ls):                      # line 1
    x = ls[0]                         # line 2
    y = 0                             # line 3
    for ct in range(1, len(ls)):      # line 4
        if x < ls[ct]:                # line 5
            x = ls[ct]                # line 6
            y = ct                    # line 7
    return x                          # line 8

How many operations?
Line 1: 1 count
Line 2: 2 counts
Line 3: 1 count
Line 4: 2 * (list length – 1)
Line 5: 2 * (list length – 1)
Line 6: 2 * (list length – 1)
Line 7: 1 * (list length – 1)
Line 8: 1 count
Total: 1 + 2 + 1 + 7 * (list length – 1) + 1 = 7 * (list length) – 2
Algorithm Analysis • We simplify the analysis by getting rid of unneeded information • We drop constants when expressing big-Oh • E.g., if we have a program that runs in 3n + 2 time, we say that it runs in O(n) • We drop lower-order terms when expressing big-Oh • E.g., if we have a function that runs in polynomial time (4n^4 + 300n^3 + 7n + 2), we can say that it runs in O(n^4) • Why? Because beyond a certain point the lower-order terms are subsumed by the highest-order term • E.g., once n reaches 500, the 4n^4 term (about 2.5 × 10^11) is already far larger than the 300n^3 term (about 3.75 × 10^10), let alone 7n • Hence we get O(n^4) for this polynomial.
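To see why the lower-order terms drop out, here is a quick numeric sketch (plain Python; the polynomial is the one from the slide):

```python
# Compare the full polynomial 4n^4 + 300n^3 + 7n + 2 with its leading
# term alone: as n grows, the ratio approaches 1, so only the leading
# term matters for big-Oh.
def poly(n):
    return 4 * n**4 + 300 * n**3 + 7 * n + 2

def leading(n):
    return 4 * n**4

for n in (10, 100, 1000):
    print(n, poly(n) / leading(n))
```

At n = 10 the lower-order terms still matter (the ratio is about 8.5), but by n = 1000 the ratio is under 1.08: this is the "after a certain point" on the slide.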
Back to Sorting

def SelectionSort(ls):
    for i in range(len(ls)):
        s = ls[i]
        si = i
        for j in range(i, len(ls)):
            if ls[j] < s:
                s = ls[j]
                si = j
        ls[si] = ls[i]
        ls[i] = s
    return ls

a = [3, 5, 2, 7, 1]
print(a)
print("=>", SelectionSort(a))

Analysis?
a = [1, 2, 3, 4, 5]
a = [5, 4, 3, 2, 1]
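The "Analysis?" prompt can be checked empirically. This sketch (my own instrumented variant, not part of the slide code) counts the comparisons selection sort makes; the count is the same for sorted and reverse-sorted input, because the inner loop always scans to the end of the list:

```python
def selection_sort_count(ls):
    """Selection sort that also returns how many comparisons it made."""
    comparisons = 0
    for i in range(len(ls)):
        s, si = ls[i], i
        for j in range(i, len(ls)):
            comparisons += 1
            if ls[j] < s:
                s, si = ls[j], j
        ls[si], ls[i] = ls[i], s
    return ls, comparisons

# n + (n-1) + ... + 1 = n(n+1)/2 comparisons either way: O(n^2)
print(selection_sort_count([1, 2, 3, 4, 5])[1])  # 15
print(selection_sort_count([5, 4, 3, 2, 1])[1])  # 15
```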
def insertionsort(ls):
    for i in range(0, len(ls)):
        x = ls[i]
        j = i - 1
        while (j >= 0) and (ls[j] > x):
            ls[j + 1] = ls[j]
            j = j - 1
        ls[j + 1] = x
    return ls

ls = [3, 1, 2, 6, 4, 7]
insertionsort(ls)
print(ls)

Analysis?
a = [1, 2, 3, 4, 5]
a = [5, 4, 3, 2, 1]
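Insertion sort behaves differently on the two test inputs above. This instrumented sketch (a restructured copy of the slide's function, added here for illustration) counts the ls[j] > x comparisons:

```python
def insertion_sort_count(ls):
    """Insertion sort instrumented to count ls[j] > x comparisons."""
    comparisons = 0
    for i in range(len(ls)):
        x = ls[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if ls[j] <= x:
                break          # element already in place
            ls[j + 1] = ls[j]  # shift larger element right
            j -= 1
        ls[j + 1] = x
    return ls, comparisons

print(insertion_sort_count([1, 2, 3, 4, 5])[1])  # already sorted: 4, i.e. O(n)
print(insertion_sort_count([5, 4, 3, 2, 1])[1])  # reversed: 10, i.e. O(n^2)
```

So insertion sort's best case (sorted input) is linear, while its worst case (reversed input) is quadratic.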
def bubblesort(list2):
    i = 0
    swap_test = False
    while (i < len(list2) - 1) and (swap_test == False):
        swap_test = True
        for j in range(0, len(list2) - i - 1):
            if list2[j] > list2[j + 1]:
                list2[j], list2[j + 1] = list2[j + 1], list2[j]
                swap_test = False
        i += 1
    return list2

ls = [3, 2, 4, 7, 1]
print(bubblesort(ls))

Analysis?
a = [1, 2, 3, 4, 5]
a = [5, 4, 3, 2, 1]
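The swap_test flag gives bubble sort an early exit, so the two test inputs behave differently. This instrumented sketch (my variant of the slide's code, with the flag's sense inverted for readability) counts comparisons:

```python
def bubble_sort_count(ls):
    """Bubble sort with the early-exit flag, counting element comparisons."""
    comparisons = 0
    i = 0
    swapped = True
    while i < len(ls) - 1 and swapped:
        swapped = False
        for j in range(0, len(ls) - i - 1):
            comparisons += 1
            if ls[j] > ls[j + 1]:
                ls[j], ls[j + 1] = ls[j + 1], ls[j]
                swapped = True
        i += 1
    return ls, comparisons

print(bubble_sort_count([1, 2, 3, 4, 5])[1])  # one pass, 4 comparisons: O(n)
print(bubble_sort_count([5, 4, 3, 2, 1])[1])  # 4 + 3 + 2 + 1 = 10: O(n^2)
```

A sorted list triggers the early exit after a single pass, while a reversed list forces the full quadratic work.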
def f(x):
    for j in range(x):
        for k in range(x):
            print(j * k)

• Running time?

def f(x):
    for j in range(0, x):
        for k in range(0, x):
            for l in range(0, x):
                print(largest(j, k, l))

• Running time?
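As a check on the two running-time questions above, this sketch counts how many times the innermost statement executes (x = 10 is an arbitrary choice, and the prints are replaced by counters):

```python
x = 10  # arbitrary input size for the demonstration

count2 = 0
for j in range(x):
    for k in range(x):
        count2 += 1  # stands in for print(j * k)

count3 = 0
for j in range(x):
    for k in range(x):
        for l in range(x):
            count3 += 1  # stands in for the three-argument print

print(count2, count3)  # x**2 and x**3 iterations: O(n^2) and O(n^3)
```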
Binary Search

ls = [1, 3, 4, 6, 8, 10, 11, 15, 17, 22, 26, 28, 34, 46, 52, 57]
index: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15

How long to find 11? How about 57? How about 1?

Log n: for a list whose length n falls between 2^(x-1) and 2^x, it takes at most x steps to accomplish the task (above: 16 items = 2^4, so it takes at most 4 attempts to find the number).
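The "at most x steps" bound can be tabulated directly; a small sketch using the slide's convention (a list of n ≤ 2^x items needs at most x looks):

```python
import math

# Worst-case number of looks for binary search on n sorted items,
# following the slide's bound of ceil(log2(n)) attempts.
for n in (16, 100, 1000, 1_000_000):
    print(n, math.ceil(math.log2(n)))
```

Doubling the list length adds only one extra look, which is why log n growth is so much better than linear search.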
Search Find where num x is in a sorted list: list = [1,3,6,8,9,11,13,16,19,23,26,27,34,37,42,45] Find 3? Find 23? Find 45?
Better Search: If list is sorted, look at middle value in list. If num == middle value, we’ve found our num If num>middle value, look at top half of list If num < middle value, look at bottom half of list We repeat this until we either find the value or we know the value isn’t in the list.
[1,3,6,8,9,11,13,16,19,23,26,27,34,37,42,45] • Looking for 9: • Look at ls[8] (19) • 9 is less than 19 • [1,3,6,8,9,11,13,16,19,23,26,27,34,37,42,45] • Look at ls[3] (8) • 9 is greater than 8 • [1,3,6,8,9,11,13,16,19,23,26,27,34,37,42,45] • Look at ls[5] (11) • 9 is less than 11 • [1,3,6,8,9,11,13,16,19,23,26,27,34,37,42,45] • Look at ls[4] (9) • 9 found! • Return 4 • How many looks at most?
Binary Search (iterative and recursive):

def binseait(ls, num):
    x = -1
    y = len(ls) // 2
    first, last = 0, len(ls) - 1
    while first <= last:
        print('first is ' + str(first) + ' last is ' + str(last))
        if ls[y] == num:
            x = y
            first = last + 1      # found: force the loop to exit
        elif ls[y] < num:
            first = y + 1
        else:
            last = y - 1
        y = first + ((last - first) // 2)
    return x

nums = [1, 3, 6, 8, 9, 11, 13, 16, 19, 23, 26, 27, 34, 37, 42, 45]
print(binseait(nums, 16))

def binsearec(ls, num, ind):
    print('ls is now: ' + str(ls))
    if len(ls) == 0:
        return -1
    elif (len(ls) == 1) and (ls[0] == num):
        return ind
    elif len(ls) == 1:
        return -1
    x = len(ls) // 2
    if ls[x] == num:
        return ind + x
    elif ls[x] > num:
        return binsearec(ls[0:x], num, ind)   # slice excludes the middle element
    else:
        return binsearec(ls[x + 1:len(ls)], num, ind + x + 1)

print(binsearec(nums, 45, 0))
Sorting: Can we do better? • merge sort idea : • combining two sets of ordered data • Goal: Combine the two sorted sequences in one larger sorted sequence • Merge sort starts small and merges longer and longer sequences
Merge Algorithm (Two Sequences) Merging two sequences: • Access the first item from both sequences • While neither sequence is finished • Compare the current items of both • Copy smaller current item to the output • Access next item from that input sequence • Copy any remaining from first sequence to output • Copy any remaining from second to output
Example (Merge part): 3, 5, 8, 15 and 4, 9, 12, 20, 21
1. Compare 3 and 4: 3 goes into the new sorted list (3), and we advance the first sequence.
2. Compare 5 and 4: 4 goes into the new list (3, 4), and we advance the second sequence.
3. Compare 5 and 9: 5 goes into the list (3, 4, 5), and we advance the first sequence.
4. Compare 8 and 9: 8 goes into the list (3, 4, 5, 8), and we advance the first sequence.
5. Compare 15 and 9: 9 goes into the list (3, 4, 5, 8, 9), and we advance the second sequence.
6. Compare 15 and 12: 12 goes into the list (3, 4, 5, 8, 9, 12), and we advance the second sequence.
7. Compare 15 and 20: 15 goes into the list (3, 4, 5, 8, 9, 12, 15).
8. We reached the end of the first sequence, so we copy the rest of the second sequence into the list: (3, 4, 5, 8, 9, 12, 15, 20, 21).
Using Merge to Sort • So far, we’ve merged 2 already sorted lists. • Using merge to sort entire list: • Unordered list: divide into 2 lists • Can we merge these lists? • No – these lists are also unordered. • Divide each of these lists into 2 lists • Continue to divide until each list contains one element • Is a one-element list ordered? • Yes! • Now we can start merging lists.
Merge Sort Algorithm Overview: • Split array into two halves • Sort the left half (recursively is probably easiest) • Sort the right half (recursively is probably easiest) • Merge the two sorted halves
Merge Sort pseudo-Algorithm

if totalSize <= 1
    return   # no sorting required
halfSize = totalSize / 2
Create a leftsidelist
Copy elements 0 .. halfSize - 1 to leftsidelist
Create a rightsidelist
Copy elements halfSize .. totalSize - 1 to rightsidelist
Sort leftsidelist recursively
Sort rightsidelist recursively
Merge leftsidelist and rightsidelist into newlist and return newlist
MergeSort in Python:

def merge(left, right):
    result = []
    i, j = 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result += left[i:]
    result += right[j:]
    return result

def mergesort(lista):
    if len(lista) <= 1:
        return lista
    else:
        middle = len(lista) // 2
        # keep dividing the left side into lists half the size of the original
        left = mergesort(lista[:middle])
        # keep dividing the right side into lists half the size of the original
        right = mergesort(lista[middle:])
        # now we have a bunch of small lists that we need to keep
        # merging back into larger, sorted lists
        return merge(left, right)

listb = [8, 2, 3, 1, 9, 5, 7, 0, 4, 6]
print(mergesort(listb))
Analysis of Merge • Two input lists, total length n elements • Must move each element to the output • Merge time is O(n) • Must store both the input and output lists • An array cannot be merged in place • Additional space needed: O(n)
Merge Sort Analysis • Splitting/copying n elements to sublists: O(n) • Merging back into original list: O(n) • Recursive calls: 2, each of size n/2 • Their total non-recursive work: O(n) • Next level: 4 calls, each of size n/4 • Non-recursive work again O(n) • Size sequence: n, n/2, n/4, ..., 1 • Number of levels = log n • Total work: O(n log n) Sorted lists? Reverse order lists?
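The level-by-level argument above can be checked with a small recurrence sketch (my own helper, not from the slides): the total merge work satisfies T(n) = T(⌈n/2⌉) + T(⌊n/2⌋) + n, which equals n · log2(n) exactly when n is a power of two.

```python
import math

def merge_work(n):
    """Total elements copied by merge sort's merge steps on n items."""
    if n <= 1:
        return 0
    half = n // 2
    return merge_work(half) + merge_work(n - half) + n

for n in (16, 256, 4096):
    print(n, merge_work(n), int(n * math.log2(n)))
```

This also answers the closing prompt: the split/merge pattern is the same whether the input is sorted or reversed, so merge sort is O(n log n) in every case.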