CS 303 Design and Analysis of Algorithms


Mid-term

CS 303 Design and Analysis of Algorithms
Review for Midterm
Dong Xu (based on class notes of David Luebke)

- 12:55-1:55pm, Friday, March 19
- Closed book; bring your calculator
- 30% of your final score
- Office hours during March 15-19:
  - Dong: 11am-noon and 3pm-4pm, Wed, Mar 17 (no office hour on Mar 19 due to travel)
  - Ashwin: additional office hours at 9:30am-noon, Fri, Mar 19

Review of Topics

- Asymptotic notation
- Solving recurrences
- Sorting algorithms: insertion sort, merge sort, heap sort, quick sort, counting sort, radix sort, bucket sort
- Structures for dynamic sets: priority queues, hash tables

Proof by Induction

- Claim: S(n) is true for all n >= k (e.g., k = 0)
- Basis: show the formula is true when n = k
- Inductive hypothesis: assume the formula is true for an arbitrary n
- Step: show that the formula is then true for n+1

Review: Analyzing Algorithms

- We are interested in asymptotic analysis: the behavior of algorithms as the problem size gets large
- Constants and low-order terms don't matter
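
As a worked instance of the induction template above (an added example, not from the original slides), here is the standard proof of the claim S(n): 1 + 2 + ... + n = n(n+1)/2 for all n >= 1.

    Basis (n = 1): the sum is 1, and 1*(1+1)/2 = 1, so S(1) holds.
    Step: assume S(n). Then
        1 + 2 + ... + n + (n+1) = n(n+1)/2 + (n+1) = (n+1)(n+2)/2,
    which is exactly S(n+1). By induction, S(n) holds for all n >= 1.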

Insertion Sort / Analyzing Insertion Sort

    InsertionSort(A, n)                              Statement effort
      for i = 2 to n                                 c1 * n
        key = A[i]                                   c2 * (n-1)
        j = i - 1                                    c3 * (n-1)
        while (j > 0) and (A[j] > key)               c4 * T
          A[j+1] = A[j]                              c5 * (T - (n-1))
          j = j - 1                                  c6 * (T - (n-1))
        A[j+1] = key                                 c7 * (n-1)

where T = t_2 + t_3 + ... + t_n and t_i is the number of while-test evaluations for the i-th iteration of the for loop.

- T(n) = c1 n + c2 (n-1) + c3 (n-1) + c4 T + c5 (T - (n-1)) + c6 (T - (n-1)) + c7 (n-1) = c8 T + c9 n + c10
- What can T be?
  - Best case: the inner loop body is never executed, so t_i = 1 and T(n) is a linear function of n
  - Worst case: the inner loop body is executed for all previous elements, so t_i = i and T(n) is a quadratic function of n
- If T is a quadratic function, which terms in the above equation matter?

Upper Bound Notation

- We say insertion sort's run time is O(n^2); properly we should say its run time is in O(n^2)
- Read O as "Big-O" (you'll also hear it called "order")
- In general, a function f(n) is O(g(n)) if there exist positive constants c and n_0 such that f(n) <= c g(n) for all n >= n_0
- Formally: O(g(n)) = { f(n) : there exist positive constants c and n_0 such that f(n) <= c g(n) for all n >= n_0 }

Big O Fact

- A polynomial of degree k is O(n^k)
- Proof: suppose f(n) = b_k n^k + b_{k-1} n^{k-1} + ... + b_1 n + b_0, and let a_i = |b_i|. Then for n >= 1,
      f(n) <= a_k n^k + a_{k-1} n^{k-1} + ... + a_1 n + a_0
           <= n^k (a_k + a_{k-1} + ... + a_0) = c n^k

Lower Bound Notation

- We say insertion sort's run time is Ω(n)
- In general, a function f(n) is Ω(g(n)) if there exist positive constants c and n_0 such that 0 <= c g(n) <= f(n) for all n >= n_0

Asymptotic Tight Bound

- A function f(n) is Θ(g(n)) if there exist positive constants c_1, c_2, and n_0 such that c_1 g(n) <= f(n) <= c_2 g(n) for all n >= n_0
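
A runnable C version of the insertion sort pseudocode above (an added sketch, not from the slides; it uses 0-based array indexing where the slides use 1-based):

    #include <stdio.h>

    /* Sort A[0..n-1] in place: Θ(n^2) worst case, Θ(n) on already-sorted input. */
    void insertion_sort(int A[], int n) {
        for (int i = 1; i < n; i++) {
            int key = A[i];
            int j = i - 1;
            /* Shift elements greater than key one slot to the right. */
            while (j >= 0 && A[j] > key) {
                A[j + 1] = A[j];
                j--;
            }
            A[j + 1] = key;
        }
    }

    int main(void) {
        int A[] = {5, 2, 4, 6, 1, 3};
        insertion_sort(A, 6);
        for (int i = 0; i < 6; i++) printf("%d ", A[i]);
        printf("\n");   /* prints: 1 2 3 4 5 6 */
        return 0;
    }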

Other Asymptotic Notations

- A function f(n) is o(g(n)) if for every positive constant c there exists n_0 such that f(n) < c g(n) for all n >= n_0
- A function f(n) is ω(g(n)) if for every positive constant c there exists n_0 such that c g(n) < f(n) for all n >= n_0

Notation Summary

- A = O(B) is like A <= B
- A = o(B) is like A < B
- A = Θ(B) is like A = B
- A = Ω(B) is like A >= B
- A = ω(B) is like A > B

Review: Recurrences

- Recurrence: an equation that describes a function in terms of its value on smaller inputs
- Examples:
      s(n) = 0            if n = 0
      s(n) = n + s(n-1)   if n > 0

      T(n) = c                 if n = 1
      T(n) = a T(n/b) + c n    if n > 1

Review: Solving Recurrences

- Substitution method
- Recursion tree method
- Master method

Review: Substitution Method

- Substitution method: guess the form of the answer, then use induction to find the constants and show that the solution works
- Example: T(n) = 2T(n/2) + Θ(n) has solution T(n) = Θ(n lg n); we can show that this holds by induction

Substitution Method

- Our goal: show that T(n) = 2T(⌊n/2⌋) + n = O(n lg n)
- Thus we need to show that T(n) <= c n lg n for an appropriate choice of c
- Inductive hypothesis: assume T(⌊n/2⌋) <= c ⌊n/2⌋ lg ⌊n/2⌋
- Substitute back into the recurrence to show that T(n) <= c n lg n follows when c >= 1
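
Filling in the substitution step sketched above (an added derivation using the slide's inductive hypothesis):

    T(n) = 2 T(⌊n/2⌋) + n
        <= 2 c ⌊n/2⌋ lg ⌊n/2⌋ + n
        <= c n lg(n/2) + n
         = c n lg n - c n + n
        <= c n lg n              whenever c >= 1.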

Review: Recursion Tree

- Recursion tree method: expand the recurrence into a tree form
- Work some algebra to express the total as a summation
- Evaluate the summation

Review: The Master Theorem

- Given: a divide-and-conquer algorithm that divides a problem of size n into a subproblems, each of size n/b
- Let the cost of each stage (i.e., the work to divide the problem plus combine the solved subproblems) be described by the function f(n)
- Then the Master Theorem gives us a cookbook for the algorithm's running time:

If T(n) = a T(n/b) + f(n), then

    T(n) = Θ(n^{log_b a})         if f(n) = O(n^{log_b a - ε}) for some constant ε > 0
    T(n) = Θ(n^{log_b a} lg n)    if f(n) = Θ(n^{log_b a})
    T(n) = Θ(f(n))                if f(n) = Ω(n^{log_b a + ε}) for some constant ε > 0,
                                  AND a f(n/b) <= c f(n) for some c < 1 and all large n

Review: Merge Sort

    MergeSort(A, left, right)
      if (left < right)
        mid = floor((left + right) / 2)
        MergeSort(A, left, mid)
        MergeSort(A, mid+1, right)
        Merge(A, left, mid, right)
    // Merge() takes two sorted subarrays of A and
    // merges them into a single sorted subarray of A.
    // Merge() takes O(n) time, n = length of A

Review: Analysis of Merge Sort

    Statement                               Effort
    MergeSort(A, left, right)               T(n)
      if (left < right)                     Θ(1)
        mid = floor((left + right) / 2)     Θ(1)
        MergeSort(A, left, mid)             T(n/2)
        MergeSort(A, mid+1, right)          T(n/2)
        Merge(A, left, mid, right)          Θ(n)

- So T(n) = Θ(1) when n = 1, and T(n) = 2T(n/2) + Θ(n) when n > 1
- Solving this recurrence (how?) gives T(n) = Θ(n lg n)

Review: Heaps

- A heap is a complete binary tree, usually represented as an array:
  [Figure: a max-heap drawn as a complete binary tree with root 16]
  A = 16 14 10 8 7 9 3 2 4 1
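
Returning to the MergeSort pseudocode above, a runnable C version (an added sketch, not from the slides; 0-based indexing, with an explicit scratch buffer for Merge). The recurrence T(n) = 2T(n/2) + Θ(n) solves to Θ(n lg n) by case 2 of the Master Theorem.

    #include <stdio.h>
    #include <string.h>

    /* Merge the sorted runs A[l..m] and A[m+1..r] using scratch buffer tmp. O(n). */
    static void merge(int A[], int tmp[], int l, int m, int r) {
        int i = l, j = m + 1, k = l;
        while (i <= m && j <= r)
            tmp[k++] = (A[i] <= A[j]) ? A[i++] : A[j++];
        while (i <= m) tmp[k++] = A[i++];
        while (j <= r) tmp[k++] = A[j++];
        memcpy(A + l, tmp + l, (size_t)(r - l + 1) * sizeof(int));
    }

    static void merge_sort(int A[], int tmp[], int l, int r) {
        if (l < r) {
            int m = l + (r - l) / 2;
            merge_sort(A, tmp, l, m);
            merge_sort(A, tmp, m + 1, r);
            merge(A, tmp, l, m, r);
        }
    }

    int main(void) {
        int A[] = {5, 2, 4, 7, 1, 3, 2, 6}, tmp[8];
        merge_sort(A, tmp, 0, 7);
        for (int i = 0; i < 8; i++) printf("%d ", A[i]);
        printf("\n");   /* prints: 1 2 2 3 4 5 6 7 */
        return 0;
    }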

Review: Heaps

To represent a heap as an array:

    Parent(i): return ⌊i/2⌋
    Left(i):   return 2*i
    Right(i):  return 2*i + 1

Review: The Heap Property

- Heaps also satisfy the heap property: A[Parent(i)] >= A[i] for all nodes i > 1
- In other words, the value of a node is at most the value of its parent
- The largest value is thus stored at the root, A[1]
- Because the heap is a complete binary tree, the height of any node is O(lg n)

Review: Heapify()

- Heapify() maintains the heap property
- Given: a node i in the heap with children l and r, and two subtrees rooted at l and r, each assumed to be a heap
- Action: let the value of the parent node float down so the subtree rooted at i satisfies the heap property
  - If A[i] < A[l] or A[i] < A[r], swap A[i] with the largest of A[l] and A[r]
  - Recurse on that subtree
- Running time: O(h), where h = height of the heap = O(lg n)

Review: BuildHeap()

- BuildHeap() builds a heap bottom-up by running Heapify() on successive subarrays
- Walk backwards through the array from ⌊n/2⌋ to 1, calling Heapify() on each node (see the C sketch below)
- The order of processing guarantees that the children of node i are heaps when i is processed
- It is easy to show that the running time is O(n lg n); it can be shown to be O(n)
- Key observation: most subheaps are small
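
A C sketch of Heapify() and BuildHeap() (added, not from the slides; it uses 0-based indexing, so Left(i) = 2i+1, Right(i) = 2i+2, and Parent(i) = (i-1)/2):

    #include <stdio.h>

    /* Sift A[i] down until the subtree rooted at i is a max-heap.
       Assumes the subtrees rooted at i's children are already max-heaps. O(h). */
    void heapify(int A[], int n, int i) {
        for (;;) {
            int l = 2 * i + 1, r = 2 * i + 2, largest = i;
            if (l < n && A[l] > A[largest]) largest = l;
            if (r < n && A[r] > A[largest]) largest = r;
            if (largest == i) return;             /* heap property holds at i */
            int t = A[i]; A[i] = A[largest]; A[largest] = t;
            i = largest;                          /* continue down that subtree */
        }
    }

    /* Bottom-up construction: nodes n/2 .. n-1 are leaves, hence already heaps. */
    void build_heap(int A[], int n) {
        for (int i = n / 2 - 1; i >= 0; i--)
            heapify(A, n, i);
    }

    int main(void) {
        int A[] = {4, 1, 3, 2, 16, 9, 10, 14, 8, 7};
        build_heap(A, 10);
        for (int i = 0; i < 10; i++) printf("%d ", A[i]);
        printf("\n");   /* 16 14 10 8 7 9 3 2 4 1, the heap array shown earlier */
        return 0;
    }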

Priority Queue Operations Insert(S, x) inserts the element x into set S Maximum(S) returns the element of S with the maximum key ExtractMax(S) removes and returns the element of S with the maximum key Implementing Priority Queues HeapInsert(A, key) // what s running time? heap_size[a] ++; i = heap_size[a]; while (i > 1 AND A[Parent(i)] < key) A[i] = A[Parent(i)]; i = Parent(i); A[i] = key; 31 3/17/2004 32 3/17/2004 Implementing Priority Queues HeapMaximum(A) // This one is really tricky: return A[i]; Implementing Priority Queues HeapExtractMax(A) if (heap_size[a] < 1) error; max = A[1]; A[1] = A[heap_size[A]] heap_size[a] --; Heapify(A, 1); return max; 33 3/17/2004 34 3/17/2004 Review: Quicksort Quicksort pros: Sorts in place Sorts O(n lg n) in the average case Very efficient in practice Quicksort cons: Sorts O(n 2 ) in the worst case Naïve implementation: worst-case = sorted Even picking a different pivot, some particular input will take O(n 2 ) time Review: Quicksort Another divide-and-conquer algorithm The array A[p..r] is partitioned into two nonempty subarrays A[p..q] and A[q+1..r] Invariant: All elements in A[p..q] are less than all elements in A[q+1..r] The subarrays are recursively quicksorted No combining step: two subarrays form an already-sorted array 35 3/17/2004 36 3/17/2004 6

Review: Quicksort Code Review: Partition Code Quicksort(A, p, r) if (p < r) q = Partition(A, p, r); Quicksort(A, p, q); Quicksort(A, q+1, r); Partition(A, p, r) x = A[p]; i = p - 1; j = r + 1; while (TRUE) repeat j--; until A[j] <= x; repeat i++; until A[i] >= x; if (i < j) Swap(A, i, j); else return j; partition() runs in O(n) time 37 3/17/2004 38 3/17/2004 Review: Analyzing Quicksort What will be the worst case for the algorithm? Partition is always unbalanced What will be the best case for the algorithm? Partition is perfectly balanced Which is more likely? The latter, by far, except... Will any particular input elicit the worst case? Yes: Already-sorted input Review: Analyzing Quicksort In the worst case: T(1) = Θ(1) T(n) = T(n - 1) + Θ(n) Works out to T(n) = Θ(n 2 ) 39 3/17/2004 40 3/17/2004 Review: Analyzing Quicksort In the best case: T(n) = 2T(n/2) + Θ(n) Works out to T(n) = Θ(n lg n) Review: Analyzing Quicksort Average case works out to T(n) = Θ(n lg n) Key idea: A limited number of unbalanced Partition() is OK 41 3/17/2004 42 3/17/2004 7

Review: Improving Quicksort The real liability of quicksort is that it runs in O(n 2 ) on already-sorted input Book discusses two solutions: Randomize the input array, OR Pick a random pivot element How do these solve the problem? By insuring that no particular input can be chosen to make quicksort run in O(n 2 ) time Sorting Summary Insertion sort: Easy to code Fast on small inputs (less than ~50 elements) Fast on nearly-sorted inputs O(n 2 ) worst case O(n 2 ) average (equally-likely inputs) case O(n 2 ) reverse-sorted case 43 3/17/2004 44 3/17/2004 Sorting Summary Merge sort: Divide-and-conquer: Split array in half Recursively sort subarrays Linear-time merge step O(n lg n) worst case Doesn t sort in place Heap sort: Sorting Summary Uses the very useful heap data structure Complete binary tree Heap property: parent key > children s keys O(n lg n) worst case Sorts in place Fair amount of shuffling memory around 45 3/17/2004 46 3/17/2004 Quick sort: Sorting Summary Divide-and-conquer: Partition array into two subarrays, recursively sort All of first subarray < all of second subarray No merge step needed! O(n lg n) average case Fast in practice O(n 2 ) worst case Naïve implementation: worst case on sorted input Address this with randomized quicksort Review: Comparison Sorts Comparison sorts: O(n lg n) at best Model sort with decision tree Path down tree = execution trace of algorithm Leaves of tree = possible permutations of input Tree must have n! leaves, so O(n lg n) height 47 3/17/2004 48 3/17/2004 8

Review: Comparison Sorts Review: Counting Sort Sorting Time Space Stability In place Worst Best Average Bubble Sort Θ(n 2 ) Θ(n 2 ) Θ(n 2 ) Θ(n) Yes Yes Insertion Sort O(n) Θ(n 2 ) Θ(n 2 ) Θ(n) Yes Yes Selection Sort O(n 2 ) O(n 2 ) O(n 2 ) Θ(n) Yes Yes Merge Sort O(n lg(n)) O(n lg(n)) O(n lg(n)) Θ(n lg(n)) Yes No Heap Sort Θ(n lg(n)) Ω(n lg(n)) O(n lg(n)) Θ(n) No Yes Quick Sort O(n lg(n)) Θ(n 2 ) O(n lg(n) ) Θ(n) Yes Yes Counting sort: Assumption: input is in the range 1..k Basic idea: Count number of elements k each element i Use that number to place i in position k of sorted array No comparisons! Runs in time O(n + k) Stable sort Does not sort in place: O(n) array to hold sorted output O(k) array for scratch storage 49 3/17/2004 50 3/17/2004 Review: Counting Sort Review: Radix Sort 1 CountingSort(A, B, k) 2 for i=1 to k 3 C[i]= 0; 4 for j=1 to n 5 C[A[j]] += 1; 6 for i=2 to k 7 C[i] = C[i] + C[i-1]; 8 for j=n downto 1 9 B[C[A[j]]] = A[j]; 10 C[A[j]] -= 1; Radix sort: Assumption: input has d digits ranging from 0 to k Basic idea: Sort elements by digit starting with least significant Use a stable sort (like counting sort) for each stage Each pass over n numbers with d digits takes time O(n+k), so total time O(dn+dk) When d is constant and k=o(n), takes O(n) time Fast! Stable! Simple! Doesn t sort in place 51 3/17/2004 52 3/17/2004 Review: Hashing Tables Motivation: symbol tables A compiler uses a symbol table to relate symbols to associated data Symbols: variable names, procedure names, etc. Associated data: memory location, call graph, etc. For a symbol table (also called a dictionary), we care about search, insertion, and deletion We typically don t care about sorted order Review: Hash Tables More formally: Given a table T and a record x, with key (= symbol) and satellite data, we need to support: Insert (T, x) Delete (T, x) Search(T, x) Don t care about sorting the records Hash tables support all the above in O(1) expected time 53 3/17/2004 54 3/17/2004 9

Review: Direct Addressing Review: Hash Functions Suppose: The range of keys is 0..m-1 Keys are distinct The idea: Use key itself as the address into the table Set up an array T[0..m-1] in which T[i] = x if x T and key[x] = i T[i] = NULL otherwise This is called a direct-address table Next problem: collision U (universe of keys) k 1 K (actual keys) k 2 k 4 k 5 k3 T 0 h(k 1 ) h(k 4 ) h(k 2 ) = h(k 5 ) h(k 3 ) m -1 55 3/17/2004 56 3/17/2004 Review: Resolving Collisions How can we solve the problem of collisions? Open addressing To insert: if slot is full, try another slot, and another, until an open slot is found (probing) To search, follow same sequence of probes as would be used when inserting the element Chaining Keep linked list of elements in slots Upon collision, just add new element to list 57 3/17/2004 Review: Chaining Chaining puts elements that hash to the same slot in a linked list: U (universe of keys) k 1 K (actual keys) k 4 k 5 k 7 k 2 k k 3 k 8 6 T k 1 k 4 k 5 k 2 k 3 k 8 k 6 k 7 58 3/17/2004 Review: Analysis Of Hash Tables Simple uniform hashing: each key in table is equally likely to be hashed to any slot Load factor α = n/m = average # keys per slot Average cost of unsuccessful search = O(1+α) Successful search: O(1+ α/2) = O(1+ α) If n is proportional to m, α = O(1) So the cost of searching = O(1) if we size our table appropriately Review: Choosing A Hash Function Choosing the hash function well is crucial Bad hash function puts all elements in same slot A good hash function: Should distribute keys uniformly into slots Should not depend on patterns in the data Methods: Division method Multiplication method 59 3/17/2004 60 3/17/2004 10

Review: The Division Method h(k) = k mod m In words: hash k into a table with m slots using the slot given by the remainder of k divided by m Elements with adjacent keys hashed to different slots: good If keys bear relation to m: bad Upshot: pick table size m = prime number not too close to a power of 2 (or 10) Review: The Multiplication Method For a constant A, 0 < A < 1: h(k) = m (ka - ka ) Fractional part of ka Upshot: Choose m = 2 P Choose A not too close to 0 or 1 Knuth: Good choice for A = ( 5-1)/2 61 3/17/2004 62 3/17/2004 11