
CS 483 - Data Structures and Algorithm Analysis
Lecture VI: Chapter 5, part 2; Chapter 6, part 1
R. Paul Wiegand
George Mason University, Department of Computer Science
March 8, 2006

Outline
1 Topological Sorting
2 Generating Combinatorial Objects
3 Decrease-by-Constant Factor
4 Variable-Size-Decrease Algorithms
5 Transform & Conquer
6 Gaussian Elimination
7 Homework

DFS & BFS Edge-Types
tree edge: an edge encountered by the search that leads to an as-yet unvisited node (DFS & BFS)
back edge: an edge leading to a previously visited vertex other than its immediate predecessor (DFS)
cross edge: an edge leading to a previously visited vertex other than its immediate predecessor (BFS)

Directed Graphs: A Review
A directed graph (digraph) is a graph with directed edges. We can use the same representational constructs: adjacency matrices & adjacency lists. But there are some differences from the undirected case:
  The adjacency matrix need not be symmetric.
  Each edge of the digraph appears in only one adjacency list.
We can still use DFS & BFS to traverse such graphs, but the resulting search forest is often more complicated. There are now four edge types:
tree edge: edge leading to an as-yet unvisited node
back edge: edge leading from some vertex to a previously visited ancestor
forward edge: edge leading from a previously visited ancestor to some vertex
cross edge: all remaining edges

Directed Graphs: More Review
(Figure: an example digraph on vertices a through e and its DFS forest, with the tree edges, a back edge, a forward edge, and a cross edge marked.)
NOTE: A digraph with no back edges has no directed cycles. We call this a directed acyclic graph (DAG).

Representing Dependencies with DAGs
Many real-world situations can be modeled with DAGs. Consider problems involving dependencies (e.g., course prerequisites):
(Figure: prerequisite DAG over the courses CS112, CS211, CS310, CS330, CS483, MAT105, MAT113, MAT114, MAT125.)
If you could take only one course at a time, what order would you choose?
More generally: order the vertices of a DAG such that for every edge, the vertex where the edge starts precedes the vertex where the edge ends.
This problem is called topological sorting.

Two Algorithms for Topological Sorting
Algorithm 1:
  Apply depth-first search.
  Note the order in which the nodes become dead (popped off the traversal stack).
  Reverse that order; that is your answer.
  Why does this work? When vertex v is popped off the stack, no vertex u with an edge (u, v) can be among the vertices popped off before v (otherwise (u, v) would be a back edge).
Algorithm 2 (source removal):
  Identify a source in the digraph (a node with no incoming edges), breaking ties arbitrarily.
  Record, then delete the node along with all edges leaving it.
  Repeat the process on the remaining subgraph.
  When the graph is empty, you are done.
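Algorithm 1 fits in a few lines of Python (a minimal sketch; the adjacency-list dictionary and the vertex names in the usage line are illustrative, not from the slides):

```python
def topological_sort(graph):
    """Topological sort via DFS: record vertices as they become dead, then reverse."""
    visited, order = set(), []

    def dfs(u):
        visited.add(u)
        for v in graph.get(u, []):
            if v not in visited:
                dfs(v)
        order.append(u)  # u is popped off the traversal stack (dead)

    for u in graph:
        if u not in visited:
            dfs(u)
    return order[::-1]  # reversed dead-order is a topological order

# Example (hypothetical DAG): a -> b, a -> c, b -> d, c -> d
# topological_sort({'a': ['b', 'c'], 'b': ['d'], 'c': ['d'], 'd': []})
```

Every edge (u, v) then has u before v in the returned list, exactly the ordering property the slide asks for.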

Example On Algorithm 2
Repeatedly removing a source from the course-prerequisite DAG records the vertices in the order:
  MAT105, CS112, MAT113, MAT125, CS211, MAT114, CS330, CS310, CS483

Reviewing Combinations & Permutations
What is the difference between a combination and a permutation?
combination: the number of ways of picking k unordered outcomes from n possibilities. We often write it as "n choose k":
  C(n, k) = n! / (k! (n - k)!)
permutation: a rearrangement of the elements of an ordered list S into a one-to-one correspondence with S itself. The number of ordered selections of k items from n is:
  P(n, k) = n! / (n - k)!

Generating a Single Random Permutation
PermuteSet(A[0 .. n-1])
  for i ← 0 to n-2
    j ← RandInt(i+1, n-1)
    Swap(A, i, j)
The basic operation is RandInt; the loop is Θ(n).
In general, the items in a permutation can be anything. We think of them as ordered sets {a_0, a_1, ..., a_{n-1}}, but we'll talk about them as lists of integers for simplicity. Example trace, starting from {1,2,3,4,5,6,7,8}:
  {4,2,3,1,5,6,7,8}
  {4,7,3,1,5,6,2,8}
  {4,7,1,3,5,6,2,8}
  {4,7,1,8,5,6,2,3}
  {4,7,1,8,2,6,5,3}
  {4,7,1,8,2,3,5,6}
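A runnable Python sketch of the shuffle. One assumption worth flagging: drawing j from [i, n-1] (the Fisher-Yates shuffle) makes every permutation of n elements equally likely, whereas the slide's RandInt(i+1, n-1) never leaves an element in place and is Sattolo's variant, which produces only cyclic permutations:

```python
import random


def permute_set(a):
    """Shuffle list a in place (Fisher-Yates form of PermuteSet); Theta(n) RandInt calls."""
    n = len(a)
    for i in range(n - 1):
        j = random.randint(i, n - 1)  # the slide's variant uses randint(i + 1, n - 1)
        a[i], a[j] = a[j], a[i]
    return a

# permute_set(list(range(1, 9))) yields some rearrangement of 1..8
```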

Generating All Permutations
Suppose we want to generate all permutations of the integers 1 through n. We can use decrease-and-conquer: given the permutations of the first n-1 integers, we generate the permutations of n integers by inserting n into each of the n possible positions of each smaller permutation. We insert right-to-left, then switch direction each time we move on to a new smaller permutation:
  Start:    1
  Insert 2: 12 21                       (right to left)
  Insert 3: 123 132 312 | 321 231 213   (right to left, then left to right)
This satisfies the minimal-change requirement: each permutation can be obtained from its predecessor by exchanging just two elements.
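The bottom-up insertion scheme can be sketched directly (a minimal Python version; the function name is illustrative):

```python
def all_permutations(n):
    """Generate all permutations of 1..n by inserting k into each smaller permutation,
    alternating insertion direction to preserve the minimal-change property."""
    perms = [[1]]
    for k in range(2, n + 1):
        nxt = []
        for idx, p in enumerate(perms):
            # Even-indexed smaller permutations: insert right-to-left; odd: left-to-right.
            if idx % 2 == 0:
                positions = range(len(p), -1, -1)
            else:
                positions = range(len(p) + 1)
            for pos in positions:
                nxt.append(p[:pos] + [k] + p[pos:])
        perms = nxt
    return perms
```

For n = 3 this reproduces the slide's order: 123, 132, 312, 321, 231, 213.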

Generating the Permutations of n Elements Directly
We can get the same ordering of the permutations of n elements without generating the smaller permutations. We associate a direction (an arrow) with each element in the permutation. A mobile element is one whose arrow points to a smaller adjacent value; e.g., with suitable arrow directions on the permutation 3 2 4 1, elements 3 and 4 are mobile but 1 and 2 are not.
JohnsonTrotter(n)
  Initialize the first permutation as 1 2 ... n, with all arrows pointing left
  while there exists a mobile element do
    Find the largest mobile element k
    Swap k and the adjacent element its arrow points to
    Reverse the direction of all elements larger than k
For n = 3: 123, 132, 312, 321, 231, 213
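A direct Python transcription of JohnsonTrotter (a sketch; directions are stored as -1/+1 index offsets, with all arrows initially pointing left):

```python
def johnson_trotter(n):
    """Generate all permutations of 1..n in minimal-change (Johnson-Trotter) order."""
    perm = list(range(1, n + 1))
    dirs = [-1] * n  # -1: arrow points left, +1: arrow points right
    result = [perm[:]]

    def largest_mobile():
        """Index of the largest mobile element, or -1 if none exists."""
        best = -1
        for i in range(n):
            j = i + dirs[i]
            if 0 <= j < n and perm[j] < perm[i]:  # i is mobile
                if best == -1 or perm[i] > perm[best]:
                    best = i
        return best

    while True:
        i = largest_mobile()
        if i == -1:
            break
        j = i + dirs[i]
        perm[i], perm[j] = perm[j], perm[i]   # swap k with its neighbor
        dirs[i], dirs[j] = dirs[j], dirs[i]   # the arrows travel with the elements
        k = perm[j]
        for t in range(n):                    # reverse arrows of all elements > k
            if perm[t] > k:
                dirs[t] = -dirs[t]
        result.append(perm[:])
    return result
```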

Generating Subsets
Given some universal set U = {a_1, a_2, ..., a_n}, generate all possible subsets. The set of all subsets is called the power set; there are 2^n of them.
  n = 0: {}
  n = 1: {} {a_1}
  n = 2: {} {a_1} {a_2} {a_1, a_2}
  n = 3: {} {a_1} {a_2} {a_3} {a_1, a_2} {a_1, a_3} {a_2, a_3} {a_1, a_2, a_3}
We can represent a subset as a binary string:
  000 -> {}   001 -> {a_3}   010 -> {a_2}   011 -> {a_2, a_3}   100 -> {a_1}   101 -> {a_1, a_3}   ...
Is there an equivalent of the minimal-change ordering here?
Yes: 000 001 011 010 110 111 101 100 (a Gray code)
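The binary-reflected Gray code of k is k XOR (k >> 1), which gives a one-line route to subsets in minimal-change order (a sketch; the generator name is illustrative):

```python
def gray_subsets(items):
    """Yield all subsets of items in Gray-code (minimal-change) order:
    successive subsets differ by adding or removing exactly one element."""
    n = len(items)
    for k in range(2 ** n):
        g = k ^ (k >> 1)  # binary-reflected Gray code of k
        # Bit n-1-i of g selects items[i], matching the slide's string encoding.
        yield [items[i] for i in range(n) if (g >> (n - 1 - i)) & 1]
```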

Fake-Coin Problem
You are given n coins, one of which is fake (you don't know which, but the fake is lighter). You are provided a balance scale to compare sets of coins. What is an efficient method for identifying the fake coin?
  1 Hold one (or two) coins aside
  2 Divide the remainder into two equal halves
  3 If they balance, the fake has been set aside
  4 Otherwise examine the lighter pile in the same way
W(n) = W(⌊n/2⌋) + 1 for n > 1, W(1) = 0, so W(n) ∈ Θ(lg n)
But wait! This is not the most efficient way. What if you divided the coins into three equal piles?
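The three-pile variant suggested at the end can be sketched as follows (assuming, as above, that the fake coin is lighter; the coin weights in the test are illustrative). With three piles, each weighing eliminates two-thirds of the candidates, giving Θ(log_3 n) weighings:

```python
def find_fake(weights):
    """Locate the lighter fake coin with a balance scale using three near-equal piles.
    Returns (index_of_fake, number_of_weighings)."""
    idx = list(range(len(weights)))
    weighings = 0
    while len(idx) > 1:
        third = (len(idx) + 2) // 3  # ceil(len/3), so both pans get `third` coins
        a, b, rest = idx[:third], idx[third:2 * third], idx[2 * third:]
        weighings += 1
        wa = sum(weights[i] for i in a)
        wb = sum(weights[i] for i in b)
        if wa < wb:
            idx = a          # lighter pan holds the fake
        elif wb < wa:
            idx = b
        else:
            idx = rest       # pans balance: the fake was set aside
    return idx[0], weighings
```

For n = 9 this needs 2 weighings, versus 3 for the halving scheme.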

Multiplication à la Russe
We want to compute the product of two positive integers n and m, but we only know how to add, and to multiply and divide by two.
If n is even, we can rewrite: n * m = (n/2) * (2m)
If n is odd, we can rewrite:  n * m = ((n-1)/2) * (2m) + m
We apply this rule repeatedly until n = 1. Example (50 * 65):
  n    m      added
  50   65
  25   130    130
  12   260
  6    520
  3    1040   1040
  1    2080   2080
  Result: 130 + 1040 + 2080 = 3250
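The two rewrite rules collapse into a short loop (a sketch; only addition, doubling, and halving are used, as the slide requires):

```python
def russe(n, m):
    """Multiply positive integers n and m using only add, double, and halve."""
    total = 0
    while n >= 1:
        if n % 2 == 1:   # odd step: the (n-1)/2 rewrite contributes an extra m
            total += m
        n //= 2          # halve n (floor handles both the even and odd cases)
        m *= 2           # double m
    return total

# russe(50, 65) accumulates 130 + 1040 + 2080, matching the table above
```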

Median and Selection
The selection problem: find the k-th smallest element in a list of n numbers (the k-th order statistic). Finding the median is a special case: k = n/2.
Brute force: sort, then select the k-th value in the list: O(n lg n). But we can do better, using the partitioning logic from QuickSort. Partition the list around a pivot p so that it looks like
  a_1 ... a_{s-1}  p  a_{s+1} ... a_n
where everything left of p is no larger than p, everything right of p is no smaller, and s is the pivot's final position.
  If s = k, then p solves the problem.
  If s > k, then the k-th smallest element of the whole list is the k-th smallest element of the left sublist.
  If s < k, then the k-th smallest element of the whole list is the (k - s)-th smallest element of the right sublist.
Average case: C(n) = C(n/2) + (n - 1), so C(n) ∈ Θ(n)
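A QuickSelect sketch built on this case analysis (assumptions: Lomuto-style partitioning with the last element as pivot, and k given 1-indexed; the slides do not fix either choice):

```python
def quickselect(items, k):
    """Return the k-th smallest element (1-indexed) using QuickSort partitioning."""
    a = list(items)
    lo, hi = 0, len(a) - 1
    while True:
        p = a[hi]                      # pivot: last element of the active sublist
        i = lo
        for j in range(lo, hi):        # move elements smaller than p to the front
            if a[j] < p:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]      # place the pivot at its final position
        s = i + 1                      # pivot's 1-based rank within the whole list
        if s == k:
            return a[i]
        elif s > k:
            hi = i - 1                 # recurse into the left sublist
        else:
            lo = i + 1                 # the right sublist; ranks there still count from 1
```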

Interpolation Search
Like binary search, but more like how we search a telephone book: rather than splitting the list in half, we interpolate the probe position from the search-key value.
  v := search key value, l := left index, r := right index
Treating the values between A[l] and A[r] as if they lay on the straight line y = mx + b through (l, A[l]) and (r, A[r]), and solving for the index of v, gives the probe position
  x = l + ⌊ (v - A[l]) (r - l) / (A[r] - A[l]) ⌋
Average case: O(lg lg n)
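The probe formula drops straight into a search loop (a sketch; assumes a sorted list, and guards the division when A[l] = A[r], a case the slide's formula leaves implicit):

```python
def interpolation_search(a, v):
    """Return an index of v in sorted list a, or -1 if absent."""
    l, r = 0, len(a) - 1
    while l <= r and a[l] <= v <= a[r]:
        if a[l] == a[r]:
            return l  # all remaining values equal a[l], and a[l] <= v <= a[r] forces v == a[l]
        # Interpolate the probe position from the key value (the slide's formula).
        x = l + (v - a[l]) * (r - l) // (a[r] - a[l])
        if a[x] == v:
            return x
        elif a[x] < v:
            l = x + 1
        else:
            r = x - 1
    return -1
```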

Introduction to Transform & Conquer
In many cases, one can transform a problem instance and solve the transformed problem instead. Three variations of this idea:
instance simplification: transform the instance into a simpler or more convenient instance of the same problem
representation change: transform the instance into a different representation of the same instance
problem reduction: transform the instance into an instance of a different problem

Presorting
Many questions about lists can be answered more easily when the list is already sorted, though the cost of the sort itself must be warranted.
Example: element uniqueness
  Brute force: compare every element against every other element: Θ(n^2)
  Presort & scan: T(n) = Tsort(n) + Tscan(n) = Θ(n lg n) + Θ(n) ∈ Θ(n lg n)
Example: search
  Brute force: linear search: Θ(n)
  Presort & binary search: Θ(n lg n) + Θ(lg n) ∈ Θ(n lg n)
  Presorting does not help with a single search, though it may pay off over many searches of the same list.
Example: computing a mode (the most frequent value)
  Brute force: scan the list and store counts in an auxiliary list, then scan the auxiliary list for the highest frequency: worst case Θ(n^2)
  Presort & longest run: sort, then scan the list looking for the longest run of a value: Θ(n lg n)
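The presort-and-longest-run mode computation, as a sketch (assumes a nonempty list; ties go to whichever value's run is seen first):

```python
def mode(a):
    """Most frequent value, via presort (Theta(n lg n)) + longest-run scan (Theta(n))."""
    b = sorted(a)                  # equal values are now adjacent
    best_val, best_run = b[0], 1
    run = 1
    for i in range(1, len(b)):
        run = run + 1 if b[i] == b[i - 1] else 1
        if run > best_run:
            best_val, best_run = b[i], run
    return best_val
```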

Gaussian Elimination
Problem: solve a system of n linear equations in n unknowns:
  a_11 x_1 + a_12 x_2 + ... + a_1n x_n = b_1
  a_21 x_1 + a_22 x_2 + ... + a_2n x_n = b_2
  ...
  a_n1 x_1 + a_n2 x_2 + ... + a_nn x_n = b_n
This can be written as A x = b. Gaussian elimination first asks us to transform the problem into a different one that has the same solution: A' x = b'. The transformation yields a matrix with all zeros below its main diagonal:
  A' = [ a'_11  a'_12  ...  a'_1n ]        b' = [ b'_1 ]
       [ 0      a'_22  ...  a'_2n ]             [ b'_2 ]
       [ ...                      ]             [ ...  ]
       [ 0      0      ...  a'_nn ]             [ b'_n ]
A simple backward-substitution method can now be used to obtain the solution!

Gaussian Elimination: Obtaining an Upper-Triangular Coefficient Matrix
Solutions to the system are invariant under three elementary operations:
  Exchange two equations of the system.
  Replace an equation with a nonzero multiple of itself.
  Replace an equation with the sum or difference of that equation and some multiple of another.
Consider the following example:
  2x_1 - x_2 + x_3 = 1        [ 2  -1   1 |  1 ]
  4x_1 + x_2 - x_3 = 5        [ 4   1  -1 |  5 ]
   x_1 + x_2 + x_3 = 0        [ 1   1   1 |  0 ]
Apply row2 <- row2 - (4/2) row1 and row3 <- row3 - (1/2) row1:
  [ 2  -1    1   |  1   ]
  [ 0   3   -3   |  3   ]
  [ 0   3/2  1/2 | -1/2 ]
Then row3 <- row3 - (1/2) row2:
  [ 2  -1   1 |  1 ]
  [ 0   3  -3 |  3 ]
  [ 0   0   2 | -2 ]
Upper-triangular form!
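The elimination-plus-back-substitution pipeline, as a minimal sketch (no pivot exchanges, so it assumes every pivot it meets is nonzero, as in the example above):

```python
def gauss_solve(A, b):
    """Solve A x = b by forward elimination and backward substitution."""
    n = len(A)
    A = [row[:] for row in A]            # work on copies
    b = b[:]
    for i in range(n):                   # eliminate below the i-th pivot
        for j in range(i + 1, n):
            f = A[j][i] / A[i][i]        # the row multiple, as on the slide
            for k in range(i, n):
                A[j][k] -= f * A[i][k]
            b[j] -= f * b[i]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):       # backward substitution
        s = b[i] - sum(A[i][k] * x[k] for k in range(i + 1, n))
        x[i] = s / A[i][i]
    return x
```

On the example system it recovers x_1 = 1, x_2 = 0, x_3 = -1.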

LU Decomposition
If we track the row multiples used during Gaussian elimination, we can construct a lower-triangular matrix L (with ones on the diagonal). We can also keep the upper-triangular matrix U produced by Gaussian elimination, leaving off the b vector:
  L = [ 1    0    0 ]        U = [ 2  -1   1 ]
      [ 2    1    0 ]            [ 0   3  -3 ]
      [ 1/2  1/2  1 ]            [ 0   0   2 ]
NOTE: We can actually store L and U in the same matrix to save space.
It turns out that A = LU, so we can rewrite our original system as LU x = b. We can split this into two triangular systems and solve each by substitution: first L y = b (forward substitution), then U x = y (back substitution).
Advantage: we can solve many systems with different b vectors in the same way, with minimal additional effort.
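The two-triangular-systems step can be sketched as follows (assumes L has a unit diagonal, as constructed above; the L and U in the test are the slide's example):

```python
def lu_solve(L, U, b):
    """Solve L U x = b: forward substitution on L y = b, then back substitution on U x = y."""
    n = len(b)
    y = [0.0] * n
    for i in range(n):                       # L y = b, top to bottom (unit diagonal)
        y[i] = b[i] - sum(L[i][k] * y[k] for k in range(i))
    x = [0.0] * n
    for i in range(n - 1, -1, -1):           # U x = y, bottom to top
        x[i] = (y[i] - sum(U[i][k] * x[k] for k in range(i + 1, n))) / U[i][i]
    return x
```

A new b vector costs only these two Θ(n^2) substitutions; the Θ(n^3) elimination is done once.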

Matrix Inversion
The inverse of a matrix A, denoted A^{-1}, is defined by A A^{-1} = I, where I is the identity matrix:
  I = [ 1 0 ... 0 ]        A^{-1} = [ x_11 x_12 ... x_1n ]
      [ 0 1 ... 0 ]                 [ x_21 x_22 ... x_2n ]
      [ ...       ]                 [ ...                ]
      [ 0 0 ... 1 ]                 [ x_n1 x_n2 ... x_nn ]
But this can be written as a series of systems of linear equations, A x_j = e_j, where:
  x_j is the j-th column of the inverse matrix
  e_j is the j-th column of the identity matrix
We can compute the LU decomposition of A once, then systematically attempt to solve for each column of the inverse. If we compute a U with a zero on the diagonal, there is no inverse and A is said to be singular.
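The column-by-column idea in code (a self-contained sketch; it re-runs plain Gaussian elimination per column rather than reusing one LU factorization, and assumes nonzero pivots):

```python
def solve(A, b):
    """Gaussian elimination + back substitution (assumes nonzero pivots)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix [A | b]
    for i in range(n):
        for j in range(i + 1, n):
            f = M[j][i] / M[i][i]
            for k in range(i, n + 1):
                M[j][k] -= f * M[i][k]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][k] * x[k] for k in range(i + 1, n))) / M[i][i]
    return x


def invert(A):
    """Invert A by solving A x_j = e_j for each column e_j of the identity."""
    n = len(A)
    cols = [solve(A, [float(i == j) for i in range(n)]) for j in range(n)]
    # cols[j] is the j-th column of the inverse; transpose into row-major form.
    return [[cols[j][i] for j in range(n)] for i in range(n)]
```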

Book Topics Skipped in Lecture
In section 5.5: Josephus Problem (pp. 182-184)
In section 5.6: Search and Insertion in a Binary Search Tree (pp. 188-189)
In section 6.2: The GaussElimination and BetterGaussElimination algorithms in detail (pp. 202-203); Computing a Determinant (pp. 206-207)

Assignments
This week's assignments:
  Section 5.3: Problems 1, 2, & 5
  Section 5.4: Problems 1, 2, & 5
  Section 5.5: Problems 2 & 4
  Section 5.6: Problems 2 & 6
  Section 6.1: Problems 1, 5, & 6
  Section 6.2: Problems 1, 2, & 7

Project II: Balanced Trees
See the project description at http://www.cs.gmu.edu/~pwiegand/cs483/assignments.htm
The project will be due by midnight, April 7.