
Q1) Given an array A that stores only 0s and 1s, such that every entry containing 0 appears before all entries containing 1 (i.e., the array looks like {0, 0, 0, ..., 0, 0, 1, 1, ..., 1}), design an algorithm that finds the smallest index i such that A[i] = 1 using c log n instructions in the worst case, for some positive constant c.

S1) This is another form of the peak-finder problem that we discussed in class; we exploit the idea used in binary search. The function below implements the binary-search method.
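The function is a reconstruction, since the transcript does not reproduce it: a minimal Python sketch of the binary-search idea, assuming 0-based indexing; the name first_one and the -1 sentinel (returned when the array contains no 1) are our own choices, not part of the original solution.

import math  # only used in the comment below to relate iterations to log n

def first_one(A):
    # A is a 0/1 array in which all 0s come before all 1s.
    lo, hi = 0, len(A) - 1
    answer = -1
    while lo <= hi:
        mid = (lo + hi) // 2
        if A[mid] == 1:
            answer = mid      # mid is a candidate; keep looking for an earlier 1
            hi = mid - 1
        else:
            lo = mid + 1      # everything up to mid is 0; look to the right
    return answer

# Example: first_one([0, 0, 0, 1, 1]) returns 3.
# Each iteration halves the search range, so at most about math.log2(n) + 1
# iterations run, which gives the required c log n worst-case bound.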

Q2) Theorem: All horses are the same color.
Proof by induction.
Base case: n = 1. There is only one horse in the set, and it must consequently have the same color as itself.
Inductive step: Assume, for some n >= 1, that all horses in a set of size n have the same color.
Final step: Suppose we have a set of n + 1 horses: {h_1, h_2, ..., h_n, h_(n+1)}. We can break this up into two sets, {h_1, h_2, ..., h_n} and {h_2, ..., h_n, h_(n+1)}. Each of these is a set of size n, and thus, by assumption, all horses in each set have the same color. As there is an overlap between the two sets, it follows that all the horses in both sets must be the same color, and thus all the horses in {h_1, h_2, ..., h_n, h_(n+1)} have the same color.

Choose the correct option:
A) I always knew that Math is unreliable
B) Induction is useful for numbers
C) Induction doesn't always work
D) All the other options are wrong.
The solution given is wrong because

S2) Final step: suppose we have a set of n + 1 horses: {h_1, h_2, ..., h_n, h_(n+1)}. We can break this up into two sets, {h_1, h_2, ..., h_n} and {h_2, ..., h_n, h_(n+1)}. Each of these is a set of size n, and thus, by assumption, all horses in each set have the same color. As there is an overlap between the two sets, it follows that all the horses in both sets must be the same color, and thus all the horses in {h_1, h_2, ..., h_n, h_(n+1)} have the same color.
The underlined part of this final step (the overlap argument) indicates that there is something wrong with this solution: intuitively, you should realize that the highlighted reasoning is not correct. For example, what if {h_1, ..., h_n} are all white horses and h_(n+1) is a black horse?
Base case: n = 1, then there is only one horse in the set, and it must consequently have the same color as itself. Inductive step: let us assume for n >= 1 that all horses in a set of size n have the same color. This inductive step is not justified starting from this base case: for n = 1 the two sets are {h_1} and {h_2}, which do not overlap at all, so the overlap argument fails and the step from 1 horse to 2 horses is never established. Therefore the inductive step is wrong, and therefore the final step is wrong.

In short, the proof fails at its foundation: the very first application of the inductive step, from the base case n = 1 to n = 2, is invalid because the two subsets share no horse. You can "prove" amazing things when the foundation of an induction is broken.
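To make the gap explicit, here is the overlap condition the final step relies on, written out (our notation, not part of the original answer):

\[
  \{h_1, \dots, h_n\} \cap \{h_2, \dots, h_n, h_{n+1}\} = \{h_2, \dots, h_n\},
\]
which is nonempty only when \(n \ge 2\). For \(n = 1\) the two sets are \(\{h_1\}\) and \(\{h_2\}\), which are disjoint, so nothing forces \(h_1\) and \(h_2\) to share a color.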

Q3) Three sorting algorithms are listed below with a basic explanation and pseudocode; read them and answer the questions that follow.

Insertion Sort
Insertion sort iterates, consuming one input element on each repetition and growing a sorted output list. On each iteration, insertion sort removes one element from the input data, finds the location where it belongs within the sorted list, and inserts it there. It repeats until no input elements remain.
Pseudocode:
for i = 1 to length(A) - 1
    j = i
    while j > 0 and A[j-1] > A[j]
        swap A[j] and A[j-1]
        j = j - 1
    end while
end for

Bogosort
If bogosort were used to sort a deck of cards, it would consist of checking whether the deck is in order and, if it is not, throwing the deck into the air, picking the cards up at random, and repeating the process until the deck is sorted.
Pseudocode:
while not isInOrder(deck):
    shuffle(deck)
Note: deck is your unordered array containing n elements.

Merge Sort
Conceptually, a merge sort works as follows:
1. Divide the unsorted list into n sublists, each containing one element (a list of one element is considered sorted).
2. Repeatedly merge sublists to produce new sorted sublists until only one sublist remains. This will be the sorted list.

Pseudocode (a is an array containing n elements):

func mergesort( var a as array )
    if ( n == 1 ) return a
    var l1 as array = a[0] ... a[n/2]
    var l2 as array = a[n/2 + 1] ... a[n-1]
    l1 = mergesort( l1 )
    l2 = mergesort( l2 )
    return merge( l1, l2 )
end func

func merge( var a as array, var b as array )
    var c as array
    while ( a and b have elements )
        if ( a[0] > b[0] )
            add b[0] to the end of c
            remove b[0] from b
        else
            add a[0] to the end of c
            remove a[0] from a
        end if
    end while
    while ( a has elements )
        add a[0] to the end of c
        remove a[0] from a
    end while
    while ( b has elements )
        add b[0] to the end of c
        remove b[0] from b
    end while
    return c
end func

Question:

What are the worst-case time complexities of these algorithms, expressed in big-omega form? For bogosort, give the average case. For n = 2, n = 5, and n = 100, rank these three algorithms from fastest to slowest.

S3) Answer: insertion: O(n^2), bogosort (average case): O((n+1)!), mergesort: O(n*log(n)). (These bounds are stated in O-notation; since each is tight, the same expressions also serve as the requested Omega bounds.)
n = 2: merge < bogosort < insertion
n = 5: merge < insertion < bogosort
n = 100: merge < insertion < bogosort
(A runnable Python sketch of the three algorithms is given below for reference.)
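Here is a minimal runnable Python sketch of the three algorithms, following the pseudocode above; the function and helper names (insertion_sort, is_in_order, bogosort, merge_sort) are our own choices, not part of the original question.

import random

def insertion_sort(a):
    for i in range(1, len(a)):
        j = i
        while j > 0 and a[j - 1] > a[j]:
            a[j - 1], a[j] = a[j], a[j - 1]   # swap the adjacent out-of-order pair
            j -= 1
    return a

def is_in_order(deck):
    return all(deck[i] <= deck[i + 1] for i in range(len(deck) - 1))

def bogosort(deck):
    while not is_in_order(deck):
        random.shuffle(deck)                  # reshuffle and hope
    return deck

def merge_sort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    merged, i, j = [], 0, 0                   # merge the two sorted halves
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

# Example: each call below returns [1, 2, 3, 4].
# insertion_sort([3, 1, 4, 2]); bogosort([3, 1, 4, 2]); merge_sort([3, 1, 4, 2])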

Q4) Graphs and Centrality. (Figure not reproduced: an example graph with a numeric label on each node.) Centrality is an indicator of how connected a node in a graph is. One definition of a node's centrality is the largest number of edges that need to be crossed from the node to reach any other node in the graph, using the shortest path to each node. A lower value means the node is more central. Create the procedure (or pseudocode) for calculating the centrality of a node in a graph.

S4)
1. Create a set of visited nodes and a set of adjacent (not yet visited) nodes, and an integer value x.
2. Assign all nodes a temporary value (INT_MAX or a similarly large value).
3. Put the start node in the set of adjacent nodes with a new value of 0.
4. Loop while there are nodes in the adjacent set:
   a. Pick a node from the adjacent set.
   b. Assign a variable n to be the value of the node.
   c. If n is greater than x, set x to be n.
   d. Remove the node from the adjacent set and add it to the visited set.
   e. For each node adjacent to this node:
      i. If the node is in the visited set with a value less than n+1, go to the next node.
      ii. If the node is in the visited set with a value greater than n+1, remove the node from the visited set.
      iii. Add the node to the adjacent set with a value of n+1.
5. The start node's centrality value is x.
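A minimal Python sketch of this procedure, assuming the graph is given as an adjacency-list dictionary {node: [neighbours]} and all edges have unit weight (so the "adjacent" set behaves like a breadth-first-search queue); the name centrality and this representation are our own choices, not part of the original answer.

from collections import deque

def centrality(graph, start):
    dist = {start: 0}           # visited nodes and their distances from start
    queue = deque([start])      # the set of adjacent (reached but unprocessed) nodes
    x = 0                       # largest distance seen so far
    while queue:
        node = queue.popleft()
        n = dist[node]
        x = max(x, n)           # step 4c: keep the largest value
        for nbr in graph[node]:
            if nbr not in dist: # not yet reached; assign it the value n+1
                dist[nbr] = n + 1
                queue.append(nbr)
    return x

# Example: for the path graph a - b - c,
# graph = {'a': ['b'], 'b': ['a', 'c'], 'c': ['b']}
# centrality(graph, 'a') is 2 and centrality(graph, 'b') is 1,
# so b is the more central node.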

Q5) A binary tree is a data structure that facilitates searching and inserting into a data set. A common problem that programmers encounter is a poorly balanced binary tree, as shown in Fig 5.1. To fully understand the importance of balancing a binary tree, answer the following questions:
a) What is the worst-case performance of finding an element in a balanced binary tree and in an extremely unbalanced binary tree*? (in big-O notation)
b) What data structure does an extremely unbalanced binary tree* resemble?
*An extremely unbalanced tree means that every data point is always inserted to either the left or the right of the data point before it.
Fig 5.1 (image not reproduced): balanced vs. unbalanced binary trees

S5) a) O(log n) vs. O(n)
b) a singly linked list
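A small illustration (our own, not part of the original answer) of why an extremely unbalanced binary search tree resembles a singly linked list: inserting keys that are already sorted leaves every node's left child empty, so a search may have to walk through all n nodes, giving O(n) instead of O(log n).

class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def insert(root, key):
    # Standard (unbalanced) binary-search-tree insertion.
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

root = None
for k in range(1, 8):           # insert 1..7 in already-sorted order
    root = insert(root, k)

# The resulting tree is a single right-leaning chain 1 -> 2 -> ... -> 7,
# so searching for 7 must visit all 7 nodes, exactly like traversing a
# singly linked list.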