Week 12 - Friday

What did we talk about last time? We finished hunters and prey, covered class variables, constants, and class constants, and started Big Oh notation.

Here is some code that sorts an array in ascending order. What is its running time?

for( int i = 0; i < array.length; i++ )
    for( int j = 0; j < array.length - 1; j++ )
        if( array[j] > array[j + 1] ) {
            int temp = array[j];
            array[j] = array[j + 1];
            array[j + 1] = temp;
        }

Running time: O(n^2), because each of the two nested loops runs about n times.

Here is a table of several different complexity measures, in ascending order, with their functions evaluated at n = 100:

Description     Big Oh        f(100)
Constant        O(1)          1
Logarithmic     O(log n)      6.64
Linear          O(n)          100
Linearithmic    O(n log n)    664.39
Quadratic       O(n^2)        10,000
Cubic           O(n^3)        1,000,000
Exponential     O(2^n)        1.27 x 10^30
Factorial       O(n!)         9.33 x 10^157
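If you want to check those numbers yourself, here is a small sketch that evaluates each growth function at n = 100 (the class name GrowthTable and the factorial() helper are made up for illustration):

import java.math.BigInteger;

public class GrowthTable {
    public static void main( String[] args ) {
        int n = 100;
        double log2n = Math.log(n) / Math.log(2);                // log base 2 of n
        System.out.println( "O(1):       " + 1 );
        System.out.println( "O(log n):   " + log2n );            // about 6.64
        System.out.println( "O(n):       " + n );                // 100
        System.out.println( "O(n log n): " + n * log2n );        // about 664.39
        System.out.println( "O(n^2):     " + (long) n * n );     // 10,000
        System.out.println( "O(n^3):     " + (long) n * n * n ); // 1,000,000
        System.out.println( "O(2^n):     " + BigInteger.valueOf(2).pow(n) ); // about 1.27 x 10^30
        System.out.println( "O(n!):      " + factorial(n) );     // about 9.33 x 10^157
    }

    private static BigInteger factorial( int n ) {
        BigInteger result = BigInteger.ONE;
        for( int i = 2; i <= n; i++ )
            result = result.multiply( BigInteger.valueOf(i) );
        return result;
    }
}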

Computers get faster, but not in unlimited ways. If computers get 10 times faster, here is how much a problem from each class could grow and still be solvable:

Description     Big Oh        Increase in size
Constant        O(1)          Unlimited
Logarithmic     O(log n)      1000
Linear          O(n)          10
Linearithmic    O(n log n)    10
Quadratic       O(n^2)        3-4
Cubic           O(n^3)        2-3
Exponential     O(2^n)        Hardly changes
Factorial       O(n!)         Hardly changes

There is nothing better than constant time. Logarithmic time means that the problem can become much larger and only take a little longer. Linear time means that the time grows proportionally with the problem size. Linearithmic time is just a little worse than linear. Quadratic time means that expanding the problem size significantly could make it impractical. Cubic time is about the reasonable maximum if we expect the problem to grow. Exponential and factorial time mean that we cannot solve anything but the most trivial problem instances.

Memory usage can be a problem. If you run out of memory, your program can crash. Memory usage can have serious performance consequences too.

Remember, there are multiple levels of memory on a computer, and each level is on the order of 500 times larger and 500 times slower than the one above it:

Cache: actually on the CPU; fast and expensive
RAM: primary memory for a desktop computer; pretty fast and relatively expensive
Hard drive: secondary memory for a desktop computer; slow and cheap

If you can do a lot of number crunching without leaving cache, that will be very fast. If you have to fetch data from RAM, that will slow things down. If you have to read and write data to the hard drive (unavoidable with large pieces of data like digital movies), you will slow things down a lot.

Memory can be easier to estimate than running time. Depending on your input, you will allocate a certain number of objects, arrays, and primitive data types. It is possible to count the storage for each item allocated. Remember that a reference to an object or an array costs an additional 4 bytes on top of the size of the object.

Here are the sizes of various types in Java, where N is the number of elements in the array or String:

Type                Bytes
boolean             1
char                2
int                 4
double              8
boolean[]           16 + N
char[]              16 + 2N
int[]               16 + 4N
double[]            16 + 8N
reference           4
String              40 + 2N
object              8 + size of members
array of objects    16 + (4 + size of members) x N
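As a quick illustration of how you could use that table, here is a sketch with hypothetical helper methods that just encode the formulas above (these are the course's simplified estimates, not exact numbers for every JVM):

public class MemoryEstimate {
    // Estimated sizes taken straight from the table above
    static long intArrayBytes( int n )  { return 16 + 4L * n; } // int[] with n elements
    static long charArrayBytes( int n ) { return 16 + 2L * n; } // char[] with n elements
    static long stringBytes( int n )    { return 40 + 2L * n; } // String with n characters

    public static void main( String[] args ) {
        System.out.println( intArrayBytes(1000) );  // 16 + 4*1000 = 4016 bytes
        System.out.println( stringBytes(20) );      // 40 + 2*20 = 80 bytes
        // Holding a reference to either one costs an additional 4 bytes
    }
}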

Let's say that I give you a list of numbers, and I ask you, "Is 37 on this list?" As a human, you have no problem answering this question, as long as the list is reasonably short. What if the list is an array, and I want you to write a Java program to find some number?

Easy! We just look through every element in the array until we find it or run out:

public static int find( int[] array, int number ) {
    for( int i = 0; i < array.length; i++ )
        if( array[i] == number )
            return i;
    return -1;
}

If we find it, we return the index; otherwise, we return -1.

Unfortunately for you, we know about Big Oh notation, so now we have some way to measure how long this algorithm takes. How long, if n is the length of the array? O(n) time, because we have to look through every element in the array in the worst case.

Is there any way to go smaller than O(n)? What complexity classes even exist that are smaller than O(n)? Only O(1) and O(log n). Well, on average, we only need to check half the numbers, but that's ½ n, which is still O(n). Darn.

We can do better with more information. For example, if the list is sorted, then we can use that information somehow. How? We can play a High-Low game.

Repeatedly divide the search space in half. Say we're looking for 37 in a sorted array such as 23 31 37 54: check the middle element; if it's too low, throw away the lower half; if it's too high, throw away the upper half; keep checking the middle of what's left until we find it!

How long can it take? What if you never find what you're looking for? Well, then, you've narrowed it down to a single spot in the array that doesn't have what you want. And what's the maximum amount of time that could have taken?

We cut the search space in half every time. At worst, we keep cutting n in half until we get 1. Let's say x is the number of times we look: n / 2^x = 1, so n = 2^x, so x = log₂ n. The running time is O(log n).
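Here is a minimal sketch of that idea in Java, written in the same style as the find() method above (the name binaryFind is just for illustration); it assumes the array is sorted in ascending order:

public static int binaryFind( int[] array, int number ) {
    int low = 0;
    int high = array.length - 1;
    while( low <= high ) {
        int mid = (low + high) / 2;      // check the middle
        if( array[mid] == number )
            return mid;                  // found it!
        else if( array[mid] < number )
            low = mid + 1;               // too low: keep only the upper half
        else
            high = mid - 1;              // too high: keep only the lower half
    }
    return -1;                           // nowhere left to look
}

Each pass through the loop throws away half of what's left, so it runs at most about log₂ n times.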

We can apply this idea to a guessing game. First we tell the computer that we are going to pick a number between 1 and n. We pick, and it tries to narrow down the number. It should only take log n tries. Remember, log₂(1,000,000) is only about 20.
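A sketch of how the computer's side of that game could be written (the prompts, the class name GuessingGame, and the choice of n = 1,000,000 are all made up for illustration):

import java.util.Scanner;

public class GuessingGame {
    public static void main( String[] args ) {
        Scanner in = new Scanner( System.in );
        int low = 1;
        int high = 1000000;              // n = 1,000,000 takes only about 20 guesses
        while( low <= high ) {
            int guess = (low + high) / 2;
            System.out.print( "Is it " + guess + "? (h = too high, l = too low, y = yes): " );
            String answer = in.next();
            if( answer.equals( "y" ) ) {
                System.out.println( "Got it!" );
                return;
            }
            else if( answer.equals( "h" ) )
                high = guess - 1;        // our guess was too high
            else
                low = guess + 1;         // our guess was too low
        }
        System.out.println( "Hey, you must have changed your number!" );
    }
}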

This is a classic interview question asked by Microsoft, Amazon, and similar companies. Imagine that you have 9 red balls. One of them is just slightly heavier than the others, but so slightly that you can't feel it. You have a very accurate two-pan balance you can use to compare balls. Find the heaviest ball in the smallest number of weighings.

It's got to be 8 or fewer: we could easily test one ball against every other ball. There must be some cleverer way to divide them up, something that is related somehow to binary search.

We can divide the balls in half each time (4 against 4, leaving one out). If those all balance, the heavy ball must be the one we left out to begin with.

How? The key is that you can actually cut the number of balls into three parts each time. We weigh 3 against 3; if they balance, then we know the 3 left out have the heavy ball. When it's down to 3, weigh 1 against 1, again knowing that it's the one left out that's heavy if they balance.

The cool thing is... Yes, this is cool in the CS sense, not in the real sense. Anyway, the cool thing is that we are trisecting the search space each time. This means that it takes log₃ n weighings to find the heaviest ball. We could do 27 balls in 3 weighings, 81 balls in 4 weighings, etc.
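Here is a sketch of that two-weighing strategy for the 9-ball case; the weigh() helper stands in for the two-pan balance, and the array of hidden weights is made up for illustration:

// weights[i] is the (hidden) weight of ball i; exactly one ball is heavier
public static int findHeavyOfNine( int[] weights ) {
    // First weighing: balls 0-2 against balls 3-5 (balls 6-8 stay off the balance)
    int group;
    int first = weigh( weights, new int[]{ 0, 1, 2 }, new int[]{ 3, 4, 5 } );
    if( first > 0 )      group = 0;      // left pan heavier: heavy ball is 0, 1, or 2
    else if( first < 0 ) group = 3;      // right pan heavier: heavy ball is 3, 4, or 5
    else                 group = 6;      // balanced: heavy ball is 6, 7, or 8

    // Second weighing: two balls from that group against each other
    int second = weigh( weights, new int[]{ group }, new int[]{ group + 1 } );
    if( second > 0 )      return group;
    else if( second < 0 ) return group + 1;
    else                  return group + 2;   // balanced: it's the one left out
}

// Compares the total weight in each pan, like reading the two-pan balance
public static int weigh( int[] weights, int[] leftPan, int[] rightPan ) {
    int left = 0, right = 0;
    for( int i : leftPan )  left += weights[i];
    for( int i : rightPan ) right += weights[i];
    return Integer.compare( left, right );
}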

Sorting

Finish Project 4: due tonight before midnight!