Overview of Data Structures: 1. Array 2. Linked List 3. Stack 4. Queue

Overview of Data Structures

A data structure is a particular way of organizing data in a computer so that it can be used effectively. The idea is to reduce the space and time complexity of different tasks. Below is an overview of some popular linear data structures.

1. Array
2. Linked List
3. Stack
4. Queue

Array

An array is a data structure used to store homogeneous elements at contiguous memory locations. The size of an array must be specified before data is stored in it.

Application: Suppose we want to store the marks of all students in a class. We can use an array to store them. This reduces the number of variables needed, since we do not have to create a separate variable for the marks of every subject. All marks can be accessed by simply traversing the array.

Linked List

A linked list is a linear data structure (like an array) in which each element is a separate object. Each element (node) of a list consists of two items: the data and a reference to the next node.

Types of linked list:

1. Singly Linked List: Every node stores the address (reference) of the next node in the list, and the last node's next reference is NULL. For example: 1->2->3->4->NULL

2. Doubly Linked List: Two references are associated with each node; one points to the next node and one to the previous node. The advantage of this structure is that we can traverse the list in both directions, and deletion does not require explicit access to the previous node. For example: NULL<-1<->2<->3->NULL

3. Circular Linked List: All nodes are connected to form a circle, so there is no NULL at the end. A circular linked list can be singly or doubly linked. The advantage of this structure is that any node can be taken as the starting node, which is useful when implementing a circular queue with a linked list. For example: 1->2->3->1 (the next pointer of the last node points back to the first).

Application: Consider the previous example, where we stored students' marks in an array. If a new subject is added to the course, its marks must also be added to the array of marks. But the size of the array was fixed and it is already full, so no new element can be added. If we instead make the array much larger than the number of subjects, most of the array may remain empty. To reduce this space wastage, a linked list is used, which adds a node only when a new element is introduced. Insertions and deletions also become easier with a linked list.
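As a concrete illustration of the singly linked list described above, here is a minimal sketch in C. The struct layout and the names push_front and print_list are assumptions made for this example, not taken from the original text (memory freeing is omitted for brevity).

#include <stdio.h>
#include <stdlib.h>

/* A node of a singly linked list: the data and a reference to the next node. */
struct Node {
    int data;
    struct Node *next;
};

/* Insert a new node at the front of the list; returns the new head. */
struct Node *push_front(struct Node *head, int value) {
    struct Node *node = malloc(sizeof *node);
    node->data = value;
    node->next = head;          /* new node points to the old head */
    return node;
}

/* Traverse the list and print it in the form 1->2->3->NULL. */
void print_list(const struct Node *head) {
    for (const struct Node *p = head; p != NULL; p = p->next)
        printf("%d->", p->data);
    printf("NULL\n");
}

int main(void) {
    struct Node *head = NULL;
    /* Build the example list 1->2->3->4->NULL by pushing values in reverse order. */
    for (int v = 4; v >= 1; v--)
        head = push_front(head, v);
    print_list(head);           /* prints 1->2->3->4->NULL */
    return 0;
}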

Stack

A stack, or LIFO (last in, first out) structure, is an abstract data type that serves as a collection of elements with two principal operations: push, which adds an element to the collection, and pop, which removes the element that was added most recently. In a stack, both push and pop take place at the same end, the top of the stack. A stack can be implemented using either an array or a linked list.

Applications: Stacks are used for maintaining function calls (the last called function must finish execution first), and recursion can always be removed with the help of a stack. Stacks are also used to reverse a word, to check for balanced parentheses, and in editors, where the word typed last is the first to be removed by the undo operation. Similarly, stacks are used to implement the back functionality in web browsers.

Queue

A queue, or FIFO (first in, first out) structure, is an abstract data type that serves as a collection of elements with two principal operations: enqueue, which adds an element to the collection (the element is added at the rear), and dequeue, which removes the element that was added first (the element is removed from the front). A queue can be implemented using either an array or a linked list.

Applications: A queue, as the name suggests, models a queue at a bus stop or railway station, where the person standing at the front (who has been waiting the longest) is served first. Queues therefore fit any situation where a resource is shared among multiple consumers and served on a first come, first served basis; examples include CPU scheduling and disk scheduling. Another application is transferring data asynchronously between two processes (where data is not necessarily received at the same rate it is sent); examples include I/O buffers, pipes, and file I/O.

Circular Queue

The advantage of a circular queue is that it reduces the wastage of space in an array implementation: the (n+1)-th element is inserted at index 0 if that slot is empty.
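To make the stack described earlier concrete, here is a minimal array-based sketch in C. The fixed capacity, the struct name, and the use of -1 as an error marker are assumptions made for this example only.

#include <stdio.h>

#define CAPACITY 100

/* A fixed-capacity stack backed by an array.
   top is the index of the most recently pushed element, or -1 when empty. */
struct Stack {
    int items[CAPACITY];
    int top;
};

/* push: add an element at the top. Returns 0 on success, -1 if the stack is full. */
int push(struct Stack *s, int value) {
    if (s->top == CAPACITY - 1) return -1;   /* stack overflow */
    s->items[++s->top] = value;
    return 0;
}

/* pop: remove and return the most recently added element (LIFO).
   Returns -1 as a simple error marker if the stack is empty. */
int pop(struct Stack *s) {
    if (s->top == -1) return -1;             /* stack underflow */
    return s->items[s->top--];
}

int main(void) {
    struct Stack s = { .top = -1 };
    push(&s, 1);
    push(&s, 2);
    push(&s, 3);
    while (s.top != -1)
        printf("%d ", pop(&s));              /* prints 3 2 1: last in, first out */
    printf("\n");
    return 0;
}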

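Similarly, the wrap-around behaviour of a circular queue can be sketched in C. This version assumes a fixed array of SIZE slots and tracks a front index plus an element count (rather than separate front and rear pointers); the names and layout are illustrative, not taken from the text.

#include <stdio.h>

#define SIZE 5

/* A circular queue over a fixed array: indices wrap around with % SIZE,
   so a slot freed by dequeue can be reused by a later enqueue. */
struct CircularQueue {
    int items[SIZE];
    int front;    /* index of the first (oldest) element */
    int count;    /* number of stored elements */
};

/* enqueue: add at the rear. Returns 0 on success, -1 if the queue is full. */
int enqueue(struct CircularQueue *q, int value) {
    if (q->count == SIZE) return -1;
    q->items[(q->front + q->count) % SIZE] = value;  /* rear wraps back to index 0 */
    q->count++;
    return 0;
}

/* dequeue: remove from the front (FIFO). Returns -1 if the queue is empty. */
int dequeue(struct CircularQueue *q) {
    if (q->count == 0) return -1;
    int value = q->items[q->front];
    q->front = (q->front + 1) % SIZE;
    q->count--;
    return value;
}

int main(void) {
    struct CircularQueue q = { .front = 0, .count = 0 };
    for (int i = 1; i <= SIZE; i++) enqueue(&q, i);   /* queue is now full */
    dequeue(&q);                                      /* frees index 0 */
    enqueue(&q, 6);                                   /* 6 is stored at index 0 */
    while (q.count > 0) printf("%d ", dequeue(&q));   /* prints 2 3 4 5 6 */
    printf("\n");
    return 0;
}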
Algorithms

In the theoretical analysis of algorithms, it is common to estimate their complexity in the asymptotic sense, i.e., to estimate the complexity function for arbitrarily large inputs. The term "analysis of algorithms" was coined by Donald Knuth. Algorithm analysis is an important part of computational complexity theory, which provides a theoretical estimate of the resources an algorithm requires to solve a specific computational problem. Most algorithms are designed to work with inputs of arbitrary length. Analysis of an algorithm is the determination of the amount of time and space resources required to execute it. Usually, the efficiency or running time of an algorithm is stated as a function relating the input length to the number of steps taken, known as time complexity, or to the volume of memory used, known as space complexity.

The Need for Analysis

In this chapter, we discuss the need for analysis of algorithms and how to choose a better algorithm for a particular problem, since one computational problem can be solved by different algorithms. By studying an algorithm for a specific problem, we can begin to develop pattern recognition, so that similar types of problems can be solved with the help of that algorithm. Algorithms are often quite different from one another, even though their objective is the same. For example, we know that a set of numbers can be sorted using different algorithms. The number of comparisons performed by one algorithm may differ from that of another for the same input, and hence the time complexity of those algorithms may differ. At the same time, we need to calculate the memory space required by each algorithm.

Analysis of an algorithm is the process of analyzing its problem-solving capability in terms of the time and space required (the amount of memory needed for storage during execution). However, the main concern of algorithm analysis is the required time, i.e., performance. Generally, we perform the following types of analysis:

Worst case: the maximum number of steps taken on any input of size n.
Best case: the minimum number of steps taken on any input of size n.
Average case: the average number of steps taken over inputs of size n.

To solve a problem, we need to consider space complexity as well as time complexity, because the program may run on a system where memory is limited but processing time is plentiful, or vice versa. In this context, compare bubble sort and merge sort: bubble sort does not require additional memory, whereas merge sort requires additional space. Although the time complexity of bubble sort is higher than that of merge sort, we may need to use bubble sort if the program must run in an environment where memory is very limited.

In algorithm design, complexity analysis is an essential aspect. Algorithmic complexity is mainly concerned with performance, i.e., how fast or slow the algorithm works. The complexity of an algorithm describes its efficiency in terms of the amount of memory required to process the data and the processing time. The complexity of an algorithm is analyzed from two perspectives: time and space.
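As a simple illustration of best-, worst-, and average-case analysis, consider linear search. The sketch below is an assumed example (not taken from the text); the comments count how many comparisons are performed in each case.

#include <stdio.h>

/* Linear search: return the index of key in a[0..n-1], or -1 if it is absent.
   Best case:    key is at a[0], so 1 comparison is made.
   Worst case:   key is at a[n-1] or not present, so n comparisons are made.
   Average case: the key is equally likely to be at any position, so about n/2
                 comparisons are made on average. */
int linear_search(const int a[], int n, int key) {
    for (int i = 0; i < n; i++) {
        if (a[i] == key)
            return i;
    }
    return -1;
}

int main(void) {
    int a[] = { 7, 3, 9, 1, 5 };
    int n = 5;
    printf("%d\n", linear_search(a, n, 7));  /* best case: found at index 0 */
    printf("%d\n", linear_search(a, n, 5));  /* worst case: found at index n-1 */
    printf("%d\n", linear_search(a, n, 8));  /* worst case: not present, prints -1 */
    return 0;
}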

Time Complexity

Time complexity is a function describing the amount of time required to run an algorithm in terms of the size of its input. "Time" can mean the number of memory accesses performed, the number of comparisons between integers, the number of times some inner loop is executed, or some other natural unit related to the amount of real time the algorithm will take.

Space Complexity

Space complexity is a function describing the amount of memory an algorithm uses in terms of the size of its input. We often speak of the "extra" memory needed, not counting the memory needed to store the input itself. Again, we use natural (but fixed-length) units to measure this. Space complexity is sometimes ignored because the space used is minimal or obvious; however, sometimes it becomes as important an issue as time.

Asymptotic Notations

The execution time of an algorithm depends on the instruction set, processor speed, disk I/O speed, and so on. Hence, we estimate the efficiency of an algorithm asymptotically. The time function of an algorithm is represented by T(n), where n is the input size. Different asymptotic notations are used to represent the complexity of an algorithm. The following asymptotic notations are used to express the running time complexity of an algorithm:

O - Big Oh
Ω - Big Omega
θ - Big Theta
o - Little Oh
ω - Little Omega

O: Asymptotic Upper Bound

O (Big Oh) is the most commonly used notation. A function f(n) is said to be of the order of g(n), written f(n) = O(g(n)), if there exist a positive integer n₀ and a positive constant c such that f(n) ≤ c·g(n) for all n > n₀. Hence, g(n) is an asymptotic upper bound for f(n), since c·g(n) grows at least as fast as f(n) beyond n₀.
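To connect time complexity to actual code, the sketch below (an assumed example, not from the text) counts the basic operations performed by a single loop and by a nested loop. The counts grow linearly and quadratically with the input size n, i.e. O(n) and O(n²) respectively.

#include <stdio.h>

/* Sum of an array: the loop body runs n times, so the running time
   grows linearly with n, i.e. T(n) = O(n). */
long sum_array(const int a[], int n, long *ops) {
    long total = 0;
    for (int i = 0; i < n; i++) {
        total += a[i];
        (*ops)++;                 /* count one basic operation per iteration */
    }
    return total;
}

/* Count pairs (i, j) with i < j: the inner body runs n(n-1)/2 times,
   so the running time grows quadratically with n, i.e. T(n) = O(n^2). */
long count_pairs(int n, long *ops) {
    long pairs = 0;
    for (int i = 0; i < n; i++) {
        for (int j = i + 1; j < n; j++) {
            pairs++;
            (*ops)++;             /* count one basic operation per iteration */
        }
    }
    return pairs;
}

int main(void) {
    int a[1000];
    for (int i = 0; i < 1000; i++) a[i] = 1;

    long ops_linear = 0, ops_quadratic = 0;
    sum_array(a, 1000, &ops_linear);
    count_pairs(1000, &ops_quadratic);

    /* For n = 1000: about 1000 operations vs. about 500000 operations. */
    printf("linear: %ld operations, quadratic: %ld operations\n",
           ops_linear, ops_quadratic);
    return 0;
}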

Let us consider a given function, f(n) = 4n³ + 10n² + 5n + 1. Taking g(n) = n³, we have f(n) ≤ 5·g(n) for all values of n ≥ 11. Hence, the complexity of f(n) can be represented as O(g(n)), i.e. O(n³).

Ω: Asymptotic Lower Bound

We say that f(n) = Ω(g(n)) when there exists a constant c such that f(n) ≥ c·g(n) for all sufficiently large values of n, where n is a positive integer. It means that the function g is a lower bound for the function f; beyond a certain value of n, f never goes below c·g.

Let us consider the same function, f(n) = 4n³ + 10n² + 5n + 1. Taking g(n) = n³, we have f(n) ≥ 4·g(n) for all values of n > 0. Hence, the complexity of f(n) can be represented as Ω(g(n)), i.e. Ω(n³).

θ: Asymptotic Tight Bound

We say that f(n) = θ(g(n)) when there exist constants c₁ and c₂ such that c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all sufficiently large values of n, where n is a positive integer. This means that the function g is a tight bound for the function f.

Let us consider the same function, f(n) = 4n³ + 10n² + 5n + 1. Taking g(n) = n³, we have 4·g(n) ≤ f(n) ≤ 5·g(n) for all n ≥ 11. Hence, the complexity of f(n) can be represented as θ(g(n)), i.e. θ(n³).

o - Notation

The asymptotic upper bound provided by O-notation may or may not be asymptotically tight. The bound 2n² = O(n²) is asymptotically tight, but the bound 2n = O(n²) is not. We use o-notation to denote an upper bound that is not asymptotically tight. We formally define o(g(n)) (little-oh of g of n) as the set of functions f(n) such that, for every positive constant c > 0, there exists a value n₀ > 0 such that 0 ≤ f(n) < c·g(n) for all n > n₀.
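The constants used above can be checked directly. The following worked inequalities, added here for clarity (the threshold n₀ = 11 is our choice for the constant c = 5), verify the upper and lower bounds claimed for f(n) = 4n³ + 10n² + 5n + 1:

\begin{align*}
\text{Upper bound, } n \ge 11:&\quad 5n + 1 \le n^2 \;\Rightarrow\; 10n^2 + 5n + 1 \le 11n^2 \le n^3,\\
&\quad\text{hence } f(n) = 4n^3 + 10n^2 + 5n + 1 \le 4n^3 + n^3 = 5n^3 = 5\,g(n).\\
\text{Lower bound, } n > 0:&\quad f(n) = 4n^3 + 10n^2 + 5n + 1 \ge 4n^3 = 4\,g(n).\\
\text{Together:}&\quad 4\,g(n) \le f(n) \le 5\,g(n) \text{ for all } n \ge 11, \text{ so } f(n) = \theta(n^3).
\end{align*}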

Intuitively, in o-notation the function f(n) becomes insignificant relative to g(n) as n approaches infinity; that is,

lim (n→∞) f(n) / g(n) = 0

Let us consider the same function, f(n) = 4n³ + 10n² + 5n + 1. Taking g(n) = n⁴,

lim (n→∞) (4n³ + 10n² + 5n + 1) / n⁴ = 0

Hence, the complexity of f(n) can be represented as o(g(n)), i.e. o(n⁴).

ω - Notation

We use ω-notation to denote a lower bound that is not asymptotically tight. Formally, we define ω(g(n)) (little-omega of g of n) as the set of functions f(n) such that, for every positive constant c > 0, there exists a value n₀ > 0 such that 0 ≤ c·g(n) < f(n) for all n > n₀. For example, n²/2 = ω(n), but n²/2 ≠ ω(n²). The relation f(n) = ω(g(n)) implies that

lim (n→∞) f(n) / g(n) = ∞

if the limit exists; that is, f(n) becomes arbitrarily large relative to g(n) as n approaches infinity.

Let us consider the same function, f(n) = 4n³ + 10n² + 5n + 1. Taking g(n) = n²,

lim (n→∞) (4n³ + 10n² + 5n + 1) / n² = ∞

Hence, the complexity of f(n) can be represented as ω(g(n)), i.e. ω(n²).
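For completeness, these two limits can be evaluated by dividing the numerator and denominator by the appropriate power of n; this short worked derivation is added here for clarity and is not part of the original text:

\begin{align*}
\lim_{n \to \infty} \frac{4n^3 + 10n^2 + 5n + 1}{n^4}
  &= \lim_{n \to \infty} \left( \frac{4}{n} + \frac{10}{n^2} + \frac{5}{n^3} + \frac{1}{n^4} \right) = 0
  &&\Rightarrow\; f(n) = o(n^4),\\
\lim_{n \to \infty} \frac{4n^3 + 10n^2 + 5n + 1}{n^2}
  &= \lim_{n \to \infty} \left( 4n + 10 + \frac{5}{n} + \frac{1}{n^2} \right) = \infty
  &&\Rightarrow\; f(n) = \omega(n^2).
\end{align*}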