Complexity
Complexity
Complexity is the mathematical relationship between:
1) some measurement of the task size, and
2) some measurement of the effort required to complete the task.

A Good Deal?!
You were asked to paint a fence that was 2m high and 100m long, and you were paid $20. It didn't take long and you were happy with the deal. Then you were asked to paint another fence that is 2m high and 400m long for the price of $80. Should you accept it?
A Good or Bad Deal?!
You were asked to mow a circular lawn of radius 4m for the price of $50. You got it done in no time and you were happy with the deal. You are now asked to mow another circular lawn of radius 8m for the price of $100. Should you accept it? Why or why not?

The store charges $2 to fill a balloon of 4cm diameter. You are asked to pay $4 for a balloon of 8cm diameter. Should you agree? Why or why not?
Analysing the Effort
Painting the fence: the work involved is directly proportional to the area of the fence. Therefore, getting four times as much money to paint a fence that is four times the area is fair.

Mowing the lawn: the work involved is directly proportional to the area of the lawn. Doubling the radius quadruples the area (pi * 8^2 = 4 * pi * 4^2), so getting only twice as much money for four times the work is not fair.

Price of the balloon: the work involved is directly proportional to the volume, not the diameter. The volume of the 8cm-diameter balloon is 4/3 * pi * 4^3 and the volume of the 4cm-diameter balloon is 4/3 * pi * 2^3, i.e. eight times as much air for only twice the price.
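The scaling arguments above can be checked numerically. A minimal sketch (the function names are mine, not from the slides):

```haskell
-- Work scales with area for mowing and with volume for balloons.
circleArea :: Double -> Double
circleArea r = pi * r * r

sphereVolume :: Double -> Double
sphereVolume r = (4 / 3) * pi * r ^ (3 :: Int)

-- Ratio of the work when the radius doubles:
mowRatio :: Double
mowRatio = circleArea 8 / circleArea 4          -- four times the mowing

balloonRatio :: Double
balloonRatio = sphereVolume 4 / sphereVolume 2  -- eight times the air
```

So paying $100 for the 8m lawn undercompensates by a factor of two, while paying $4 for the 8cm balloon is a bargain.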
Note:
- Complexity ignores constants in the relationship
- Complexity ignores lower-order terms in the relationship
- Complexity is represented by Big-O notation

The complexity of painting the fence, with respect to task size = the length of the fence n, and effort = the time taken to paint it, is O(n).
The complexity of blowing up balloons, with respect to task size = the diameter of the balloon n, and effort = the effort to blow them up, is O(n^3).
Different Complexity Values
Here are some examples of different complexity values, from the least to the most complex.

Big-O        Informal name
O(k)         constant
O(log n)     logarithmic
O(n)         linear
O(n log n)   n log n
O(n^2)       quadratic
O(n^3)       cubic
O(2^n)       exponential
Complexity Analysis
Basic rules:
- The complexity of a recursive function is N x M, where N = the number of recursive calls needed for the function to terminate, and M = the amount of work done inside each recursive call.
- The complexity of a composition (rec-func1 . rec-func2) is the maximum of the complexities of rec-func1 and rec-func2.
- When giving a complexity value, the measurements on which it is based must also be given; otherwise it is meaningless.
x ++ y ("append"):

append [ ] list = list
append (x:xs) list = x : append xs list

append [1,2,3] [4,5]
= 1 : append [2,3] [4,5]
= 1 : (2 : append [3] [4,5])
= 1 : (2 : (3 : [4,5]))
= 1 : (2 : [3,4,5])
= 1 : [2,3,4,5]
= [1,2,3,4,5]

N = n, the number of elements in the left-hand argument
M = constant
Therefore: O(n) with respect to the left-hand argument, O(k) with respect to the right-hand argument.
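The definition above runs as-is in Haskell; the trace shows it does one cons per element of the first list and never inspects the second:

```haskell
-- append walks only its LEFT argument: one recursive call per
-- element of the first list, constant work per call.
append :: [a] -> [a] -> [a]
append []     list = list
append (x:xs) list = x : append xs list
```

This is why `xs ++ ys` costs O(length xs) regardless of how long `ys` is.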
Graph Related to Complexity
[Figure: growth curves of the common complexity classes plotted against n]
How bad is exponential complexity?

n      2^n
1      2
10     1024
20     1048576
50     1125899906842624
100    1267650600228229401496703205376

Number of nodes    Number of edges (fully connected graph)
0 nodes            0 edges
1 node             0 edges
2 nodes            1 edge
3 nodes            3 edges
4 nodes            6 edges
5 nodes            10 edges
6 nodes            15 edges
etc.               etc.

numberofedges 1 = 0
numberofedges n = (n-1) + numberofedges (n-1)
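The recurrence above runs directly in Haskell and can be checked against the closed form n(n-1)/2 (the camel-cased names are mine):

```haskell
-- Edges in a fully connected graph, as the recurrence above:
-- adding an n-th node adds n-1 new edges.
numberOfEdges :: Int -> Int
numberOfEdges 1 = 0
numberOfEdges n = (n - 1) + numberOfEdges (n - 1)

-- The same quantity in closed form: n choose 2.
numberOfEdgesClosed :: Int -> Int
numberOfEdgesClosed n = n * (n - 1) `div` 2
```

Both give the column of the table: 0, 1, 3, 6, 10, 15, ...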
How many steps would it take (in the worst case) to decide whether 15 is in a list such as [23,18,40,10,20,8,15,32,25,37,45]? Unsorted, we must examine every element: n steps in the worst case.

What if the list was sorted (or stored in a BINARY SEARCH TREE)?
In the 1st step we remove m = n/2 possibilities from the search space.
In the 2nd step we remove m/2 (i.e. (n/2)/2) possibilities.
Etc.
Repeatedly halving ((((n/2)/2)/2).../2) exhausts the search space after log n steps in the worst case. Very fast search. Of course, it assumes the tree is a binary search tree (i.e. the original list has been sorted).
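A sketch of the halving search on a binary search tree (the tree type and function names here are my own, not from the slides):

```haskell
-- Membership test on a binary search tree: each comparison
-- discards half of the remaining search space, giving O(log n)
-- steps when the tree is balanced.
data Tree = Leaf | Node Tree Int Tree

member :: Int -> Tree -> Bool
member _ Leaf = False
member e (Node l x r)
  | e == x    = True
  | e < x     = member e l   -- discard the right subtree
  | otherwise = member e r   -- discard the left subtree

-- Build a search tree by repeated insertion.
insertT :: Int -> Tree -> Tree
insertT e Leaf = Node Leaf e Leaf
insertT e t@(Node l x r)
  | e < x     = Node (insertT e l) x r
  | e > x     = Node l x (insertT e r)
  | otherwise = t
```

For example, `member 15 (foldr insertT Leaf [23,18,40,10,20,8,15,32,25,37,45])` finds 15 without visiting every element.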
Complexity of programs

p1 x = 2 * x
Assume constant time for multiplication. Therefore O(k).

p2 x = (5 * 8) + (6 * x)
Therefore O(k).

p3 0 = 1
p3 n = n * p3 (n-1)
N = n, M = constant. Therefore O(n), where n is the input size.
p4 0 = 4 * 67 * 2
p4 n = 6 * (n * p4 (n-1))
Assume constant time for multiplication. Therefore O(n), where n is the input size.

Complexity of reverse:
rev [ ] = [ ]
rev (x:xs) = rev xs ++ [x]
N = n recursive calls
M = O(n), since (++) depends on the length of its left argument
Therefore O(n^2).
Complexity of composed programs
p . q . z has the complexity of whichever of p, q, or z has the highest complexity.
For example, (++ [4]) . rev has complexity O(n^2).
The following has the same complexity: rev . (4:)
Why? Because rev is O(n^2), and both (++ [4]) (which is O(n) in its left argument) and (4:) (a constant-time cons) are dominated by it.
Complexity Analysis

sq n = n * n
Total number of calls = 1; cost inside every call = 1. O(1)

sumlist [ ] = 0
sumlist (e:es) = e + sumlist es
Total number of calls = n; cost inside every call = 1. O(n)

map f [ ] = [ ]
map f (e:es) = (f e) : map f es
Total number of calls = n; cost inside every call = 1. O(n)

sumlist . map sq is therefore O(n).

fact 0 = 1
fact n = n * fact (n-1)
Total number of calls = n; cost inside every call = 1. O(n)

factlist 0 = [1]
factlist n = fact n : factlist (n-1)
Total number of calls = n; cost inside every call = n. O(n^2)
Complexity Analysis

(++) [ ] ys = ys
(++) (x:xs) ys = x : (++) xs ys
Total number of calls = n (where n = #xs); cost inside every call = 1. O(n)

rev [ ] = [ ]
rev (x:xs) = (rev xs) ++ [x]
Total number of calls = n; cost inside every call = n. O(n^2)

pc 1 = [1]
pc n = 1 : pc (n/2)
Total number of calls = log2 n; cost inside every call = 1. O(log2 n)

Reducing Complexity

O(n^2):
reverse [ ] = [ ]
reverse (x:xs) = reverse xs ++ [x]

O(n), using an accumulating parameter:
reverse xs = reverse2 xs [ ]
reverse2 [ ] c = c
reverse2 (x:xs) c = reverse2 xs (x:c)
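The two versions of reverse can be run side by side to confirm they agree; a minimal sketch (I rename them to avoid clashing with the Prelude's reverse):

```haskell
-- Naive O(n^2) reverse: each (++) re-walks the already-reversed
-- suffix, so the total work is 1 + 2 + ... + n.
revNaive :: [a] -> [a]
revNaive []     = []
revNaive (x:xs) = revNaive xs ++ [x]

-- O(n) reverse: the accumulator c carries the partial result,
-- so each element is consed exactly once.
revAcc :: [a] -> [a]
revAcc xs = go xs []
  where
    go []     c = c
    go (y:ys) c = go ys (y:c)
```

The accumulating parameter trades the repeated (++) traversals for a single pass, which is the same trick used for factlist and binseq on the next slides.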
Reducing Time Complexity

O(n^2):
factlist 0 = [1]
factlist n = (factorial n) : factlist (n-1)
             where factorial 0 = 1
                   factorial n = n * factorial (n-1)

O(n):
factlist n = factlist2 n (factorial n)
             where factorial 0 = 1
                   factorial n = n * factorial (n-1)
                   factlist2 0 inp = [inp]
                   factlist2 n inp = inp : factlist2 (n-1) (inp/n)
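A quick check that the O(n) version produces the same list as the O(n^2) one (the Slow/Fast names are mine; integer division replaces the slide's /):

```haskell
factorial :: Integer -> Integer
factorial 0 = 1
factorial n = n * factorial (n - 1)

-- O(n^2): recomputes the factorial from scratch at every step.
factlistSlow :: Integer -> [Integer]
factlistSlow 0 = [1]
factlistSlow n = factorial n : factlistSlow (n - 1)

-- O(n): computes n! once, then derives each smaller factorial
-- from the previous one by a single division.
factlistFast :: Integer -> [Integer]
factlistFast n = go n (factorial n)
  where
    go 0 inp = [inp]
    go k inp = inp : go (k - 1) (inp `div` k)
```

Both produce [n!, (n-1)!, ..., 1!, 0!]; the fast version does constant work per element.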
Reducing Time Complexity

O(n^2):
binseq [ ] = 0
binseq (b:bs) = b * 2^(len bs) + binseq bs
                where len [ ] = 0
                      len (n:ns) = 1 + len ns

O(n):
binseq seq = binseq2 seq (len seq)
             where len [ ] = 0
                   len (n:ns) = 1 + len ns
                   binseq2 [ ] inp = 0
                   binseq2 (b:bs) inp = b * 2^(inp-1) + binseq2 bs (inp-1)
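A runnable sketch of both versions, checked on the bit string 1011 (= 11); the Slow/Fast names are mine, and I use the Prelude's length in place of the hand-written len:

```haskell
-- O(n^2): the length of the tail is recomputed at every step.
binseqSlow :: [Int] -> Int
binseqSlow []     = 0
binseqSlow (b:bs) = b * 2 ^ length bs + binseqSlow bs

-- O(n): the remaining length is computed once and passed down.
binseqFast :: [Int] -> Int
binseqFast bits = go bits (length bits)
  where
    go []     _ = 0
    go (b:bs) k = b * 2 ^ (k - 1) + go bs (k - 1)
```

As with factlist, the extra parameter carries information that the slow version kept recomputing.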
Sorting

insert e [ ] = [e]
insert e (x:xs) = e : (x:xs), if e <= x
                = x : (insert e xs), otherwise

sort [ ] = [ ]
sort (x:xs) = insert x (sort xs)

Best case: n x 1 = O(n)
Worst case: n x n = O(n^2)

qsort [ ] = [ ]
qsort (x:xs) = qsort [a | a <- xs; a <= x] ++ [x] ++ qsort [a | a <- xs; a > x]
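The same two sorts translated into Haskell syntax (guards instead of the ", if ... otherwise" style, and commas in comprehensions); a sketch, with my own names to keep the two sorts apart:

```haskell
-- Insertion sort: O(n) best case, O(n^2) worst case,
-- matching the N x M analysis above.
insertE :: Ord a => a -> [a] -> [a]
insertE e []     = [e]
insertE e (x:xs)
  | e <= x    = e : x : xs
  | otherwise = x : insertE e xs

isort :: Ord a => [a] -> [a]
isort []     = []
isort (x:xs) = insertE x (isort xs)

-- Quicksort: O(n log n) on average; O(n^2) in the worst case,
-- e.g. an already-sorted list, where the head pivot splits badly.
qsort :: Ord a => [a] -> [a]
qsort []     = []
qsort (x:xs) = qsort [a | a <- xs, a <= x] ++ [x] ++ qsort [a | a <- xs, a > x]
```

For example, `qsort [23,18,40,10,20,8,15,32,25,37,45]` sorts the list used in the search example earlier.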