Lecture 7: Primitive Recursion is Turing Computable. Michael Beeson

Closure under composition Let f and g be Turing computable. Let h(x) = f(g(x)). Then h is Turing computable. Similarly if h(x) = f(g_1(x),...,g_n(x)). Proof. In Exercise 7.2, you will show that it suffices to show that h is computable by a 2-tape Turing machine; we call these tapes the main tape and the auxiliary tape. Our new machine M has instructions that do on the auxiliary tape what the machine for g does, while not changing anything on the main tape. Its states all have numbers increased from those in the machine for g by some large number 2N, more than the number N of states in the machine for f. M also will have states that do what the machine for f does, but on the main tape, ignoring the auxiliary tape. These states all have their numbers increased by N; so they still do not overlap the states we already put into M.

Closure under composition, continued Now we add some new states, which cause the following behavior: when M starts, it copies x from the main tape to the auxiliary tape, then transfers control to the start state of g. When (and if) the machine for g terminates, instead of stopping, M will copy the contents of the auxiliary tape (up to the first blank) to the main tape, erasing the auxiliary tape as it goes, and then go back to the left end of the main tape and transfer control to the machine for f. When (and if) the machine for f terminates, M will halt. That completes the proof for simple composition. Generalized composition is treated the same way, but using a machine with n auxiliary tapes, on which we store the values of g_1(x),...,g_n(x).
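
The state-renumbering step in this proof is mechanical. Below is a minimal Python sketch of it, assuming each machine is given as a transition table keyed by (state, symbol) with states numbered from 0; the table format, the helper name shift_states, and the placeholder tables machine_f and machine_g are illustrative assumptions, not notation from the lecture.

    # Hedged sketch: shift every state number in a transition table by a fixed
    # offset, so that two machines can be merged without their states colliding.
    # A table maps (state, symbol) -> (next_state, symbol_to_write, move).

    def shift_states(delta, offset):
        """Return a copy of delta with all state numbers increased by offset."""
        return {
            (state + offset, symbol): (next_state + offset, write, move)
            for (state, symbol), (next_state, write, move) in delta.items()
        }

    # If the machine for f has N states numbered 0..N-1, the combined machine M
    # can contain shift_states(machine_f, N) (occupying states N..2N-1) together
    # with shift_states(machine_g, 2 * N) (occupying states 2N and up), leaving
    # states 0..N-1 free for the new "glue" states that copy x to the auxiliary
    # tape and transfer control.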

Closure under primitive recursion To be proved: Let g and h be Turing computable, and suppose f is defined by

    f(x,0) = g(x)
    f(x,n+1) = h(x, f(x,n))

Then f is Turing computable (in unary notation or in binary notation). We suppose we have Turing machines for g and h. We rename the states so there are no state names in common between these machines. We will construct a machine for f.
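
As a reference point for what the machine must accomplish, here is the recursion scheme itself as a short Python sketch; the names prim_rec and add are illustrative, and this is of course the function being computed, not the Turing machine that computes it.

    def prim_rec(g, h):
        """Return f defined by f(x, 0) = g(x) and f(x, n+1) = h(x, f(x, n))."""
        def f(x, n):
            value = g(x)           # the base case f(x, 0)
            for _ in range(n):     # apply h once per level of the recursion
                value = h(x, value)
            return value
        return f

    # Example: addition, with add(x, 0) = x and add(x, n+1) = add(x, n) + 1.
    add = prim_rec(lambda x: x, lambda x, value: value + 1)
    print(add(3, 4))   # prints 7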

Constructing a machine M for f M should compute f(x,n) when it is started at any square of the tape at which x followed by n is written (using a blank to separate the arguments), without ever altering anything to the left of the initial square. Thus parts of another computation that have been left there will not be disturbed. We will need to assume that there is a symbol, say #, in the alphabet that is not used in the machines for g and h. In Exercise 7.1, you will show how to reduce the size of the alphabet if required. We are using the tape to store the call stack for the recursion, with # delimiting the entries.

An example Suppose g(x) = 4 and h(x,4) = 3 and h(x,3) = 5. Then f(x,2) = h(x,f(x,1)) = h(x,h(x,f(x,0))) = h(x,h(x,g(x))) = h(x,h(x,4)) = h(x,3) = 5
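
Using the prim_rec sketch above, this little example can be checked directly; the dictionary below encodes only the two values of h that the example specifies.

    f = prim_rec(lambda x: 4, lambda x, v: {4: 3, 3: 5}[v])
    print(f("x", 2))   # prints 5, matching h(x, h(x, g(x)))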

Example continued We want to see the following tapes occur in the computation by M; x stands for the specific input, and each tape is shown with a note on what the machine is doing at that point:

    x 11         state 0
    x#x 1        state 0 again
    x#x#x        enter start state of machine for g
    x#x#1111     after computing g(x) = 4
    x#x 1111     enter start state of machine for h (erase # to pop the stack)
    x#111        after computing h(x,4) = 3
    x 111        enter start state of machine for h (stack popped again)
    11111        after computing h(x,3) = 5
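
The following Python sketch reproduces this trace. It is not a Turing machine; it only mimics the stack discipline (push a copy of x, decrement n, compute g at the bottom of the recursion, then repeatedly pop and apply h), printing the tape contents at each stage. The names trace, g, and h and the printing format are illustrative assumptions.

    def trace(x, n, g, h):
        """Print tape snapshots for the stack-on-tape evaluation of f(x, n),
        where f(x, 0) = g(x) and f(x, n+1) = h(x, f(x, n)).
        Here x is a string standing for the input; numbers are written in unary."""
        def unary(k):
            return "1" * k
        def show(stack, work, note):
            print("#".join(stack + [work]).ljust(12), note)

        stack = []                  # frames already pushed, left of the last '#'
        work = x + " " + unary(n)   # the working region (right of the last '#', if any)
        show(stack, work, "state 0")
        while n > 0:                # push: copy x to the right of a new '#', decrement n
            stack.append(x)
            n -= 1
            work = x + (" " + unary(n) if n > 0 else "")
            show(stack, work, "state 0 again" if n > 0 else "enter start state of g")
        value = g(x)                # base case, computed on the topmost frame
        show(stack, unary(value), "after computing g(x) = %d" % value)
        while stack:                # pop: erase the last '#', then run h on "x value"
            top = stack.pop()
            show(stack, top + " " + unary(value), "enter start state of h (stack popped)")
            value = h(x, value)
            show(stack, unary(value), "after computing h = %d" % value)
        return value

    # The example above: g(x) = 4, h(x, 4) = 3, h(x, 3) = 5, so f(x, 2) = 5.
    trace("x", 2, lambda x: 4, lambda x, v: {4: 3, 3: 5}[v])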

The plan for M The machine M works as follows: it moves right across the inputs x = x_1,...,x_m. After M crosses the m-th blank it leaves a marker # in the position of that m-th blank. ("Push") Then it copies the entire string x to the right of #, in the process moving the digits of n over to the right to make room (so in essence inserting x between # and n). Then it subtracts 1 from n (which is very easy in unary notation; computation in binary representation is discussed below) and moves back to the # marker, and then one square to the right. Now we've reduced n by one, and next we'll either recurse or compute the base case.

After reducing n by one Now machine M tests whether n has been reduced to zero: from the square one to the right of #, it moves right across the copied x and the blank that follows it, and looks at the square where the first digit of n would be. If it does not see a blank there, then n has not been reduced to zero, so M returns to the square just right of # and enters its own start state (to recurse). If it does see a blank, that means that the subtraction has reduced n to zero (the empty string). Then M does the following: it enters a state designed to remember that it saw a blank; moves left across the input x (passing m-1 blanks), continuing left until it encounters a blank or a #; then moves right one square and enters the start state of the machine for g. If the machine for g terminates, then M moves left and erases the # (popping the stack). Then if what is to the left is not blank, it means that the computation of g has just completed a recursive call to f. So to the left we have the x of the calling environment.

After completing a recursive call Then M moves left, passing m-1 blanks, until it encounters either another blank (the m-th one) or another #. It moves one square to the right. Then it has x followed by the already-computed value of f(x,n) to its right. It enters the start state of the machine for h. When the machine for h terminates, M does the same as when the machine for g terminates.

Recognizing we're done After g or h terminates, if there is a blank to the left, that was the top-level call. So we're done. That completes the description of M.

Correctness proof for M Now we prove by induction on n that if M is started with a tape containing x followed by n, with blanks separating the arguments and a blank to the left of x, and with the scanned square the leftmost square of x (the "initial square"), then M will terminate with f(x,n) on the tape starting with the scanned square, and without having altered any square to the left of the initial square. The proof is straightforward, following the construction of M. That completes the proof, for unary representation.

Computing in binary In order to compute f in binary representation, there is only one change required, in the part where we need to decrement n; at that point n lies to the right of # and nothing is to the right of n. So it will suffice to invoke at that point a Turing machine that computes the predecessor of n (in binary representation) without altering the tape to the left of the initial scanned square. For the existence of such a machine, we cannot appeal to the fact that predecessor is primitive recursive, since we need that machine in the proof of this theorem. Instead, it must be directly coded. You have been asked to do that in one of the exercises.
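
For concreteness, here is one way such a predecessor machine could look, as a hedged sketch: a tiny Python simulator for a one-tape machine, together with a transition table that decrements a binary number n >= 1 written most-significant bit first. The simulator interface, the state names, and the decision to tolerate a leading zero in the result are my own simplifications, not the lecture's.

    def run_tm(delta, tape, state="scan", pos=0, blank=" "):
        """Run a deterministic one-tape TM.  delta maps (state, symbol) to
        (next_state, symbol_to_write, move), with move one of 'L', 'R', 'S'."""
        cells = dict(enumerate(tape))
        while state != "halt":
            symbol = cells.get(pos, blank)
            state, write, move = delta[(state, symbol)]
            cells[pos] = write
            pos += {"L": -1, "R": 1, "S": 0}[move]
        lo, hi = min(cells), max(cells)
        return "".join(cells.get(i, blank) for i in range(lo, hi + 1)).strip()

    # Predecessor of a binary number n >= 1.  The head starts on the leftmost bit;
    # the machine only writes a blank back onto the blank it finds to the left,
    # so nothing to the left of the initial square is altered.
    pred = {
        ("scan", "0"):   ("scan", "0", "R"),    # move right to the end of the number
        ("scan", "1"):   ("scan", "1", "R"),
        ("scan", " "):   ("borrow", " ", "L"),  # blank after the last bit: start borrowing
        ("borrow", "0"): ("borrow", "1", "L"),  # trailing 0s become 1s
        ("borrow", "1"): ("back", "0", "L"),    # the rightmost 1 becomes 0
        ("back", "0"):   ("back", "0", "L"),    # return to the blank left of the number
        ("back", "1"):   ("back", "1", "L"),
        ("back", " "):   ("halt", " ", "R"),
    }

    print(run_tm(pred, "1100"))   # 12 - 1 = 11: prints 1011
    print(run_tm(pred, "1000"))   # 8 - 1 = 7:  prints 0111 (leading zero left in place)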

A programming project Someone could write some nice computer programs as follows: Input: a set of primitive recursive function indices. These wouldn't be integers but strings. Output: TM code for the functions indexed by the inputs, that could be run in a simulator. Second program: to compute TM indices from human-friendly definition syntax. These are really not that difficult. I'm surprised they don't seem to exist.

Theorem: Every µ-recursive function is Turing computable We already proved closure under primitive recursion and composition. It only remains to prove that the Turing computable functions are closed under the µ-operator. Suppose f(x) = µk P(x,k) and we have a Turing machine M that computes the representing function of P(x,k), without altering anything to the left of the original input. We show that f is computable (in unary or binary) by the following two-tape machine, which has an auxiliary tape and a main tape. We pad the instructions of M (which is a one-tape machine) to do nothing on the auxiliary tape, and we rename the start state of M to avoid confusion with the start state of our new machine; these instructions are part of our new machine.

Programming a Turing machine to search Starting in state loop with x on the input tape, where x = x_1,...,x_n, and k on the auxiliary tape, we move into the start state of M. Then we allow machine M to run, to try to compute P(x,k). If M terminates, having computed that P(x,k) is true, we are almost finished: we just copy k to the main tape, starting at the original scanned square, and end with an extra blank (or if you like, erase everything on the main tape). Then go to the terminating state. If M terminates, having computed that P(x,k) is false, we increment k on the auxiliary tape, erase the main tape and go to state loop. (We have thus completed one loop in the search, and are ready to begin the loop again with the next value of k.) The incrementing of k on the auxiliary tape can be done either in unary or binary (or decimal for that matter) and our machine will compute f in that same representation.
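
Abstracting away the tapes, the search the machine performs is just the following loop. A minimal Python sketch, assuming P is supplied as its representing function, here taken to return 0 for "true" and 1 for "false" (adjust if your convention differs); the names mu and least_root are illustrative.

    def mu(P):
        """Return f with f(x) = the least k such that P(x, k) holds.
        Diverges (loops forever) if there is no such k, just as the machine would."""
        def f(*x):
            k = 0
            while True:
                if P(*x, k) == 0:   # representing function: 0 means P(x, k) is true
                    return k
                k += 1
        return f

    # Example: the least k with k*k >= x.
    least_root = mu(lambda x, k: 0 if k * k >= x else 1)
    print(least_root(10))   # prints 4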

Summary The starting state of our new machine is loop. If the machine is started with blank main tape and blank auxiliary tape, then it clearly computes f(x) in the sense that the output of f(x) appears on the main tape at termination. You may use the technique of Exercise 7.2 to create a one-tape machine equivalent to this 2-tape machine. That completes the proof. This theorem shows at one blow that LOTS of functions are Turing computable.

Questions Is Ackermann's function Turing computable? Is Ackermann's function µ-recursive?