CSE 105 THEORY OF COMPUTATION


CSE 105 THEORY OF COMPUTATION Spring 2016 http://cseweb.ucsd.edu/classes/sp16/cse105-ab/

Today's learning goals (Sipser Ch 3.2, 3.3): Define variants of TMs (enumerators, multi-tape TMs, nondeterministic TMs). Give high-level descriptions of TMs used in constructions. Prove properties of the classes of recognizable and decidable sets. State and use the Church-Turing thesis. Give examples of decidable problems.

Language of a TM (Sipser p. 144): L(M) = { w | M accepts w } = { w | there is a sequence of configurations C_1, ..., C_k of M where C_1 is the start configuration of M on input w, each C_i yields C_{i+1}, and C_k is an accepting configuration }. "The language of M" = "the language recognized by M".

Deciders and recognizers Sipser p. 144 Defs 3.5 and 3.6 L is Turing-recognizable if some Turing machine recognizes it. M is a decider TM if it halts on all inputs. L is Turing-decidable if some Turing machine that is a decider recognizes it.

Closure Theorem: The class of decidable languages over a fixed alphabet Σ is closed under union. Proof: Let L1 and L2 be decidable languages, and let M1 and M2 be TMs deciding them. Construct the TM M as: "On input w, 1. Run M1 on input w. If M1 accepts w, accept. Otherwise, go to 2. 2. Run M2 on input w. If M2 accepts w, accept. Otherwise, reject." Correctness of construction: WTS L(M) = L1 U L2 and M is a decider.
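As a rough illustration (my own sketch, not part of the lecture), here is a minimal Python version of the union construction, assuming each decider is modeled as a total function that returns True or False:

def union_decider(decide_L1, decide_L2):
    """Build a decider for L1 union L2 from deciders for L1 and L2.

    decide_L1 and decide_L2 are assumed to halt on every input
    (they model decider TMs), so the combined machine also halts.
    """
    def decide_union(w):
        if decide_L1(w):      # step 1: run M1 on w
            return True
        return decide_L2(w)   # step 2: run M2 on w
    return decide_union

# Example: L1 = strings of even length, L2 = strings containing "ab"
M = union_decider(lambda w: len(w) % 2 == 0, lambda w: "ab" in w)
print(M("aba"))   # True (contains "ab")
print(M("ccc"))   # False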

Variants of TMs: scratch work, copying the input, etc. → multiple tapes; parallel computation → nondeterminism; printing vs. accepting → enumerators; more flexible transition function → can "stay put", can "get stuck". Lots of examples in the exercises to Chapter 3.

"Equally expressive" Model 1 Model 2 Model 1 is equally expressive as Model 2 iff A. every language recognized by some machine in Model 1 is recognizable by some machine in Model 2, and B. every language recognized by some machine in Model 2 is recognizable by some machine in Model 1.

Multitape TMs (Sipser p. 150): As part of the construction of the machine, declare some finite number k of tapes that are available. Input is given on tape 1; the rest of the tapes start blank. Each tape has its own read/write head. Transition function: δ: Q × Γ^k → Q × Γ^k × {L, R}^k. Sketch of proof of equivalence: A. Given a TM, build a multitape TM recognizing the same language. B. Given a k-tape TM, build a (1-tape) TM recognizing the same language.

Nondeterministic TMs (Sipser p. 152): Transition function: δ: Q × Γ → P(Q × Γ × {L, R}). Sketch of proof of equivalence: A. Given a TM, build a nondeterministic TM recognizing the same language. B. Given a nondeterministic TM, build a (deterministic) TM recognizing the same language. Idea: try all possible branches of the nondeterministic computation. 3 tapes: a "read-only" input tape, a simulation tape, and a tape tracking the nondeterministic branching.
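The deterministic simulation tries every branch of the nondeterministic computation in breadth-first order. Here is a minimal, hypothetical Python sketch of that idea (not the three-tape construction itself), assuming the nondeterministic machine is modeled by a successors function mapping a configuration to the set of configurations reachable in one step:

from collections import deque

def bfs_accepts(start, successors, is_accepting, max_steps=10_000):
    """Breadth-first search over all nondeterministic branches.

    Accepts as soon as any branch reaches an accepting configuration.
    max_steps is only a safeguard for this sketch; a real simulator
    may run forever when no branch accepts (recognizer, not decider).
    """
    frontier = deque([start])
    seen = {start}
    steps = 0
    while frontier and steps < max_steps:
        config = frontier.popleft()
        if is_accepting(config):
            return True
        for nxt in successors(config):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
        steps += 1
    return False

# Toy example: configurations are integers; from n we may move to n+2 or n+3.
# "Accepting" means reaching exactly 7. The branch 0 -> 2 -> 4 -> 7 accepts.
print(bfs_accepts(0, lambda n: {n + 2, n + 3} if n < 7 else set(),
                  lambda n: n == 7))   # True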

Enumerators: What about machines that produce output rather than accept input? [Picture: finite state control attached to an unlimited work tape and a printer.] Computation proceeds according to the transition function. At any point, the machine may "send" a string to the printer. L(E) = { w | E eventually, in finite time, prints w }.

Enumerators: Can L(E) be infinite? A. No, strings must be printed in finite time. B. No, strings must all be finite length. C. Yes, it may happen if E does not halt. D. Yes, all L(E) are infinite. E. I don't know.

Set of all strings "For each Σ, there is an enumerator whose language is the set of all strings over Σ." A. True B. False C. Depends on Σ. D. I don't know.

Set of all strings "For each Σ, there is an enumerator whose language is the set of all strings over Σ." A. True B. False C. Depends on Σ. D. I don't know. Lexicographic ordering: order strings first by length, then dictionary order. (p. 14)
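As an aside (my own sketch, not from the slides), an enumerator for all of Σ* can print strings in exactly this length-then-dictionary order; a hypothetical Python generator version:

from itertools import count, product

def all_strings(sigma):
    """Yield every string over the alphabet sigma, shortest first,
    and in dictionary order within each length."""
    for length in count(0):                      # lengths 0, 1, 2, ...
        for tup in product(sorted(sigma), repeat=length):
            yield "".join(tup)

# First few strings over {0, 1}: "", "0", "1", "00", "01", "10", "11", ...
gen = all_strings({"0", "1"})
print([next(gen) for _ in range(8)])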

Recognition and enumeration Sipser Theorem 3.21 Theorem: A language L is Turing-recognizable iff some enumerator enumerates L. Proof: A. Assume L is Turing-recognizable. WTS some enumerator enumerates it. B. Assume L is enumerated by some enumerator. WTS L is Turing-recognizable.

Recognition and enumeration (Sipser Theorem 3.21). A. Assume L is Turing-recognizable. WTS some enumerator enumerates it. Let M be a TM that recognizes L. We'll use M in a subroutine for a high-level description of the enumerator E. Let s1, s2, ... be a list of all strings in Σ*. Define E as follows: E = "On input w, ignore the input and repeat the following for each value of i = 1, 2, 3, ...: 1. Run M for i steps on each of the inputs s1, ..., si. 2. If any of these i computations of M accepts, print out the accepted string." Correctness?
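A rough Python sketch of this dovetailing idea (my own illustration), assuming run_for_steps(s, i) reports whether the recognizer M accepts s within i steps, and strings is an infinite listing of Σ* (e.g. in length-then-dictionary order):

from itertools import count

def enumerate_language(run_for_steps, strings):
    """Enumerator E built from a recognizer M (Sipser Thm 3.21, direction A).

    run_for_steps(s, i) -> True if M accepts s within i steps (assumption).
    Runs forever, like an enumerator; every string accepted by M is
    eventually printed (possibly more than once).
    """
    listed = []                     # the strings s1, ..., si seen so far
    it = iter(strings)
    for i in count(1):              # i = 1, 2, 3, ...
        listed.append(next(it))     # extend the list to s1, ..., si
        for s in listed:            # run M for i steps on each of s1, ..., si
            if run_for_steps(s, i):
                print(s)            # "send" the accepted string to the printer

The standard refinement, printing each string at most once, just keeps a set of already-printed strings.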

Recognition and enumeration (Sipser Theorem 3.21). B. Assume the enumerator E enumerates L. WTS L is Turing-recognizable. We'll use E in a subroutine for a high-level description of a Turing machine M that will recognize L. Define M as follows: M = "On input w, 1. Run E. Every time E prints a string, compare it to w. 2. If w ever appears as the output of E, accept." Correctness?
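And a matching sketch for the other direction (again my own illustration), assuming enumerate_strings() is a generator yielding the strings E prints, in the order it prints them:

def recognizer_from_enumerator(enumerate_strings):
    """Recognizer M built from an enumerator E (Sipser Thm 3.21, direction B).

    M accepts w iff E ever prints w; if it never does, M runs forever,
    which is fine for a recognizer (it need not be a decider).
    """
    def M(w):
        for printed in enumerate_strings():  # run E, watch its output
            if printed == w:                 # w appeared: accept
                return True
        return False                         # only reached if E halts
    return M

# Usage with a toy enumerator that prints 0, 00, 000, ...
def zeros():
    s = "0"
    while True:
        yield s
        s += "0"

M = recognizer_from_enumerator(zeros)
print(M("000"))   # True
# M("1") would loop forever, mirroring a recognizer that never accepts.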

Variants of TMs: scratch work, copying the input, etc. → multiple tapes; parallel computation → nondeterminism; printing vs. accepting → enumerators; more flexible transition function → can "stay put", can "get stuck". Also: wildly different models (λ-calculus, Post canonical systems, URMs, etc.). All these models are equally expressive! Lots of examples in the exercises to Chapter 3.

Algorithm. Wikipedia: a "self-contained step-by-step set of operations to be performed". CSE 20 textbook: "An algorithm is a finite sequence of precise instructions for performing a computation or for solving a problem." Church-Turing thesis: each algorithm can be implemented by some Turing machine.

Some algorithms Examples of algorithms / algorithmic problems: 1. Recognize whether a string is a palindrome. 2. Reverse a string. 3. Recognize Pythagorean triples. 4. Compute the gcd of two positive integers. 5. Check whether a string is accepted by a DFA. 6. Convert a regular expression to an equivalent NFA. 7. Check whether the language of a PDA is infinite.

Which of the following is true about the algorithms / algorithmic problems listed above? A. All these algorithms have inputs of the same type. B. The inputs of each of these algorithms can be encoded as finite strings. C. Some of these problems don't have algorithmic solutions. D. I don't know.

Encoding input for TMs (Sipser p. 159). By definition, TM inputs are strings. To define a TM M: "On input w, 1. ... 2. ... 3. ..." For inputs that aren't strings, we have to encode the object (represent it as a string) first. Notation: <O> is the string that represents (encodes) the object O; <O1, ..., On> is the single string that represents the tuple of objects O1, ..., On.

Encoding inputs. Payoff: problems we care about can be reframed as languages of strings. e.g. "Recognize whether a string is a palindrome." { w | w in {0,1}* and w = w^R } e.g. "Recognize Pythagorean triples." { <a,b,c> | a, b, c are integers and a^2 + b^2 = c^2 } e.g. "Check whether a string is accepted by a DFA." { <B,w> | B is a DFA over Σ, w in Σ*, and w is in L(B) } e.g. "Check whether the language of a PDA is infinite." { <A> | A is a PDA and L(A) is infinite }
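As a concrete, made-up illustration of encoding non-string inputs, one simple convention is to write a tuple of integers as a comma-separated decimal string; a Python sketch of the Pythagorean-triple language under that hypothetical encoding:

def in_pythag_language(encoded):
    """Decide { <a,b,c> | a, b, c integers and a^2 + b^2 = c^2 }
    under an assumed encoding: the literal string "<a,b,c>" with decimal integers."""
    if not (encoded.startswith("<") and encoded.endswith(">")):
        return False                      # not a valid encoding: reject
    parts = encoded[1:-1].split(",")
    if len(parts) != 3:
        return False
    try:
        a, b, c = (int(p) for p in parts)
    except ValueError:
        return False
    return a * a + b * b == c * c

print(in_pythag_language("<3,4,5>"))    # True
print(in_pythag_language("<1,2,3>"))    # False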

1-variable Hilbert's 10th problem. D = { <p> | p is a one-variable polynomial with integer coefficients which has an integer root }. What's a high-level description of an algorithm that recognizes this set? M = "On input <p>, where p is a polynomial over the variable x: 1. For i = 0, -1, +1, -2, +2, ...: 2. Calculate p(i). If it evaluates to 0, accept the input. Otherwise, go to the next value of i."
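A minimal Python sketch of M's search (my own illustration), representing the polynomial by its list of coefficients rather than a string encoding:

from itertools import count

def integer_candidates():
    """Yield 0, -1, +1, -2, +2, ..., covering every integer."""
    yield 0
    for n in count(1):
        yield -n
        yield n

def has_integer_root(coeffs):
    """Search for an integer root of p(x) = coeffs[0] + coeffs[1]*x + ...

    Mirrors M: accepts (returns True) if some p(i) = 0, but runs forever
    when p has no integer root, so this is a recognizer, not a decider.
    """
    for i in integer_candidates():
        value = sum(c * i**k for k, c in enumerate(coeffs))
        if value == 0:
            return True        # found a root: accept

print(has_integer_root([-6, 1, 1]))   # x^2 + x - 6 = (x+3)(x-2): True
# has_integer_root([1, 0, 1]) would loop forever (x^2 + 1 has no integer root).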

1-variable Hilbert's 10th problem, continued. With M defined as above, is M a decider? A. Yes B. No C. It's not a TM so we can't tell. D. I don't know.

Computational problems. A computational problem is decidable iff the language encoding its problem instances is decidable.

Computational problems. Sample computational problems and their encodings: A_DFA, "Check whether a string is accepted by a DFA": { <B,w> | B is a DFA over Σ, w in Σ*, and w is in L(B) }. E_DFA, "Check whether the language of a DFA is empty": { <A> | A is a DFA over Σ and L(A) is empty }. EQ_DFA, "Check whether the languages of two DFAs are equal": { <A,B> | A and B are DFAs over Σ and L(A) = L(B) }. FACT: all of these problems are decidable!

Proving decidability. Claim: A_DFA is decidable. Proof: WTS that { <B,w> | B is a DFA over Σ, w in Σ*, and w is in L(B) } is decidable. Idea: give a high-level description. Step 1: construction. How would you check if w is in L(B)?

Proving decidability. Claim: A_DFA is decidable. Proof: WTS that { <B,w> | B is a DFA over Σ, w in Σ*, and w is in L(B) } is decidable. Idea: give a high-level description. Step 1: construction. Define TM M1 by: M1 = "On input <B,w>: 1. Check whether B is a valid encoding of a DFA and w is a valid input for B. If not, reject. 2. Simulate running B on w (by keeping track of the current state of B, the transition function of B, etc.). 3. When the simulation ends, after all of w has been processed, check the current state of B: if it is final, accept; if it is not, reject."

Proving decidability. Step 1: construction. Define TM M1 by: M1 = "On input <B,w>: 1. Check whether B is a valid encoding of a DFA and w is a valid input for B. If not, reject. 2. Simulate running B on w (by keeping track of the current state of B, the transition function of B, etc.). 3. When the simulation ends, after all of w has been processed, check the current state of B: if it is final, accept; if it is not, reject." Step 2: correctness proof. WTS (1) L(M1) = A_DFA and (2) M1 is a decider.
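A small Python sketch of M1's simulation step (my own illustration), representing the DFA B directly as a transition table rather than a string encoding, and skipping the encoding-validity check of step 1:

def dfa_accepts(dfa, w):
    """Simulate the DFA B on the string w (step 2 of M1).

    dfa is a dict with keys 'start', 'accept' (a set of states), and
    'delta' mapping (state, symbol) -> state. Assumes <B,w> has already
    been checked to be a valid encoding (step 1 of M1).
    """
    state = dfa["start"]
    for symbol in w:                       # keep track of the current state
        state = dfa["delta"][(state, symbol)]
    return state in dfa["accept"]          # step 3: accept iff the state is final

# Example DFA over {0,1} accepting strings with an even number of 1s.
B = {
    "start": "even",
    "accept": {"even"},
    "delta": {("even", "0"): "even", ("even", "1"): "odd",
              ("odd", "0"): "odd", ("odd", "1"): "even"},
}
print(dfa_accepts(B, "1011"))   # False (three 1s)
print(dfa_accepts(B, "1001"))   # True  (two 1s)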

Proving decidability. Claim: E_DFA is decidable. Proof: WTS that { <A> | A is a DFA over Σ and L(A) is empty } is decidable. Idea: give a high-level description. Step 1: construction. What condition distinguishes between DFAs that accept *some* string and those that don't accept *any*?

Proving decidability. Claim: E_DFA is decidable. Proof: WTS that { <A> | A is a DFA over Σ and L(A) is empty } is decidable. Idea: give a high-level description. Step 1: construction. Breadth-first search in the transition diagram to look for a path from the start state to an accepting state. What condition distinguishes between DFAs that accept *some* string and those that don't accept *any*?

Proving decidability. Claim: E_DFA is decidable. Proof: WTS that { <A> | A is a DFA over Σ and L(A) is empty } is decidable. Idea: give a high-level description. Step 1: construction. Define TM M2 by: M2 = "On input <A>: 1. Check whether A is a valid encoding of a DFA; if not, reject. 2. Mark the start state of A. 3. Repeat until no new states get marked: i. Loop over the states of A and mark any unmarked state that has an incoming edge from a marked state. 4. If no final state of A is marked, accept; otherwise, reject."

Proving decidability, continued. M2 will mark: A. all the states of the DFA A B. all the states of the DFA A that are reachable from the start state C. some states in the DFA more than once D. I don't know. Step 2: correctness proof. WTS (1) L(M2) = E_DFA and (2) M2 is a decider.
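A Python sketch of M2's marking loop (my own illustration), again working directly on the transition-table representation of the DFA A used in the earlier sketch:

def dfa_language_is_empty(dfa):
    """Decide E_DFA for one DFA: is L(A) empty?

    Marks the start state, then repeatedly marks any state with an
    incoming edge from a marked state, until nothing new gets marked
    (steps 2-3 of M2). Accepts iff no accepting state is marked (step 4).
    """
    marked = {dfa["start"]}                      # step 2: mark the start state
    changed = True
    while changed:                               # step 3: repeat until stable
        changed = False
        for (state, _symbol), target in dfa["delta"].items():
            if state in marked and target not in marked:
                marked.add(target)
                changed = True
    return marked.isdisjoint(dfa["accept"])      # step 4

# The even-number-of-1s DFA from before accepts "", so its language is nonempty.
B = {
    "start": "even",
    "accept": {"even"},
    "delta": {("even", "0"): "even", ("even", "1"): "odd",
              ("odd", "0"): "odd", ("odd", "1"): "even"},
}
print(dfa_language_is_empty(B))   # False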

Proving decidability. Claim: EQ_DFA is decidable. Proof: WTS that { <A,B> | A and B are DFAs over Σ and L(A) = L(B) } is decidable. Idea: give a high-level description. Step 1: construction. Will we be able to simulate A and B? What does set equality mean? Can we use our previous work?

Proving decidability. Claim: EQ_DFA is decidable. Proof: WTS that { <A,B> | A and B are DFAs over Σ and L(A) = L(B) } is decidable. Idea: give a high-level description. Step 1: construction. Very high-level: build a new DFA recognizing the symmetric difference of L(A) and L(B). Check whether this set is empty.

Proving decidability. Claim: EQ_DFA is decidable. Proof: WTS that { <A,B> | A and B are DFAs over Σ and L(A) = L(B) } is decidable. Idea: give a high-level description. Step 1: construction. Define TM M3 by: M3 = "On input <A,B>: 1. Check whether A and B are valid encodings of DFAs; if not, reject. 2. Construct a new DFA D from A and B, using the algorithms for complementation and union of regular languages, such that L(D) is the symmetric difference of L(A) and L(B). 3. Run machine M2 on <D>. 4. If it accepts, accept; if it rejects, reject."

Proving decidability. Step 1: construction. Define TM M3 by: M3 = "On input <A,B>: 1. Check whether A and B are valid encodings of DFAs; if not, reject. 2. Construct a new DFA D from A and B, using the algorithms for complementation and union of regular languages, such that L(D) is the symmetric difference of L(A) and L(B). 3. Run machine M2 on <D>. 4. If it accepts, accept; if it rejects, reject." Step 2: correctness proof. WTS (1) L(M3) = EQ_DFA and (2) M3 is a decider.
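A Python sketch of M3's step 2 (my own illustration): the standard product construction yields a DFA D for the symmetric difference of L(A) and L(B), which can then be fed to the emptiness check dfa_language_is_empty from the earlier sketch. The dict representation of DFAs is the same assumed format as before.

def symmetric_difference_dfa(A, B, alphabet):
    """Product construction: D accepts exactly the strings on which
    A and B disagree, i.e. the symmetric difference of L(A) and L(B)."""
    delta = {}
    start = (A["start"], B["start"])
    states = {start}
    frontier = [start]
    while frontier:
        (p, q) = frontier.pop()
        for a in alphabet:
            nxt = (A["delta"][(p, a)], B["delta"][(q, a)])
            delta[((p, q), a)] = nxt
            if nxt not in states:
                states.add(nxt)
                frontier.append(nxt)
    accept = {(p, q) for (p, q) in states
              if (p in A["accept"]) != (q in B["accept"])}
    return {"start": start, "accept": accept, "delta": delta}

# L(A) = L(B) iff the symmetric-difference DFA has empty language, so M3 is:
# D = symmetric_difference_dfa(A, B, alphabet); accept iff dfa_language_is_empty(D).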

[Picture: nested language classes: Regular ⊂ Context-Free ⊂ Decidable ⊂ Turing-Recognizable.]

Reminders: HW5 due Friday May 6. Additional office hours. RQ6 due Monday May 9. Exam 2 Wednesday May 11 (one week from yesterday); study guide available on TritonEd; same time, 8pm-10pm; same place, SOLIS 107; new seating chart: see TritonEd. Review session Monday May 9, 8pm-10pm, PETERSON 108.