PDAs and Formal Languages. Automata Theory CS 573. Outline of the equivalence of PDAs and CFGs. (see Theorem 5.3)


CS 573 Automata Theory and Formal Languages, Professor Leslie Lander, Lecture #20, November 13, 2000.

Greibach Normal Form (GNF). Sheila Greibach's normal form (GNF) for a CFG is one where EVERY production has the form A → aα, where a ∈ T and α ∈ V*. We saw this construction previously.

PDAs: outline of the equivalence of PDAs and CFGs.

CFG to PDA - 1. Given a CFG G = (V, T, S, P) in GNF, a PDA for L(G) is ({q}, T, V, δ, q, S, ∅), where δ(q, a, A) = {(q, γ) : (A → aγ) ∈ P} (see Theorem 5.3).
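As a concrete illustration of this construction, here is a minimal Python sketch (my own code, not the lecture's): it turns the productions of a grammar in GNF into the transition function of the single-state PDA, which accepts by empty stack. The function name and the sample grammar are assumptions made for the example.

```python
# A minimal sketch (my own illustration, not the lecture's code) of the GNF
# construction above: every production A -> a gamma becomes the PDA move
# delta(q, a, A) containing (q, gamma), with one state q and acceptance by
# empty stack.

def gnf_to_pda_delta(productions, state="q"):
    """productions: iterable of (A, a, gamma) triples meaning A -> a gamma,
    with A a variable, a a terminal, gamma a string of variables ('' allowed).
    Returns delta as a dict (state, a, A) -> set of (state, gamma)."""
    delta = {}
    for A, a, gamma in productions:
        delta.setdefault((state, a, A), set()).add((state, gamma))
    return delta

# Illustrative GNF grammar (not from the slides): S -> aSB | aB, B -> b,
# which generates { a^n b^n : n >= 1 }.
delta = gnf_to_pda_delta([("S", "a", "SB"), ("S", "a", "B"), ("B", "b", "")])
assert delta[("q", "a", "S")] == {("q", "SB"), ("q", "B")}
```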

CFG to PDA - 2. Example: the GNF grammar
S → aS | aAA
A → b | bSS | aSA
gives the transitions
δ(q, a, S) = {(q, S), (q, AA)}
δ(q, b, A) = {(q, ε), (q, SS)}
δ(q, a, A) = {(q, SA)}
and aabb is derived from S.

CFG to PDA - 3. Running the PDA on aabb:
(q, aabb, S) ⊢ (q, abb, S) or (q, abb, AA)
⊢ (q, bb, S) or (q, bb, AA) or fail
⊢ fail or (q, b, A) or (q, b, SSA)
⊢ (q, ε, ε) or (q, ε, SS) or fail
If the PDA were deterministic, this is the basis of a shift-reduce parser.

PDA to CFG. Given a PDA M = (Q, Σ, Γ, δ, q0, Z0, F), build a GNF grammar as follows: S → [q0, Z0, q] for each q in Q; for each possible choice of transition δ(q, a, A) ∋ (q1, B1 B2 … Bm), choose any states q2, …, q_{m+1} in Q and create the production rule
[q, A, q_{m+1}] → a [q1, B1, q2] [q2, B2, q3] … [q_{m-1}, B_{m-1}, q_m] [q_m, B_m, q_{m+1}]
For a transition δ(q, a, A) ∋ (q1, ε), you get the rule [q, A, q1] → a. The textbook goes through Example 5.3.
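To make the triple notation concrete, the following Python sketch (my own code, not the textbook's Example 5.3) generates the productions: one S-production per state, and one production per transition and per choice of intermediate states. Epsilon input moves are omitted for brevity, and the demo PDA is the single-state PDA for the example grammar above.

```python
# A minimal sketch (my own code, not the textbook's) of the triple construction:
# for delta(q, a, A) containing (q1, B1...Bm) and every choice of states
# q2, ..., q_{m+1}, emit [q, A, q_{m+1}] -> a [q1, B1, q2] ... [qm, Bm, q_{m+1}].

from itertools import product

def pda_to_cfg(states, delta, q0, Z0):
    """delta: dict (q, a, A) -> set of (p, gamma), gamma a string of stack
    symbols ('' for epsilon); a is assumed to be a real input symbol.
    Returns productions as (head, body) pairs, where heads are 'S' or triples
    (q, A, p) and bodies are lists of terminals and triples."""
    prods = [("S", [(q0, Z0, q)]) for q in states]              # S -> [q0, Z0, q]
    for (q, a, A), moves in delta.items():
        for q1, gamma in moves:
            if not gamma:                                       # [q, A, q1] -> a
                prods.append(((q, A, q1), [a]))
                continue
            for rest in product(states, repeat=len(gamma)):     # choose q2..q_{m+1}
                qs = (q1,) + rest
                body = [a] + [(qs[i], gamma[i], qs[i + 1]) for i in range(len(gamma))]
                prods.append(((q, A, rest[-1]), body))
    return prods

# Single-state PDA for the example grammar above; with |Q| = 1 every triple
# [q, A, q] is forced, so the result essentially reproduces the original rules.
delta = {("q", "a", "S"): {("q", "S"), ("q", "AA")},
         ("q", "b", "A"): {("q", ""), ("q", "SS")},
         ("q", "a", "A"): {("q", "SA")}}
prods = pda_to_cfg({"q"}, delta, "q", "S")
assert (("q", "A", "q"), ["b"]) in prods and len(prods) == 6
```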

Chapter 10. It is clear when a PDA is deterministic, and not all CFLs have a deterministic PDA (DPDA). It is not clear when a grammar generates a language that has a DPDA! The compiler grammars in the class LR(0) do give CFLs with DPDAs. The more general LR(1) grammars have a deterministic machine that slightly extends a DPDA by allowing a lookahead. If we agree to put an extra symbol ($) at the end of the input, this recognizer can be simulated with a DPDA. The complement of a DPDA language is a DPDA language.

Results from Chapter 7. Theorem 7.1: equivalence of one-way infinite and two-way infinite tape machines.

One-way infinite and two-way infinite tapes: simulations. A two-way infinite tape machine can simulate a one-way infinite machine by initially putting a marker to the left of the first tape symbol [using a couple of extra states and transitions] and then not allowing the simulation to move past that marker.

One-way infinite and two-way infinite tapes (2). The simulation of a 2-way infinite tape machine by a 1-way infinite tape machine uses 2 tracks. The upper track simulates the right half of the tape. The lower track has a marker, and the rest of it is used to simulate the left half of the tape.

Add U to the state when using the upper track. [Picture: the simulating tape with upper track X1 X2 X3 … and lower track $ Y1 Y2 Y3 …; the head is over the original head position and the state carries U.]

Add D to the state when using the lower track. [Picture: the two-way tape … Y3 Y2 Y1 X1 X2 X3 … with its original head position, and the same two-track simulation with the state carrying D.]

Details. Special transitions are needed to flip between (q, U) and (q, D) when the read head recognizes the marker $ on the lower track. Whether you switch between U and D depends on whether the 2-way infinite tape head moves left or right. The direction of moves on the lower track is opposite to the moves on the left half of the 2-way infinite tape.
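A small Python sketch of the bookkeeping behind this two-track simulation may help; it only shows how a two-way cell index is mapped onto the upper or lower track of a one-way tape. The class and its conventions are my own, assuming X1 sits at cell 1 and the $ marker at lower-track cell 0.

```python
# Minimal sketch (my own illustration, not from the textbook) of the two-track
# encoding behind Theorem 7.1: cell i of the two-way tape (i = 1, 2, ... to the
# right of the start, i = 0, -1, -2, ... to the left) is stored on a one-way
# tape whose cells each hold an (upper, lower) pair; lower cell 0 holds '$'.

BLANK = ' '

class TwoWayTape:
    def __init__(self):
        self.upper = [BLANK]   # upper[j] simulates two-way cell j+1 (right half)
        self.lower = ['$']     # lower[0] = '$'; lower[j] simulates cell -(j-1)

    def _slot(self, i):
        """Map a two-way cell index i to (track, one-way index)."""
        if i >= 1:
            return self.upper, i - 1       # track U
        return self.lower, 1 - i           # track D (cell 0 -> lower[1], -1 -> lower[2], ...)

    def read(self, i):
        track, j = self._slot(i)
        return track[j] if j < len(track) else BLANK

    def write(self, i, symbol):
        track, j = self._slot(i)
        while len(track) <= j:
            track.append(BLANK)
        track[j] = symbol

tape = TwoWayTape()
tape.write(3, 'X3'); tape.write(-2, 'Y3')   # Y3 is three cells left of X1
assert tape.read(3) == 'X3' and tape.read(-2) == 'Y3' and tape.lower[0] == '$'
```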

Example, 2-way tape. The head moves from left to right past the original start position. [Picture: the two-way tape … Y2 Y1 X1 X2 X3 …, shown before and after the head crosses the original head position X1.]

Example, 1-way tape. The simulation reverses direction. [Picture: the two-track tape; the head, in state (q, D), reaches the $ marker on the lower track and continues on the upper track in state (q, U).]

Summary. There need to be four types of special movements at the left end of the simulating tape; the transitions are given in the textbook. If the head is moving left, you switch from (q, U) to (q, D); otherwise the 2-way infinite tape head switches direction and the simulator stays in the U or D states.

Move left past the original position. The head moves from right to left past the original start position. [Picture: the two-way tape, shown before and after the head crosses X1 moving left.]

Go from U to D. The simulation reverses direction. [Picture: the two-track tape; the head, in state (q, U), reaches the $ marker and continues on the lower track in state (q, D).]

Turn round at the original position, starting on the left. The head moves right to the original position and then moves left again. [Picture: the two-way tape, shown before and after the head turns around at X1.]

Stay on D. The simulation reverses direction. [Picture: the two-track tape; the head reverses direction and stays on the lower track in state (q, D).]

Turn round at the original position, starting on the right. The head moves left to the original position and then moves right. [Picture: the two-way tape, shown before and after the head turns around at X1.]

Stay on U. The simulation reverses direction. [Picture: the two-track tape; the head reverses direction and stays on the upper track in state (q, U).]

Results from Chapter 7. Theorem 7.2: equivalence of multitape and one-tape machines.

Simulating multitape machines. We have already defined multitape machines (assume all tapes are 2-way). They can be simulated by a single-tape, many-track machine. A multitape machine has many heads; they can move independently left or right or remain stationary, so long as one moves. We have to simulate this design with only ONE tape head.

Simulate n tapes with 2n tracks; the X's represent head positions. [Picture: a multi-track tape in which each simulated tape contributes a content track (Y1 Y2 …, V1 V2 …) and a marker track holding a single X under its head position.]
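Here is a minimal sketch, assuming a straightforward tuple-per-cell layout, of how an n-tape configuration can be packed into 2n tracks with X markers for the head positions; the function name and data layout are mine, not the textbook's.

```python
# Minimal sketch (my own illustration) of the 2n-track encoding behind
# Theorem 7.2: each simulated tape contributes a content track and a marker
# track that holds a single 'X' at that tape's head position.

BLANK = ' '

def make_tracks(tapes, heads):
    """tapes: list of n strings (tape contents); heads: list of n head indices.
    Returns a list of cells; each cell is a tuple of 2n track symbols."""
    width = max(max(len(t) for t in tapes), max(heads) + 1)
    cells = []
    for j in range(width):
        cell = []
        for k, tape in enumerate(tapes):
            cell.append(tape[j] if j < len(tape) else BLANK)   # content track of tape k
            cell.append('X' if heads[k] == j else BLANK)       # marker track of tape k
        cells.append(tuple(cell))
    return cells

# Two tapes: "abba" with its head at cell 1, "01" with its head at cell 0.
cells = make_tracks(["abba", "01"], [1, 0])
# Cell 1 holds ('b', 'X', '1', ' '): 'b' under tape 1's head marker, and
# tape 2's content '1' with no marker there.
assert cells[1] == ('b', 'X', '1', BLANK)
```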

Sweep left and right. Start at the right-most X and sweep left until the n-th X is seen. As you sweep across, record the n tape symbols above the X's. Once all n symbols are recorded, find the multitape move (new state, n new symbols, and n directions to move). Now move back right, writing the new tape symbols and moving the X's left or right to simulate the moves of the separate heads.

Results from Chapter 7. Lemma 7.3: simulation using a 2-stack machine.

Two-stack machines. We can have a read-only input tape if we copy the input onto a second tape and then run the TM on the second tape. Hence we can design a TM to have a read-only input tape and a second read-write tape. Now a read-write tape can be cut in two at the head and used as two stacks.

Comparing the moves. [Picture: the tape … Y4 Y3 Y2 Y1 X1 X2 X3 X4 … with the head at X1, and the corresponding two stacks: one holding Y1 Y2 Y3 Y4 … (the cells to the left of the head, nearest cell on top) and one holding X1 X2 X3 X4 … (the scanned cell and everything to its right).]
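The "tape cut in two at the head" idea can be sketched directly; the following Python class is my own illustration, not Lemma 7.3's formal construction. It keeps a tape and its head as two stacks and shows how left and right head moves become pushes and pops.

```python
# Minimal sketch (my own illustration) of the idea behind Lemma 7.3: a tape with
# a head is kept as two stacks -- `left` holds the cells to the left of the head
# (nearest cell on top) and `right` holds the scanned cell and everything to its
# right (scanned cell on top).

BLANK = ' '

class TapeAsTwoStacks:
    def __init__(self, contents):
        self.left = []                                        # cells left of the head
        self.right = list(reversed(contents)) or [BLANK]      # scanned cell on top

    def read(self):
        return self.right[-1]

    def write(self, symbol):
        self.right[-1] = symbol

    def move_right(self):
        self.left.append(self.right.pop())   # scanned cell moves onto the left stack
        if not self.right:
            self.right.append(BLANK)         # extend the tape with a blank

    def move_left(self):
        self.right.append(self.left.pop() if self.left else BLANK)

t = TapeAsTwoStacks("X1 X2".split())         # tape: X1 X2, head on X1
t.move_right()
assert t.read() == "X2" and t.left == ["X1"]
```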

Initial situation. [Picture: the tape holding the input a1 a2 a3 a4 a5 … an, and the stacks holding a1 a2 a3 a4 … an.]

Results from Chapter 7. Lemma 7.4, Theorem 7.9: simulation using counter machines.

Count with blanks. A 2-step encoding process shows that the content of the stacks over a multisymbol tape alphabet can be simulated by a 2-counter machine, where there is an input tape and two stacks that have a bottom marker at the bottom and only blanks above that. By analyzing the number of blanks, the content of a stack can be simulated.

Interesting parts of the encoding. If the tape alphabet has symbols X1, X2, …, X_{k-1}, then think of those symbols as digits for numbers with radix k. A stack configuration is then just a number. For example, with k = 8, the symbols X1, X2, …, X7 can be identified with the digits 1, 2, …, 7, and a particular stack is just some octal number such as 6271773114.
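A short sketch of the radix-k idea, under the assumption that the top stack symbol is the least significant digit: pushing and popping the encoded stack become multiplication and division by k, exactly the kind of step a counter machine can carry out by moving blanks. The helper names are mine.

```python
# Minimal sketch (my own illustration) of the radix-k encoding from the slides:
# a stack over symbols identified with the digits 1..k-1 is stored as a single
# integer, so push and pop become arithmetic on that integer.

K = 8  # radix: stack symbols are the digits 1..7

def push(count, digit):
    """Pushing a symbol multiplies the count by k and adds the digit."""
    return count * K + digit

def pop(count):
    """Popping returns (new count, popped digit)."""
    return divmod(count, K)

c = 0
for symbol in (6, 2, 7):       # push 6, then 2, then 7
    c = push(c, symbol)
assert c == 0o627              # the stack reads as the octal number 627
c, top = pop(c)
assert top == 7 and c == 0o62
```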

Four counter stacks. With 2 stacks, you can work with the representation of the stack contents i_1 i_2 … i_m as the number j = i_m + k·i_{m-1} + k^2·i_{m-2} + … + k^{m-1}·i_1, which is represented on one counter stack as j blanks. [Picture: a pair of counter stacks holding counts j_1 and j_2.]

A lot of counting. To determine i_m, copy from one stack to the other: pop k blanks and push ONE onto the other stack; moving through k states, you reach i_m in the state. Use i_m to make the move. [Picture: after the copy, the other counter holds (j − i_m)/k blanks.]

The move. Assume that, from i_m, the new tape symbol is s on one of the stack pairs. Move blanks from left to right: for 1 blank on the left, push k blanks on the right. At the end push i_s extra blanks onto the stack. The new symbol may go to the left or the right stack pair.

One number can represent four numbers. The coding so far gives a 4-counter machine: a 2-stack machine is simulated by a 4-counter machine. Another step simulates a 4-counter machine by a 2-counter machine. Here the counts i, j, k, m on the 4 counter stacks are simulated by the number 2^i · 3^j · 5^k · 7^m on one stack (more details are sketched in the textbook).
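The final packing step can also be sketched in a few lines (my own illustration, not the textbook's proof): the four counts live in the exponents of 2, 3, 5, and 7, so incrementing, decrementing, and zero-testing a counter become multiplying, dividing, and testing divisibility by the corresponding prime.

```python
# Minimal sketch (my own illustration) of the last step on the slide: four
# counts (i, j, k, m) are packed into the single number 2^i * 3^j * 5^k * 7^m.

PRIMES = {"i": 2, "j": 3, "k": 5, "m": 7}

def pack(i, j, k, m):
    return 2**i * 3**j * 5**k * 7**m

def increment(n, counter):
    return n * PRIMES[counter]

def decrement(n, counter):
    p = PRIMES[counter]
    if n % p:                                # counter already zero
        raise ValueError(f"counter {counter} is zero")
    return n // p

def is_zero(n, counter):
    return n % PRIMES[counter] != 0

n = pack(2, 0, 1, 3)                         # i=2, j=0, k=1, m=3
assert is_zero(n, "j") and not is_zero(n, "i")
n = decrement(increment(n, "j"), "i")        # j += 1, then i -= 1
assert n == pack(1, 1, 1, 3)
```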