COMS 1003 Fall 2005. Introduction to Computer Programming in C. History & Computer Organization. September 15th

What's Ahead
- Some computer history: an introduction to the major players in the development of hardware and theory
- An introduction to basic computer organization: How is a computer structured? What are the components of its makeup? How does this piece of hardware actually calculate anything? How do we talk to computers?

Some Thoughts
- "Computer Science is no more about computers than astronomy is about telescopes."
- "The question of whether computers can think is the question of whether submarines can swim." (E. Dijkstra)

Input, Process, Output
The most basic and ubiquitous model of computation:
- Declare some data (variables)
- Operate on that data
- Return the result
(the function analogy)

Layers
- Human intentions
- Speaking machine language is tedious and error-prone, but fun for a limited number of people
- We need to translate from one level (language) to the next until we get down to meaningful strings of bits

Classic Calculators
- Blaise Pascal (1642), at the age of 19, built a mechanical device that could add and subtract; the programming language Pascal (by N. Wirth) is named after him
- 1672: Leibniz built a machine that could do multiplication and division as well
- Late 1930s: Konrad Zuse; Atanasoff and Stibitz also built calculators
- Howard Aiken (rediscovers the work of Babbage)

Charles Babbage (b. 1791)
- 1819: the calculation of log tables leads to the Difference Engine; a small D.E. (one program) is built
- Plans to build the Analytical Engine (the first real computer); completes the design
- Ada Lovelace analyzes it and writes code for it: the world's first programmer is a woman
- She also has a language (Ada) named after her

Analytical Engine
Parts:
- The store (memory)
- The mill (CPU)
- The control (control unit, OS, a loom)
- The input (punch cards)
- The output
Never built.

Theory
- What is an algorithm? There was no precise definition until David Hilbert's talk in 1900, which posed 23 unsolved problems
- 10th problem: devise a procedure for deciding, in a finite number of steps, whether a given polynomial has an integral root
- Without a clear definition of "algorithm", this question is unanswerable

An Algorithm Is:
- 1936: the Church-Turing thesis (Alonzo Church and Alan Turing; lambda calculus and Turing machines)
- Anything that can be computed can be computed by a Turing machine
- Some things are not computable (decidable): the Halting Problem
- 1970: no algorithm exists for Hilbert's 10th problem

WW2: Enigma & COLOSSUS
- COLOSSUS: the world's first electronic digital computer
- John Mauchly (who saw Stibitz's work) built ENIAC, which inspired John von Neumann (more on him later)
- Turing is the father of modern computing
- So a plausible informal definition of algorithm is: a well-defined, precise description of a procedure or set of steps for computing the result of a problem statement. But formally, an algorithm is...

Turing Machines
A formal description of a general-purpose computing device:
- Input/output tape (memory & I/O)
- Set of states (control)
- Input alphabet (the language it understands)
- Transition table (action + input moves to a new state)
- Start, accept, and reject states
Cannot compute certain problems; others take too long.

The Stored Program Computer
- John von Neumann (genius): an architecture for computer hardware and software interaction
- Key idea: hardware is reusable, so store new software instructions and data for each new task
- Structure: CPU (central processing unit) with ALU and CU, primary memory (RAM), input & output devices

Transistors, 1947
- Very early computers used vacuum tubes: unreliable, bulky, and power-hungry
- William Shockley, John Bardeen, and Walter Brattain invent and construct the first reliable transistor at Bell Labs (Nobel Prize, 1956)
- Shockley subsequently designs a superior transistor
- Shockley moves to California and founds Shockley Semiconductor, bringing in eight of the most talented engineers

Birth of an Industry
- Shockley was hard to work with; Robert Noyce and other scientists (including Gordon Moore) leave and form Fairchild Semiconductor, which invents the silicon integrated circuit
- Moore and Noyce then leave and form Intel
- Ted Hoff (Intel) invents the microprocessor
- As time goes by, circuits get faster and more transistors fit on a chip, giving rise to Moore's Law

Back to von Neumann
- Memory: a contiguous array of numbered (addressable) bytes; stores instructions (a program) and data
- CPU (control unit & ALU)
- I/O devices: device controllers, bus, interrupts vs. polling
- All connected by a bus (this adds complexity); the data path must be fast

What is Computer Memory?
- Memory consists of cells that can store a piece of information; each cell has an address
- A large collection of similar circuits: a flip-flop made of a few gates creates a feedback loop that holds the value previously stored (1 bit)
- A storage area (a sequence of binary strings); all cells are the same size
- Four associated registers: MAR, MBR, MDR, and the command register

Memory (cont.)
- Most personal computers today are 32-bit machines
- The 32 bits define the address length; that is, the number of bits used to uniquely determine which memory location we're talking about
- How many bytes is 32 bits? How many integers can we form from a 32-bit binary number? So how much memory can a 32-bit machine address?

Endianness
- In what order is data stored in memory? Intel vs. Mac
- Little endian vs. big endian: byte order 3-2-1-0 vs. 0-1-2-3

What's In a CPU?
- Control Unit
- Arithmetic Logic Unit (ALU)
- Registers (a CPU has its own very fast memory); some registers are dedicated: PC (program counter), IR (instruction register), MAR (memory address register), MBR, MDR, AC (accumulator), status register

Devices
- Computation and binary digits may be interesting, but I/O is what makes computers useful
- Devices are common: mouse, keyboard, scanner, monitor, speakers, sound card, modem, network card, graphics card
- Devices must communicate with memory and the CPU to transfer data

Device Communication
- Polling: the CPU takes control of the bus and asks each device in turn if it has data. Simple, but inefficient: it wastes CPU time and may drop data.
- Interrupts: the device informs the CPU that it has data.

Device Interface
- Device controller: built to a hardware interface specification or industry standard
- Device driver: code that the computer uses to talk to the device's hardware

Machine Language
- Basic machine instructions: load, store, add, multiply, and, or, xor, shift
- Assembly language was invented, for human convenience only, to bridge the gap between machine language and human thoughts
- It translates essentially to numbers, or strings of binary digits
- The binary digits act as signals to the hardware

Fetch, Decode, Execute
1. Fetch the instruction pointed to by the PC
2. PC = PC + 1 (point to the next instruction; +4 on machines with 4-byte instructions)
3. Decode the instruction (a string of bits has context)
4. Fetch any data that the instruction uses
5. Execute the instruction (put the bits on the wire)
6. Check for interrupts
7. Go to step 1

The Digital Logic Level
- How gates are constructed
- How an ALU is made of gates
- How flip-flops make a memory bit
- The clock (a number that people use to sell computers)

RISC vs. CISC
- 1970s: creation of microcode to interpret low-level instructions instead of executing them in hardware
- Many architectures (notably the Intel x86) had a legacy software base to support
- Computer instruction sets were becoming increasingly complex: one huge instruction for balancing books and issuing paychecks
- A revolution was brewing

RISC vs. CISC
- Reduced Instruction Set Computer: small instructions that execute in 1 cycle (1 clock tick); direct hardware execution; issue many instructions at once
- Complex Instruction Set Computer: large instruction set; large, complex instructions with many addressing modes; microcode interpretation

More RISC
- Speed up the decoding process (few, small, simple, standard instruction formats)
- Only two operations (load and store) reference memory
- Many registers. More registers. Tons of registers.
- Instruction Level Parallelism (ILP): the pipeline and its stages; problems with the pipeline

RISC Mows Down CISC... and Truman Lost to Dewey
Why didn't it happen?
- Legacy base (just like a lot of CS)
- CISC computers successfully incorporated RISC cores while keeping complex instructions
- CISC got pipelined (Pentium)

Summary
- An abstract description of computation leads to mechanical and electronic devices that can carry out the procedure
- This required great advances in physics and engineering
- Von Neumann proposes an architecture for changing the program that the hardware machine runs
- Fetch, decode, execute

More Summary
- Digital logic: circuits, memory, ALU composition
- Addressing memory
- Machine layers
- CISC vs. RISC; pipelines