COMS 1003 Fall 2005 Introduction to Computer Programming in C History & Computer Organization September 15th
What's Ahead Some computer history Introduction to major players in the development of hardware and theory Introduction to basic computer organization How is a computer structured? What are the components of its makeup? How does this piece of hardware actually calculate anything? How do we talk to computers?
Some Thoughts "Computer Science is no more about computers than astronomy is about telescopes." "The question of whether computers can think is the question of whether submarines can swim." (E. Dijkstra)
Input, Process, Output The most basic and ubiquitous model of computation Declare some data (variables) Operate on that data Return the result function analogy
Layers Human intentions Speaking machine language is tedious & error-prone, but fun for a limited number of people Need to translate from one level (language) to the next until we get to meaningful strings of bits
Classic Calculators Blaise Pascal (1642), at the age of 19, built a mechanical device that could add and subtract Programming language named Pascal (by N. Wirth) 1672: Leibniz built a machine that could do multiplication and division as well Late 1930s: Konrad Zuse Atanasoff, Stibitz also built calculators Howard Aiken (rediscovers the work of Babbage)
Charles Babbage (1791) 1819: calculation of log tables leads to the Difference Engine Small D.E. (one program) Plans to build the Analytical Engine (the first real computer) Completes the design; Ada Lovelace analyzes it and writes code for it World's first programmer is a woman Also has a language (Ada) named after her
Analytical Engine Parts: The store (memory) The mill (cpu) The control (control unit, OS, a loom) The input (punch cards) The output Never built
Theory What is an algorithm? No precise definition until David Hilbert's talk in 1900 23 unsolved problems 10th problem: devise a procedure for showing that a given polynomial has an integral root in a finite number of steps Without a clear definition of algorithm, this question is unanswerable
An Algorithm Is: 1936: The Church-Turing thesis Alonzo Church and Alan Turing Lambda calculus and Turing machines Anything that can be computed can be computed by a Turing machine Some things are not computable (decidable): the Halting Problem 1970: No algorithm exists for Hilbert's 10th problem
WW2: Enigma & COLOSSUS COLOSSUS: the world's first digital computer John Mauchly (who saw Stibitz's work) built ENIAC, inspired John von Neumann (more on him later) Turing is the father of modern computing So, a plausible definition of algorithm is: a well-defined, precise description of a procedure or set of steps for computing the result of a problem statement, but formally, an algorithm is...
Turing Machines Formal description of a general purpose computing device input/output tape (memory & I/O) Set of states (control) Input alphabet (language it understands) Transition table (actions+input move to new state) Start, accept, reject states Cannot compute certain problems, others take too long
The Stored Program Computer John von Neumann (genius) An architecture for computer hardware and software interaction Key idea: hardware is reusable, so store new software instructions and data for each new task Structure: CPU (central processing unit) with ALU and CU Primary memory (RAM) Input & output devices
Transistors, 1947 Very early computers used vacuum tubes Unreliable, bulky, used a lot of power William Shockley, John Bardeen, and Walter Brattain invent and construct the first reliable transistor at Bell Labs (Nobel Prize 1956) Shockley subsequently designs a superior transistor Shockley moves to California and founds Shockley Semiconductor, brings in the 8 most talented engineers
Birth of an Industry Shockley hard to work with Robert Noyce and other scientists (Gordon Moore) leave and form Fairchild Semiconductor Fairchild invents the silicon integrated circuit Moore and Noyce leave and form Intel Ted Hoff (Intel) invents the microprocessor As time goes by, circuits get faster and more transistors fit on a chip, giving rise to Moore's Law
Back to von Neumann Computer Memory Contiguous array of numbered (addressable) bytes Stores instructions (a program) and data CPU (Control unit & ALU) I/O devices Device controllers, bus, interrupts vs. polling All connected by a bus (this adds complexity) Data path must be fast
What is Computer Memory? Memories consist of cells that can store a piece of information Each cell has an address A large collection of similar circuits A flip-flop made of a few gates creates a feedback loop that holds the value previously stored (1 bit of storage) Memory is a storage area (a sequence of binary strings) All cells are the same size Four registers: MAR, MBR, MDR, command register
Memory (cont.) Most personal computers today are 32-bit machines 32 bits define the address length; that is, the number of bits used to uniquely determine which memory location we're talking about How many bytes is 32 bits? How many integers can we form from a 32-bit binary number? So how much memory can a 32-bit machine address?
Endianness What order is data stored in memory? Intel vs Mac Little endian vs big endian 3210 vs 0123
What's In a CPU? Control Unit Arithmetic Logic Unit (ALU) Registers (a CPU has its own very fast memory) Some registers are dedicated: PC (program counter) IR (instruction register) MAR (memory address register), MBR, MDR AC (accumulator) Status register
Devices Computation and binary digits may be interesting, but I/O is what makes computers useful. Devices are common: Mouse, keyboard, scanner, monitor, speakers, sound card, modem, network card, graphics card Devices must communicate with memory and the CPU to transfer data
Device Communication Polling CPU takes control of the bus and asks each device in turn if it has data Simple, but inefficient, wastes CPU and may drop data Interrupts Device informs CPU that it has data
Device Interface Device controller Built to hardware interface specification or industry standard Device Driver Code that the computer uses to talk to the device's hardware
Machine Language Basic machine instructions: Load, store, add, multiply, and, or, xor, shift For human convenience only Assembly language invented to bridge gap between machine language and human thoughts Essentially translates to numbers, or strings of binary digits Binary digits act as signals to the hardware
Fetch, Decode, Execute Fetch instruction pointed to by PC PC = PC + 1 (point to next instruction, +4) Decode instruction (string of bits has context) Fetch any data that the instruction uses Execute the instruction (put the bits on the wire) Check for interrupts Goto step 1
The Digital Logic Level How gates are constructed How an ALU is made of gates How flip flops make a memory bit The Clock (a number that people use to sell computers)
RISC vs CISC 1970s: creation of microcode to interpret low-level instructions instead of executing them in hardware Many architectures (notably the Intel x86) had a legacy software base to support Instruction sets were becoming increasingly complex: one huge instruction for balancing books and issuing paychecks A revolution was brewing
RISC vs. CISC Reduced Instruction Set Computer Small instructions that execute in 1 cycle (1 clock tick) Direct hardware execution Issue many instructions at once Complex Instruction Set Computer Large instruction set Large complex instructions with many addressing modes Microcode interpretation
More RISC Speed up the decoding process (few, small, simple standard instruction formats) Instruction Formats Only 2 operations reference memory Many registers. More registers. Tons of registers. Instruction Level Parallelism (ILP) Pipeline & stages Problems with the pipeline
RISC mows down CISC... and Truman lost to Dewey Why did CISC survive? Legacy base (just like a lot of CS) CISC computers successfully incorporated RISC cores while keeping complex instructions CISC got pipelined (Pentium)
Summary Abstract description of computation leads to mechanical and electronic devices that can carry out this procedure Requires great advances in physics and engineering Von Neumann proposes architecture for changing the program that the hardware machine runs Fetch decode execute
More Summary Digital logic Circuits Memory ALU composition Addressing memory Machine Layers CISC vs RISC pipelines